Despite SCOTUS rulings, affirmative action still pertinent

Matt Housiaux

After nearly fifty years as a standard institutional practice, affirmative action has recently encountered a great deal of opposition in high places.

Earlier this week, the United States Supreme Court, in a 6-2 decision, upheld a 2006 Michigan state constitutional amendment banning all preferential treatment on the basis of race, ethnicity, gender or national origin.

This is only the most recent manifestation of what has become a sustained political backlash against affirmative action policies, which opponents pejoratively label “reverse discrimination.”

Last year, this debate came to a head when the Supreme Court took up the case of Abigail Noel Fisher, a white student who sued the University of Texas, claiming that her rejection from the school was based on her race rather than her academic qualifications.

Upon hearing Fisher’s case, the Supreme Court avoided a sweeping ruling on the constitutionality of “race-conscious” admissions criteria at public universities. But it did demand that such programs withstand strict judicial scrutiny before race may be considered in admissions decisions.

This raises the question: does the United States still need affirmative action? Or, as the Supreme Court suggested when it struck down a key provision of the 1965 Voting Rights Act, has the country moved beyond the need for affirmative action and other anti-discrimination policies in business and education?

Historically, beginning with the mid-century victories of the African American and feminist civil rights movements, affirmative action was widely accepted as necessary.

At its inception, it even garnered (some) bipartisan support. The term “affirmative action” in its current use dates back to an executive order signed by President John F. Kennedy in 1961, which mandated that federally financed projects “take affirmative action” to avoid racially biased hiring practices.

Subsequent chief executives, including Republicans Richard Nixon and Gerald Ford, continued to promote similar policies and occasionally even expanded them.

In 1979, President Jimmy Carter issued another executive order acknowledging the need for government support of aspiring female entrepreneurs and women in business as a whole.

However, since the 1980s, when the country awoke to Reagan’s “Morning in America,” the perception of affirmative action as “preferential treatment” has come to hold more sway within many partisan circles.

The implication of this would seem to be both that affirmative action is wrong and that it is no longer needed as a means to address social inequities.

Myriad factors suggest otherwise. Rates of school segregation have increased over the last ten years.

According to recent census data, poverty rates remain highest among Native Americans and African Americans, even as African American participation in the economy has grown by 50 percent over the last two decades.

Additionally, women still earn only 77 cents for every dollar paid to men.

Moreover, the highest rates of opposition to affirmative action are, unsurprisingly, found among whites. A 2013 Gallup poll reveals that, although whites as a whole support affirmative action programs, when it comes to college admissions, a telling 75 percent believe such decisions should be based solely on merit.

Inevitably, this raises legitimate concerns about who should be the ultimate authority on affirmative action.

Perhaps questions of need should be decided by those within the communities that stand to benefit from such policies, rather than by those already situated in a position of privilege.

Matthew Housiaux is a sophomore journalism and history major from Brookings, S.D.