
Who gets admitted to college?

Photo: Students walk through the University of Texas at Austin, which is among the most diverse college campuses in the country. (Eric Gay / Associated Press)

At Oxford University in England, admissions criteria are clear. As the admissions director there told me recently, what matters is an applicant’s potential to succeed in the subject she wants to study. A student wanting to study mathematics, say, must nail the math entrance exam and, in an interview, show the potential to be an outstanding mathematician. Whether she is a concert violinist, the first in her family to go on to higher education, or the only female applicant in mathematics is irrelevant.

Oxford students believe in this system. They feel that the university is not responsible for making British society equal. But that isn’t how things have worked in the United States for nearly 100 years — and perhaps not ever.

The U.S. Supreme Court is expected to rule this month on Fisher v. University of Texas, a case that could significantly change how American colleges admit students. The lawsuit was brought by Abigail Fisher, a white student who was denied admission to the University of Texas and claims that she experienced racial discrimination. But it’s important to understand the broader context of U.S. admissions policies when considering affirmative action.



Although we would like to believe that universities have always admitted students based on some objective definition of merit, admissions criteria have in fact changed dramatically over time. Until the 1920s, elite universities in the United States administered their own entrance exams to determine the “best” students. But even then, the exams had their biases — for example, they usually tested material, such as Latin, that was taught only in elite schools.

The exam system began to change after Ivy League schools became alarmed at the number of Jewish students applying and scoring well on the tests. At that point, elite schools began to shift their definitions of “merit.” Columbia University led the way by introducing “character” as something to be considered in admissions decisions. This amorphous quality was said to include personality traits such as manliness and leadership ability. In order to judge character, the universities asked for photos and letters of recommendation and, in some cases, conducted interviews with applicants.

During the 1960s, this flexible understanding of merit shifted again. No longer used to exclude Jews, it was turned instead to addressing the notable underrepresentation of black students on campus.


Today, selective universities in the United States consider a range of attributes beyond academics in making admissions decisions. These include an applicant’s extracurricular activities, athletic prowess, hardships overcome, legacy status and race. In my research at elite American universities today, I have found that undergraduates are quite comfortable with these flexible notions of merit. They express a belief that racially diverse campuses are necessary to their training as future citizens and leaders in our globalized world. And they also support other examples of flexibility in admissions, including athletic recruiting and preferences for the children of alumni. Athletes, they argue, contribute to a fun campus life and demonstrate merit in athletics, while legacy admissions bring funds to the university that can potentially contribute to scholarships for more disadvantaged students.


Since long before Abigail Fisher was born, university admissions policies in the United States have been highly subjective, responding to the desires and needs of society and the academic institution itself. Race-based affirmative action is a part of the picture, and it symbolizes a deep commitment on the part of colleges and universities to the pursuit of racial justice in a country plagued by extreme racial inequality. And it’s hard to argue the programs are no longer necessary. Black Americans are still more than twice as likely to be poor as white Americans; black children are more likely to attend underperforming, racially segregated schools than white children; and whites with a criminal record are more likely to receive a callback on job applications than blacks with no criminal record.

More than 50 years ago, British sociologist Michael Young coined the term “meritocracy.” He intended the term to have negative connotations, referring to a dystopia in which the elite use notions of merit to justify and maintain their status across generations. He portrayed a future in which promotion, pay and school admissions would be used to reward elites for their class-based cultural know-how rather than for qualities attainable by anyone in society.

If the Supreme Court ruling in the Fisher case bans the consideration of factors that promote racial equality and justice in admissions decisions, but allows universities to continue considering other kinds of non-academic “merit” that increase inequality, we will be one step closer to the kind of dysfunctional “meritocracy” Young envisioned.

Natasha Kumar Warikoo is an assistant professor at the Harvard Graduate School of Education and the author of “Balancing Acts: Youth Culture in the Global City.” @nkwarikoo
