Op-Ed

Why don't college rankings measure student satisfaction?

When it comes to picking colleges, Americans are terrible consumers. Students hear rumors from friends about which of the nation’s 4,000-plus colleges and universities are the “best.” Worse, they depend on U.S. News & World Report-style rankings that universities can manipulate without improving the quality of the education they provide. The average American has better access to information about $60 coffeemakers than $60,000-a-year universities.

A new Gallup poll, though it doesn’t specifically address college choice, highlights that students applying to college don’t seem to know what kind of higher education will bring them the most satisfaction. They pay for their ignorance not only in debt, but in longstanding regrets.

The poll, unveiled earlier in June, asked 95,000 randomly selected people with various levels of education — from college dropouts to PhDs — to reflect on their college experience, including whether they wished they’d attended a different institution.

The results were sometimes surprising. Twenty-three percent of people who make $250,000 or more said they would choose a different college if they had it to do all over again. That was less than the 31% at the extreme edge of poverty — but not all that much less.

People who had attended pricey private colleges were barely more satisfied with their choices than those at public schools, even though public universities generally have bigger class sizes and fewer amenities. And those who attended selective colleges were only somewhat less likely to have regrets than those who went to schools where the bar for entry is far lower. Large amounts of student debt, predictably, made people much less satisfied with their choice of a college.

Perhaps outcomes are so bad because college rating systems, which have proliferated in recent years, never address this most basic of consumer questions: Are the buyers — the students — happy with their choices down the line?

Ranking systems tend to value prestige even though the Gallup poll indicates that prestige might not bring students satisfaction down the road. The formula used by U.S. News, for example, depends heavily on surveys of academics — in other words, professors and administrators rating one another’s colleges, a fairly meaningless popularity contest among a rarefied group whose priorities might differ sharply from students’.

Professors’ salaries are used as a proxy for instructional excellence. And U.S. News and other rankings continue to rely too much on selectivity as a measure of excellence: student SAT scores, acceptance rates and the like.

Forbes, which prides itself on measuring outcomes, uses salaries after graduation as the biggest indicator of quality. But as the Gallup poll shows, a hefty portion of even the highest-paid Americans express dissatisfaction with their college experiences. And what about the people who enter social work, teaching, advocacy, nonprofit work, the arts — careers that don’t pay notably well? Their paychecks don’t make them any less successful.

Bashing college-rankings systems has long been popular — and justified — because they tend to tell you more about what sorts of high school students gain admission and how much money the institution has in the bank than whether students have a worthwhile experience. In the rankings’ defense, though, information about the most meaningful factor — student satisfaction over the long haul, as measured by randomized polling — isn’t available.

It should be. Colleges should poll their own students and alumni about their educational experiences on a regular basis. And so that the results can be compared from one school to another, the questions and methodologies should be standardized across schools. Fewer cash-strapped students would attend private schools if they knew they were about as likely to be satisfied with a public university at less than half the price.

Brandon Busteed at Gallup suggested that the nation’s accrediting agencies could require such polling each time re-accreditation rolls around, every few years. That’s often enough; despite what U.S. News would have you believe, colleges don’t change much from one year to the next.

Of course, it takes more than just a consumer poll to create a useful ranking; otherwise every student who wanted to whine about a bad grade could hold colleges hostage. But instead of using proxies for students’ educational experience — professors’ salaries, class sizes and the like — colleges should go straight to the source.

Karin Klein writes about education for The Times editorial board.

Copyright © 2017, Los Angeles Times