Editorial: Students lose out in university numbers game
If you thought the deluge of holiday catalogs and charitable solicitations this season was overwhelming, consider what high school seniors confronted this fall: hundreds of mailers from colleges and universities suggesting that they apply and implying they might have a shot, even if they haven’t met the schools’ high standards.

Why so much marketing? It is largely the result of the college rankings compiled by publications, most notably U.S. News & World Report, that give extra weight in their listings to schools with low “admit rates” — those that offer admission to relatively few of the students who apply. There was a time when this sort of selectivity may have been an indicator of actual educational excellence, at least in part. But thanks to the rankings-driven race among colleges to appear increasingly choosy, it’s no longer clear what the admit rate means.

Schools are now lowering their admit rate by inveigling more students into applying — thus the shower of mailers, as well as hundreds of emails and the occasional telemarketing call. And it works, to the detriment of parents’ wallets. Today, partly because of all the marketing and recruitment, students are applying to about twice as many colleges as they did 15 years ago. As admission rates have dropped to as low as 5% among the most elite colleges, students have applied to even more of them. It’s no longer very unusual for a student to file applications to 15 schools, at $80 or so a pop. (Though a few colleges are upping the number of applicants further by making the process free and pushing their deadlines later.)
Colleges wanted this — but they are also paying for it. By cajoling more students into applying in order to lower their admit rate, they’re inadvertently threatening their “yield rate” — the proportion of accepted students who ultimately decide to attend the school. That number is used in some of the rankings because, like the admit rate, it is a supposed sign of desirability. (U.S. News dropped the yield rate from its calculations a decade ago in response to complaints.)

To address the yield rate problem, admissions offices these days are hiring expensive specialists to gauge whether applicants will say yes — and they factor that into the decision to accept or reject. They look for signs of “demonstrated interest” in the school, including whether the applicants have visited, how many times they’ve contacted the admissions office, whether they’ve reached out to faculty. This works against students who apply to many schools; it’s time-consuming to demonstrate interest in all of them. What’s more, it means that some students are turned down by schools even though they’re qualified and may want to attend.

The rankings have become far too powerful, and schools have allowed it to happen. No doubt schools believe many students decide which college to attend based on the rankings, but it is a shame that they have been willing to tailor their admissions criteria and processes in response.

Selectivity and yield, once indicators of popularity, now indicate which colleges spend lavishly on yield experts and marketing. Of all the factors that college rankings take into account, these are the least useful for picking a good school and the most damaging to students and the admissions process.

Turning this around will take a near-revolution among college leaders and families. High school counselors have been telling students for years that their happiness in college and their future success will depend more on finding the right fit than on responding to the glossy brochures or the magazine rankings. Students and parents need to start listening, because it’s true.

As for the schools, they need to be brave. It would be easiest for the top-ranked colleges, including the richest, most famous Ivy League schools, to start the ball rolling. They have the least to lose because their reputations are solid and they’ll always draw plenty of students. They could demand that rankings publications at the very least eliminate these two counterproductive measurements, vowing not to provide any information at all for the rankings if that doesn’t happen. If those colleges and universities led the way, others might muster the courage to follow.