
Truth Revealed: US News Rankings Mean Nothing–and Everything

Doug Lederman and the folks at Inside Higher Ed bring us a story today of Clemson University and how it manipulates data to help move itself up in the US News & World Report annual rankings.

These rankings bug me. As an educational consultant, I am constantly having to explain that these rankings are at best imperfect measures of institutional quality and at worst amount to a completely misleading popularity contest. Just today I am having to respond to a client’s whining that the universities I am suggesting are “too far down the league tables” (to which I might respond: “so how come you didn’t think of that when you were preparing for your algebra final?”). My client doesn’t want to listen to any reasoned argument that the quality of education an individual student receives in the classroom has little or no bearing on the league tables presented in US News. Needless to say, it’s going to be an interesting day.

What has me looking up, however, is the fact that a former institutional researcher at Clemson gave a presentation at the Association for Institutional Research that revealed in great detail the strategies Clemson administrators were using to lift the university in the US News rankings. And the participants–institutional researchers from other universities who also report data to US News–were shocked. Shocked, I say.

You can read the article yourself: Doug Lederman is a great writer, and the article is balanced.

The funny thing (okay, it’s really not so hilarious–it’s more funny-peculiar) is that colleges and universities bash the rankings when they’re down, and then post them on their websites anytime their name is mentioned favorably in those same rankings.

Everyone wants “proof” that the quality of their educational product is somehow good–better than their peers–more worthy of your consumer dollar.  Yet colleges and universities know that the measures developed by US News are flawed.  They know that they measure institutional inputs and not educational outcomes.  They know that statistics like student-to-faculty ratios are misleading.

Still, the college administrators slavishly report their data to the magazine editors–with or without manipulation or “influencing” the data.  They know it’s a stupid game, but they play it anyway.  And clearly Clemson has the rules down pat.

As Marcellus uttered in Shakespeare’s Hamlet, something is rotten–not in Denmark–but in American higher education.

I do understand this quest for some sort of evaluation system that will help us compare one college against the next so that we can make better decisions about which college is best for which kids. But we don’t need US News & World Report. We need Consumer Reports.

Mark Montgomery
Educational Consultant


Reader Interactions


  1. great article. nothing makes my blood boil more than when people say they go to a great school because of some superficial ranking. do people even think for themselves anymore?

  2. Thanks, Kirk. As you say, we should all make rankings for ourselves, based on what we think is important. Of course, we could make rankings on a number of issues or aspects of a college, and I suppose all the different ranking indices do just that. But when people neglect to think for themselves what aspects are most important to them, we run into the sort of superficiality you describe. I try hard to get kids (and parents) to think for themselves. Sometimes it works. Sometimes it doesn’t.
