Statistics easily corrupted; fund students not programs

Americans love statistics – or at least numbers that give the appearance of statistics – because, when interpreted and compiled into a concise form, the information is much easier to digest. Our favorite statistics are probably rankings. We rank everything from hotels and restaurants to sports teams, television shows and colleges.

Some of these rankings are appropriate, and at their best they give us some perspective on the worth of a particular establishment. But the difficulty comes in how these rankings are organized: if the ratings are generated from unreliable information, the whole system is flawed. This is especially detrimental to college students who want to attend competitive universities and therefore rely on ranking systems such as U.S. News & World Report's annual ranking. How exactly are these ranks developed?

According to an April 2 Time magazine article, U.S. News relies mostly on hard data. But the largest single factor in the rating system, comprising 25 percent of a school's overall score, comes from a survey asking presidents, provosts and admissions directors to assess peer institutions. This process seems a little fishy for two reasons.

First, there is a strong motivation to rank other schools poorly – or at least give a negative review – in order to make your own institution look better and boost its rating.

Second, because these administrators spend far more time on their own campuses than on others, they have only a view from afar. Some schools complain this locks them into the same relative spot in the rankings because of "decades-old impressions."

The Time article also describes how the heads of a dozen private colleges are trying to rally a group of about 570 small and midsize schools to stop participating in the U.S. News ratings. That would mean not filling out the surveys, not advertising their ranks and helping to develop a better set of relevant data as an alternative.

In an ideal world, this is a fantastic idea – and a much-needed one. Both schools and students place too much emphasis on these rankings. Many of the things that contribute to students' success cannot be measured with numbers and compiled into a statistic. Anecdotal evidence and testimonials should matter more to students than a comparative number.

Because we don't live in a perfect world, there are known roadblocks to these colleges' plans. U.S. News & World Report has been known to corrupt data for schools that attempt to remove themselves from the system. Case in point: Reed College. In 1995, the magazine assigned the lowest possible score to the college's missing statistics, and its ranking fell. Since then, the college has suffered no shortage of qualified applicants, according to Time.

TCU, as a small, private institution, could very well be on that list of 570 schools and just not know it yet.

If so, I think it may be wise to join the large force rebelling against a tool that may no longer suffice. Yes, it is exciting to attend a highly ranked school – and we all appreciate the M.J. Neeley School of Business's much-advertised No. 11 ranking from the Wall Street Journal. But it is more important to devote time and energy to students than to fill out sometimes inconclusive surveys and pour money into the wrong places for the sake of a ranking.

Anahita Kalianivala is a freshman English and psychology major from Fort Worth. Her column appears Tuesdays.
