Design by Arunika Shee.

By my senior year of high school, I had memorized U.S. News’s top 25 undergraduate college rankings. The tie at No. 25 between the University of Michigan and Carnegie Mellon was burned into my brain. Though there is a smorgasbord of different college rankings, U.S. News arguably holds the most sway over how applicants think about elite schools. I spent endless hours staring at the U.S. News list for one reason: it was something seemingly objective on which to base my judgments. Deciphering and consolidating thousands of different opinions and viewpoints is almost impossible, but rankings offer a foothold from which to orient oneself.

Opinions of school quality are highly variable and often based on incomplete information. For instance, my dad has never considered the University of Chicago to be as prestigious as the other top 10 schools — even though it is ranked as the sixth best university in the country. Similarly, as immigrants from India, my parents’ knowledge of U.S. schools besides Harvard, Yale and Princeton is limited. Whatever the reason, the fact of the matter is that we all — to some extent — have different perspectives on U.S. colleges. As a result, rankings like U.S. News bring some structure to a chaotic college admissions process. This is why I used to be a strong advocate for them.

As you can imagine, it is a very big deal that Columbia University’s national ranking on U.S. News recently dropped from second to 18th. For such a well-known, prestigious university, the drastic drop shocked many. It was uncovered that Columbia submitted falsified data to U.S. News. Specifically, a math professor named Michael Thaddeus found discrepancies that made undergraduate class sizes seem smaller than they are, instructional spending seem larger than it is and professors seem more educated than they are. As the scandal unfolds, many are wondering the same questions: What actually goes into college rankings? Though for me this all began with shock surrounding Columbia’s new ranking, now I am left with only one thought: Is this system of rankings — one that many of us once relied on — truly reflective of a school’s quality of education?

To begin, let’s take a deeper look at what goes into determining a college’s ranking. The six factors are retention and graduation rates, faculty resources, expert opinion, financial resources, student excellence and alumni giving. At first glance, some of these factors arguably do not make sense. For instance, measuring faculty resources comes down to how much money a school is able to allocate to its faculty, but I’d rather see how good those faculty members are at teaching. Additionally, dedicating an entire category to expert opinion — which essentially amounts to a measurement of prestige and reputation — does not reflect the reality of higher education. “Experts” tend to be administrators at other institutions, who often have idiosyncratic views of what makes an undergraduate education worthwhile. Though I am sure that better faculty and financial resources could contribute to a better quality of education, at what point should we stop rewarding wealth? The cycle goes as such: by prioritizing a school’s wealth when quantifying its level of education, wealthy schools attract more students, and such schools become even wealthier; it is never-ending.

Another issue is that it is very easy for colleges to submit falsified data. Though Columbia is currently the center of the scandal, falsification has been an ongoing problem; U.S. News itself regularly announces that it has found discrepancies in submitted data. Only last year, the former dean of Temple University’s business school was found to have used fraudulent data between 2014 and 2018 to score the top slot for its online MBA program. Just this year, the University of Southern California had to withdraw its education school from the rankings as a result of inaccuracies spanning five years. With a quantification method that could stand to be more equitable and colleges “cheating” their way to the top, what we are left with is a list that may or may not truly reflect how good a school is at teaching its students.

Besides the lack of enforcement around colleges submitting accurate information, another issue is the way that colleges are able to manipulate the system in their favor; the best example of this is Northeastern University. In 1996, Northeastern was ranked 162nd, and in 26 short years, it made an enormous jump to 44th, moving up 118 spots. According to journalist Max Kutner, Richard Freeland — the former president of Northeastern responsible for its change — told him that he directed university researchers to “break the U.S. News code and replicate its formulas.”

In practice, this meant enacting changes like shrinking class sizes, hiring more impressive faculty and improving amenities. When such a list takes over the lives of students and colleges alike, it can have a huge influence over what is prioritized — often at the expense of genuinely improving the quality of education and access to it. In my eyes at least, it seems unlikely that Northeastern’s focus on recruiting higher-caliber high school students in order to boost its SAT scores and retention and graduation rates does much to actually improve the education offered at the university.

I’m not going to sit here and write that college rankings should be abolished. Perhaps, in an ideal world, colleges wouldn’t be ranked and all that would matter is that you are getting an education; but this isn’t an ideal world, and at the end of the day, rankings do matter. Without them, there would only be more differing opinions, with a million uncategorizable data points. As such, I think the only way to move forward from here is for U.S. News to become more methodical with its rankings. This means holding colleges accountable for submitting accurate data, devising a better system for quantifying college rankings that doesn’t favor wealthier colleges and, overall, being more holistic in its deliberations. I am by no means an expert in these matters, but I think we can all agree that there are real issues here, and when it comes to a list that influences the minds of millions, we should be working harder to fix them.

Palak Srivastava is an Opinion Columnist and can be reached at

Have thoughts about our pieces? The Michigan Daily is committed to publishing a diversity of Op-Eds & Letters to the Editor. Submission instructions can be found here.