
On Sept. 12, 2017, U.S. News & World Report released its annual Best Colleges rankings for 2018. For the seventh straight year, Princeton has topped these rankings. But what — if anything — should we as an institution be proud of?

For Adam Conover — host and executive producer of the show “Adam Ruins Everything” on truTV — these rankings aren’t to be trusted. In a segment posted to the show’s YouTube channel this August, Conover claims that these flawed rankings have “rewarded schools that lie, cheat, and manipulate the system,” citing U.S. News & World Report's inception as a “popularity contest” in 1983 and its current use of a complex formula with subjectively weighted categories.

Has the University cheated its way to the top of the rankings? Given the emphasis that the University's faculty and students place on integrity, including through the Honor Code, I personally believe it’s unlikely; however, there is certainly incentive to do so.

To better understand the significance of U.S. News & World Report’s college rankings, we must explore its methodology. According to the organization’s website, U.S. News collects data from each school that it ranks “in up to 15 areas related to academic excellence.” The Best Colleges rankings assess academic quality using statistical indicators from seven general categories in particular: “first-year student retention and graduation of students, peer assessment, faculty resources, admissions selectivity, financial resources, alumni giving, and graduation rate performance, which is the difference between the proportion of students expected to graduate and the proportion who do.” Because U.S. News only evaluates academic quality, nonacademic considerations — like campus safety, athletics, and access to housing — are not factored in, according to the organization's website.

These school-specific data are then entered into a formula with weighted categories, based on U.S. News’s own judgments about the significance of each measure of quality. Colleges are then ranked using the overall numerical output from this formula.
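The weighted-sum mechanism described above can be sketched in a few lines. To be clear, everything below is illustrative: the category names are taken from the seven listed earlier, but the weights, scores, and schools are invented assumptions, not U.S. News's actual formula.

```python
# A minimal sketch of a weighted-score ranking. The weights here are
# hypothetical placeholders, NOT U.S. News's real values.
WEIGHTS = {
    "graduation_and_retention": 0.30,
    "peer_assessment": 0.20,
    "faculty_resources": 0.20,
    "admissions_selectivity": 0.10,
    "financial_resources": 0.10,
    "alumni_giving": 0.05,
    "graduation_rate_performance": 0.05,
}  # weights sum to 1.0

def overall_score(indicators):
    """Combine normalized category scores (0-100) into a single number."""
    return sum(WEIGHTS[cat] * indicators[cat] for cat in WEIGHTS)

def rank(schools):
    """Return school names ordered from highest to lowest overall score."""
    return sorted(schools, key=lambda name: overall_score(schools[name]),
                  reverse=True)

# Toy data: two invented schools with made-up category scores.
schools = {
    "College A": {cat: 90 for cat in WEIGHTS},
    "College B": {cat: 80 for cat in WEIGHTS},
}
print(rank(schools))  # ['College A', 'College B']
```

Note how the outcome depends entirely on the chosen weights: nudging one weight up or down can reorder schools whose underlying data never changed, which is exactly the "slight change in the formula" effect discussed below.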

A change to a school’s ranking from one year to the next happens for one of two reasons: either the school has “improved” in one of the seven categories mentioned above, or a slight change in the formula works in that school’s favor. Per U.S. News, refinements to the methodology are made for just one reason: improvement. The organization closely follows the ongoing debate over education quality metrics, and it considers and implements better ideas as they arise. For example, the rankings have recently “put far less emphasis on input measures of quality — which look at characteristics of the students, faculty, and other resources going into the educational process — and more emphasis on output measures, which look at the results of the educational process, such as six-year graduation and first-year student retention rates.” This change has paralleled the greater emphasis that educators, researchers, and policymakers place on results when assessing the quality of educational programs.

But where exactly does the information on each school come from? According to U.S. News, the organization sends an extensive questionnaire each year to all accredited four-year colleges and universities to fill out. This process of self-assessment undoubtedly creates space for unmonitored dishonesty. When submitting these questionnaires, schools looking to benefit undeservedly could fabricate their responses in order to receive a boost in the rankings.

Are the contents of this questionnaire kept in mind as the University Office of Admissions carefully selects students to admit? As the Office of Alumni Affairs plans events and solicits donations? As the Undergraduate Financial Aid Office decides how to allocate its financial resources? If we understand the methodology behind these rankings, what’s stopping us from tweaking our practices to produce the greatest results?

It’s clear how schools that cheat or manipulate the system can easily benefit in the rankings, but what’s incentivizing these behaviors? In his video segment, Conover mentions that schools are under intense pressure to keep their rankings up: if they fail to do so, the number of applications they receive, their research funding, and their alumni donations may all decrease considerably.

Generally, the U.S. News Best Colleges rankings carry as much weight as we give them; nonetheless, until we collectively see through the subjective methodology and ease of manipulation, the rankings will — for better or for worse — retain some level of significance for universities across the country. Rightfully or not, the University sits at the top, and we’re certainly reaping the benefits.

Jared Shulkin is a sophomore from Weston, Fla. He can be reached at
