For the first time in 12 years, Harvard alone claimed the top spot. In recent years, Princeton and Harvard had shared the number-one ranking, until Harvard dropped to second place in 2006.
The rankings, which consider admissions selectivity, financial resources, alumni giving rate and graduation rate, among other statistics, were released a week after Forbes Magazine published its own list of “America’s Best Colleges.” Forbes uses different criteria to judge schools. That list ranked Princeton first, Caltech second and Harvard third.
Ranking fever has certainly seized many high school students and their parents, but educators have expressed concern that these rankings are one-dimensional and simplistic.
Though institutions like Harvard, Yale and Stanford have stopped sending in peer reviews of other schools — which contribute 25 percent to the overall score and require academics to evaluate competing institutions — Princeton has not followed suit. When asked whether Princeton supports the U.S. News rankings, University spokeswoman Cass Cliatt ’96 gave a mixed response.
“Our general position on rankings is that we will respond to any group requesting public information,” Cliatt said, emphasizing that the University maintains that “no formulaic ranking can capture an institution’s individual distinctiveness.”
She added that individuals at the University who receive the survey decide whether or not to fill it out.
U.S. News sends peer review surveys to high-level administrators: presidents, deans and provosts.
The University’s position toward the rankings has been tame compared with that of some other education administrators, such as those who have formed the Annapolis Group, which actively denounces U.S. News’ ranking system.
Besides warning against using the rankings to pick a college, the Annapolis Group claims that the hierarchies imposed by the rankings mislead students and pressure schools to game the system.
This backlash is having an effect on participation in ranking surveys. A “President’s Letter,” issued in May 2007 by the nonprofit Education Conservancy, asked schools to refrain from filling out the peer review survey and to omit rankings from their promotional literature.
This year, 46 percent of schools filled out the peer review survey, down 5 percent from last year. In an interview with insidehighered.com, Robert Morse, who developed U.S. News’ scoring system, cited “survey fatigue” and the Higher Education Act as possible reasons for lower participation.
A different pecking order

The Forbes ranking system, by contrast, does not require schools and professors to fill out surveys. It claims a more student-oriented, results-based approach while still producing a comprehensive ranked list.
The system was developed in conjunction with Dr. Richard Vedder, an economist at Ohio University, and the Center for College Affordability and Productivity. Vedder said that the system was developed with a team of 10 college students and relies more on student input than does the U.S. News system, which he argues is more based on “reputation.”
The Forbes calculations use student-written evaluations on ratemyprofessors.com, as well as a tally of alumni appearing in “Who’s Who in America.” Other factors include the average amount of student debt at graduation, the percentage of students graduating in four years and the number of students or faculty, adjusted for enrollment, who have won major awards like Nobel Prizes or Rhodes Scholarships.
In an interview, Vedder recalled receiving “hate mail” from irate readers accusing him of being “the devil incarnate.” The several dozen comments on Forbes Magazine’s web site express general dissatisfaction with the rankings. Readers are particularly perturbed by Forbes’ reliance on RateMyProfessors, viewed by some as a place for students to rant about their teachers or gush over those who give out all A’s.
The Princeton Review also publishes college rankings, but only in specific categories such as “extracurriculars,” “academics” and “parties.”
Breaking free
Because rankings rely mostly on public data, even colleges that boycott them can’t completely shake off their grip. Amherst College, a member of the Annapolis Group, was ranked number one among liberal arts colleges by U.S. News. The school no longer gives interviews on the topic of rankings as part of its no-promotion policy.
Amherst president Anthony Marx said in a statement last September that, while the school pledges non-participation in U.S. News’ peer review, “no degree of protest may make [rankings] soon disappear,” but there is hope that “further discussion will help shape them in ways that will press us to move in ever more socially and educationally useful directions.”
Even schools that follow a policy of complete non-cooperation, like Reed College, can be ranked by U.S. News with the help of data from the Department of Education and other sources.
But the national movement against college rankings is beginning to produce non-hierarchical resources for college-bound teens.
The Annapolis Group has worked with the National Association of Independent Colleges and Universities and the Council of Independent Colleges to create the University and College Accountability Network (U-CAN), which debuted in September 2007.
The site presents graduation rates, degree information and other facts in an unranked format. Princeton submits information to U-CAN, a gesture that Cliatt said demonstrates the University’s support of “transparency for students.”
But Steven Bell, an academic librarian at Temple University and a blogger who is critical of college rankings, said that though web sites like U-CAN are commendable, “just assembling the data and providing access to it is unlikely to convince rankings lovers to try something new. The packaging makes all the difference.”
“It’s very, very difficult to measure what colleges do, but people by their nature want the best that is available to them, which makes rankings useful,” Vedder said. “People like to compare apples with apples.”
Because college rankings are likely to remain both imperfect and popular, Bell said, it is up to students, parents and guidance counselors to discern the best choice. He suggested that they speak to “some real students, not the ones blogging on the admissions website.”
“Rankings make it all easy,” he said. “But the path of least resistance isn’t always the best one to take — especially when the next four years of your life and your future are at stake.”