The Registrar switched from paper course evaluations to an online version during the 2008-09 academic year in order to make the process more efficient and manageable, University Spokesperson Martin Mbugua said in an email. The overall student response rate has risen since the switch, with 75 percent of students completing the online form for fall 2010. By comparison, a 2010 presentation by Associate Registrar for Reporting and Institutional Research Jonathan LeBouef reported that roughly 60 percent of students filled out course evaluations under the paper-based system.
Despite the higher overall response rate, faculty members have had mixed reactions to the transition from paper to web evaluations. Michael Littman, the undergraduate departmental representative for mechanical and aerospace engineering, said that he used to get feedback from 100 percent of the students he taught by handing out the evaluations at the final exam, but that number has since dropped to about 60 percent with the introduction of the online system.
“It weakens the feedback by not having a good response rate,” he said.
Economics professor Henry Farber GS ’77 said that the switch from a paper-based system to an online one may mean that reviews for large lectures do not accurately reflect the quality of the teaching. He explained that people who do not come to class are able to rate the class virtually.
Littman also said that he found the online system far more difficult to navigate. “I have to make probably 500 clicks to look through all my evaluations,” he said. “In the old days I’d sit down with a stack of paper, so it’s gotten more difficult to review them. I don’t review them with quite the same care as I did in the past.”
But even if the online system is more complex, it reduces the stacks of paper that Littman liked to scan. Cynthia Menkes, the electrical engineering undergraduate program coordinator, said she appreciates the online form. She noted that it is “not ratemyprofessors.com” and said that it makes her job easier, as she doesn’t have to coordinate and hand out piles of paper.
Mark Rose, the director of undergraduate studies in the molecular biology department, also said he prefers the online form and likes that it provides students online access to the evaluations as well. Unlike Littman, he said he has found that the online system has improved the response rate, giving greater balance to the overall tone of the evaluations.
“When people are unhappy, they’re more likely to write an evaluation,” he explained. “In the past it seemed as if evaluations were slanted negative. Now it’s more balanced overall.”
That is not to say faculty members see no problems with the current system. Rose said he thinks course evaluations could be better timed, explaining that the end-of-semester nature of the system prevents professors from incorporating feedback over the course of the semester.
“Some faculty, when they want feedback earlier, will hand out anonymous paper feedback mid-semester,” he said.
Menkes echoed this concern about the timing of the evaluations.
“A lot of faculty have mentioned in the past they would like evaluations done before exams. They feel that this way, it might provide for a truer feeling of the course without influence from exam grades.”

Some faculty said they saw the written comments as more important than the numbers. David Walker, the computer science departmental representative, took this view.
“I don’t really necessarily believe in a bunch of numbers or know how to calibrate them,” he said. “I think the numbers might be useful to the administration so they can keep track, but as an individual, I just pretty much ignore them, to be honest.”
Politics professor Kosuke Imai, who taught the introductory statistics course POL 345: Quantitative Analysis and Politics last fall, also said that he considers students’ written comments the most useful part of the SCORE evaluations. In an email, he remarked that student ratings across courses are “not necessarily comparable” because different students take different courses for different reasons. He said he found that 5 percent of students who filled out the course evaluations for POL 345 took the course because of an interest in statistics, while over 40 percent of students who filled out course evaluations for other classes in the politics department cited their interest in the topic as their reason for enrollment.
“Generally one would expect that large introductory courses and required courses will have a lower benchmark score than advanced small seminars taken as an elective,” he said. “Thus, course evaluations can only provide an imperfect picture of how well the courses are taught.”
While department chairs often use the data from the Registrar’s course evaluations to flag issues, make changes in courses and help evaluate the current staff, SCORE evaluations are also important for reasons students may not realize. Department chairs, program directors and the Faculty Advisory Committee on Appointments and Advancements frequently use course evaluations to make tenure and promotion decisions.
Mbugua said that department chairs use the evaluations in making recommendations for merit pay increases and that the Faculty Committee on the Course of Study may also use the evaluations in determining which courses to add to the curriculum.
Littman said that faculty who do particularly well on the evaluations often receive a personal letter of congratulations from the department chair or from a dean, while professors with low ratings might receive an inquiry from their chair. The letters should arrive around this week.
But for all their importance to future staffing and curriculum decisions, Emilia Simeonova, a visiting Wilson School professor from Tufts, said that the value of course evaluations lies in the here and now.
“We do look at these things. Everybody I know really tries to amend their class and their way of teaching based on what they saw in the latest set of evaluations,” Simeonova said. “For all the students who are involved in any evaluations in any classes, this is serious, and they are helping both themselves and the students coming after them.”
This is the second in a set of two articles about data from SCORE course evaluations and the role such feedback plays in academic life at Princeton.