Registrar to increase accessibility of course evaluations in fall 2011

At the Council of the Princeton University Community meeting on Monday, Registrar Polly Griffin said the University is working to make teaching evaluations accessible directly from the Course Offerings website starting in fall 2011. Griffin added that students may soon be able to direct course-specific questions to individual department representatives.

The meeting was mostly devoted to a discussion about teaching evaluations and their effectiveness.

A seven-member panel of administrators, faculty members and students presented varying viewpoints on the topic.

“It’s important to recognize that there have been a couple of trends in the past 10 or 20 years in both the teaching profession and in course evaluations,” said Carol Porter, director of the McGraw Center for Teaching and Learning.

She noted in particular that the concept of evaluating courses has shifted from simply measuring the teaching ability of a professor to focusing on what students are able to learn and retain.

The new projects being explored by the Registrar’s Office are in line with students’ desire for more accessible information about courses, said USG Campus and Community Affairs Chair Stephen Stolzenberg ’13.

According to the USG’s recent “Which do you want more?” survey, students ranked “Give students more information about faculty before selecting classes (years taught, visiting, etc.)” fifth out of dozens of possible responses.

Roughly 43,000 votes registered from more than 1,300 distinct netIDs were compiled in the survey, Stolzenberg said.

However, psychology professor Daniel Oppenheimer said he was skeptical that the University’s teaching evaluations, and similar efforts in higher education, effectively measure the educational benefit of specific courses to students.

“Can we learn anything from course evaluations?” he asked. “How well do we know how well we’ve learned? We’re not terrible ... but we’re not perfect.”

He said, for example, that because of the nature of exams and grading, students may value teaching practices that help them cram rather than retain information in the long term.

Oppenheimer added that many factors other than the quality of instruction correlate with evaluations, such as whether or not professors fidget with their hands.

Porter described evaluations as “a necessary evil”: they do not necessarily give a complete picture of the educational experience in a given course, but they are better than having no information at all.

Jed Marsh, vice provost for institutional research, presented a chart-based summary of aggregated teaching evaluation results at the University since the current online system of evaluations was first used in the fall of 2008.

Marsh noted that aggregate student responses to overall course evaluations have remained “remarkably steady” over the past five semesters, though a snapshot of the most recent semester revealed a range of educational experiences across different groups at the University.

For example, 43 percent of students taking humanities courses rated their class “excellent,” compared with only 26 percent of students taking natural sciences courses. The figures for those enrolled in engineering, social sciences and interdisciplinary courses were 29 percent, 33 percent and 40 percent, respectively.

University students and faculty who attended the meeting said the discussion was informative and an important element of dialogue at the University.

“I thought [the discussion] was interesting,” Shyam Modi ’14 said. “I’m a quantitative person, so I particularly enjoyed seeing the statistics about course evaluations. I’ve never seen statistics like that before.”

Philosophy professor Liz Harmon asked during the discussion about the University’s stance on monitoring student comments that may be personally directed or offensive. She said the discussion helped her better understand, to an extent, the University’s unwillingness to censor such comments.

Parts of the panel discussion touched on websites such as RateMyProfessors.com, which students might turn to in the absence of clear, useful and accessible information about courses at the University.

“I’m very sympathetic to the argument that, if the University doesn’t supply a way for students to express their thoughts on a course, there will be an external site,” Harmon said.

The next CPUC meeting will be held on May 2.