At the University, course and teacher evaluations are deemed vital to both administration and faculty, but often neglected by the students who fill them out. The data itself is kept in-house, locked away at the Office of the Registrar, and the response rates are low compared to other universities.

However, for faculty, student evaluations can mean the difference between depositing a paycheck and dipping into emergency funds.

“Turns out there are pretty high stakes for us,” Political Science Prof. Mika LaVaque-Manty said. “(Evaluations) are involved in promotion, for GSI’s in terms of retention and for lecturers — who are judged purely on the basis of their contribution to teaching — they literally may be a matter of job or no job.”

When faculty members in line for promotion are assessed by their respective departments, student evaluations play a large role in the decision. Deborah Loewenberg Ball, dean of the School of Education, said student evaluations weigh heavily on the whole portfolio and influence the committee’s ultimate decision.

“You can’t get promoted at this University if you have bad teaching evaluations,” Ball said. “You could be a great researcher, you could be doing all types of things professionally, but if your course evaluations are poor, and there’s evidence that your teaching isn’t good, you won’t get promoted.”

However, the data suggests students aren't as invested in the process. In the 2012 fall semester, only 56 percent of all students responded to course evaluation surveys. For the past few years, the rate has hovered just slightly above 50 percent.

Two factors may explain the low response: the University’s transition from paper to electronic evaluations in 2008, and a lack of incentive for students to take the time to fill them out.

Before the 2008 winter term, evaluations were filled out on paper and administered in class, meaning every student in attendance was handed the survey. This led to a relatively high response rate, with 71.5 percent participation in the 2008 winter semester.

When the University moved the entire evaluation system online, the surveys left the classroom. With students expected to fill them out on their own time, response rates dropped 10 points, to about 61 percent in fall 2008.

While the rate has increased since 2009, the web survey has remained about 15 to 20 percentage points below the paper version. Experts on such evaluations cite 70 percent as an ideal response rate; anything lower threatens the data's reliability.

LaVaque-Manty said the golden number is not the response rate, but the size of the class. In his research, he compared evaluations of the same course across consecutive years and found a correlation, but only in large classes with more than 50 students. In smaller classes, evaluations fluctuated significantly from year to year: a good rating one year could be followed by a poor rating the next. In small classes, evaluations were not reliably measuring teaching quality, and a high response rate made no difference.

“Think about all those grad students teaching English 125. It’s 18 students and it’s a total crapshoot,” he said.

Of more concern is the emphasis a review committee places on such comparative data.

The faculty promotion guidelines state that “comparative data is particularly helpful.”

However, when students click submit, they never again see their answers or those of their peers, and the survey is easily forgotten in light of looming finals. This contrasts with many other colleges, where students feel compelled to complete evaluations thoroughly because they can directly benefit from doing so or face consequences if they don't.

At Harvard University, students are given access to evaluation data through the Q-Guide, a list of every course offered — accompanied by comprehensive graphs, pie charts and past student evaluations — to ease their “shopping” of courses and teachers.

“It’s worth it to take the time to fill it out,” Harvard sophomore Hannah Firestone said. “If you’re going to complain about a class, you should participate in the effort to give that class feedback, and if you really liked a class, you should participate in the effort to keep that class popular.”

As further motivation, Harvard withholds grades during a designated period to encourage participation in filling out course evaluations. The sooner students complete their evaluations, the sooner grades are returned.

Though she admitted coercion might not be the best enticement, Firestone said the higher response rates are worth it.

Northwestern University takes a slightly less involved approach. Instead of withholding grades, students who don’t fill out the surveys are denied access to evaluation data for the upcoming quarter.

Alison Phillips, assistant registrar at Northwestern, said that since the incentive was introduced in 2004, the student response rate has been a steady 70 percent each quarter.

In a December interview, University Provost Martha Pollack said the University was not in favor of using coercive methods to increase response rates.

“We don’t want to coerce students, but we want to encourage them to submit evaluations,” Pollack said.

LaVaque-Manty said the University has interpreted Michigan law in a way that bars it from withholding educational records. But Michigan State University, which is also under the jurisdiction of state law, withholds grades until students either fill out evaluations or decline via checkbox. MSU also publishes limited course evaluation data through a separate survey.

At the University, the administration has placed the responsibility for keeping evaluation data public and up-to-date on students. Between 2003 and 2011, the data was housed on a Central Student Government website, Advice Online, which is no longer in operation.

CSG President Michael Proppe, a Business senior, said he wasn’t directly involved in the site and was unsure exactly why it no longer exists.

“It’s either that the Registrar’s Office is not providing the data, or somewhere there’s a broken link, and whoever in CSG was responsible for keeping that data, either left or stopped doing their work and didn’t have a successor,” Proppe said.

Engineering senior Kyle Summers, former CSG chief of staff, was given control of Advice Online in 2009. He quickly realized much of the site was unhelpful and feared some of its data might be misrepresented.

“The interface looked really outdated,” Summers said. “I would say it was below par, relative to even our course guide right now, which could arguably use a lot of improvements.”

While a new Advice Online was in the works, with CSG planning a launch before winter 2014, Summers said that, as far as he knows, no one is currently working on it.

Either way, LaVaque-Manty said the onus of publicizing the data should not fall on students.

Without public evaluations, many students at the University turn to third-party course evaluation sites, such as RateMyProfessors.com.

The site depends solely on student contributions and, so far, students have entered 3,714 faculty members and 349 campus ratings, which rate the school as a whole. Michigan State University students, in comparison, have entered 1,333 faculty members and 164 campus ratings.

The University ranks 10th in number of campus ratings among 4,564 schools on the site with at least one rating.

LaVaque-Manty, with the help of a graduate student, matched 700 professor ratings from RateMyProfessors with their respective University student evaluations and overlaid the results. To the surprise of critics, he found a substantial correlation between the ratings of the two evaluation systems.

But his more startling finding came when he added the notorious RateMyProfessors chili pepper into the equation, which students award to teachers they deem attractive. While students may assume the chili is just a silly pepper, LaVaque-Manty’s research shows it might be a spicier indicator than students think.

“The professor who doesn’t have a chili pepper has to be almost as easy as the hardest professor who has a chili pepper to get the same quality rating,” LaVaque-Manty said.

Faculty with a chili pepper more than likely have fairly good student evaluation scores, while those without a pepper may or may not have good scores.

His theory is that the chili is not just a measure of “hotness,” but also of rapport — a teacher’s emotional respect, empathy and consideration for the student.

“They might be Ryan Gosling or Jennifer Lawrence, but if they’re mean, you likely won’t give them a chili pepper,” LaVaque-Manty said.

Finding a first-rate solution to such a complex issue will not happen overnight, but the University maintains that it’s working on it.

University spokeswoman Kelly Cunningham wrote in a statement that “the University is exploring options for sharing course and instructor evaluation data in a central, easily accessible website.”
