During last week’s Senate Assembly Committee on University Affairs meeting, Engineering Prof. James Holloway, the vice provost for global and engaged education, announced the University’s plan to release student course evaluation data via a newly developed, restricted-access website. The announcement dismayed some committee members, who don’t believe the evaluations in their current form provide valuable information and see no merit in publishing them for student use. While there’s no discernible harm in releasing the data as is, the administration must emphasize that course evaluations were created as a method of improving professor performance rather than as a tool for scheduling classes.
The decision to release the course evaluation data came after prompting from the Central, LSA and Rackham student governments. While the information is currently available through Freedom of Information Act requests, it is difficult and time-consuming to access. Holloway and his colleagues are offering to expedite the process with a University-sponsored website that would feature evaluations for each class. The format is relatively simple: All of the data is quantitative — comments are not included in the report — and publication is limited to professors who have taught for more than seven terms. Only those with a University uniqname will be able to access the data.
Some officials have expressed concern that the release of the data could have devastating effects on faculty who receive poor ratings, but because the data contains nothing invasive or personal, this claim is questionable.
Given faculty concern over the usefulness of the evaluations, the University should invest its resources in the expertise of researchers to improve the survey and ensure it provides constructive feedback for both teachers and students. In the meantime, the University should release the current version of course evaluation results as long as it makes extremely clear the purpose of the collected data: It is not meant to replace other professor-ranking sites like ratemyprofessor.com. Only once the University frames course evaluations in the proper context and improves them can they serve students as a meaningful guide.
Flaws in the design of the evaluations aren’t the only problem. Instead of investing time and money into developing a separate website to publish the data, the University should work to create a central database for all course-related needs. Incorporating the data into Wolverine Access instead would simplify backpacking and planning by making all of the necessary resources readily available: Students could read syllabi, course descriptions and professor ratings all in the same place. A more in-depth evaluation coupled with a more integrated scheduling process could go a long way for students looking to make the most of their credit hours.
Course evaluations in their current form may not aid students in choosing classes, but releasing them should be a relatively painless process. If, however, the University is open to improving upon the course-evaluation system in the near future, it would be a welcome step in the direction of a simpler scheduling method and, consequently, more satisfied student and faculty bodies.