Every semester, I spend hours researching classes and professors before registering for the upcoming term. I read course guide descriptions and old syllabi, talk to friends or alumni who’ve taken the classes I’m considering, scroll through the lists of best and worst classes maintained by several student organizations I’m part of, read Rate My Professors reviews of those classes and their professors, and ask favorite professors for their opinions on courses and colleagues in their department.

But information about some classes is hard to come by. For small classes or those with highly specific subject matter, I often find myself relying on the opinions of one or two people, which can cause problems because it’s almost impossible to tell whether those few students have given me a representative picture of the class or whether they belong to the minorities at either end of the spectrum who loved or hated it. In other words, do they fall in the middle of the bell curve of student opinion, or out at its tails?

There is, of course, a fairly simple way to solve this problem — make course information available for every class. In late March, the University of Michigan released Academic Reporting Tools 2.0, a new tool for students that does just that. Well … kind of.

ART 2.0 provides basic information about the major, year and school distributions of every student who’s taken the class in the past five years, as well as their pre-, co- and post-enrollment data. It also gives their aggregated course evaluation responses to two questions: their desire to take the course and whether or not they learned a great deal from it. For some courses, the site also provides information on whether students thought the class had a heavy workload.

The University has been using ART internally since 2006, but in March 2016 it made certain data available to students for the first time. Releasing course evaluation data has been extremely controversial among faculty: the Faculty Senate voted in the fall to suspend the release of the data to students until it could draft policies to govern it.

Physics Prof. Gus Evrard, who leads the ART 2.0 project, told the Daily, “Originally, we had some ideas of showing more information, and then in consulting with faculty colleagues we decided to step back … there were some trade-offs that we needed to make in order to get here.”

Though access to any course evaluation data at all is a win for students — who pay to take these classes — the limited amount of information available through ART 2.0 substantially undercuts the site’s utility. The course evaluation data on ART 2.0 is exclusively quantitative and never covers specific professors — just classes.

The site includes aggregated answers to just two of the four questions asked on all evaluations. Strikingly, the omitted questions are the ones that ask students to indicate the degree to which they agree with the statements: “overall, this was an excellent course” and “overall, the instructor was an excellent teacher.” The site only reports the percentages of students who “had a strong desire to take this course” and “learned a great deal” from this course.

The site provides only one metric that assesses the quality of a course: the percentage of students who responded that they either “agree” or “strongly agree” that they “learned a great deal” from the course. For one thing, I’m not sure I’ve ever responded on a course evaluation that I didn’t “learn a great deal” from a class, partly because it’s such a vague, broad question.

Even if a professor graded unfairly, almost never engaged with students and aimlessly rambled during lectures, I think it would be hard not to learn a great deal from a course, as long as my prior exposure to the material was limited and I completed a majority of the required work. I could probably “learn a great deal” from a class just by doing the assigned readings and studying for the exams.

But, because “learned a great deal” is not defined in course evaluation questionnaires, students may interpret the question differently. Some may respond according to whether they learned a great deal relative to other courses they’ve taken. Still others may give unfavorable answers if they disliked a class’s professor, even if they really did “learn a great deal.” Some may associate learning with work and mark that they learned a great deal even if they really didn’t, just because the class kept them busy. Because it’s the only metric of its kind reported on ART 2.0, it’s almost impossible to infer what students really meant by their answers.

Qualitative data, like students’ answers to the written course evaluation questions, would serve as a far better indicator of course quality. Written comments and course reviews would provide a more complete picture of what students did or didn’t like about a class, and why. Students have different preferences and learn differently: a lecture style that allows 85 percent of students to “learn a great deal” from a course might pose a huge barrier to learning for some students. That would be helpful to know before signing up for the class.

The idea that qualitative information would be more useful for students is supported by the fact that — at least according to the admittedly unscientific survey I conducted of 63 University undergraduates — students already express a preference for qualitative information about courses and professors.

Thirty-three percent of students surveyed responded that they are most likely to seek information about a class (or a class’s professor) from peers, older students or alumni, while 57 percent primarily use Rate My Professors, a site that offers both numerical scores of professor quality and written reviews of professors and classes. Given that ART 2.0 went live less than two weeks before I distributed the survey, I did not include it as an answer choice.

Eighty-four percent of students said they primarily use qualitative information like reviews when using sites like Rate My Professors. Students who talk to peers, older students or alumni about courses are almost certainly going to get qualitative information like a friend’s description of his or her experience with a class or professor.

This speaks to my main concern with the new ART 2.0 site: by providing only quantitative data — and no student comments about teaching quality or course material — the site fails to give students the information they find most useful in making enrollment decisions.

There are, however, certain circumstances under which purely quantitative data could be useful — for example, for students selecting a required class taught by several different professors during the same term. In that scenario, per-section data would make comparing professors easy. But because ART 2.0 doesn’t provide data by professor, it’s not even possible to see whether more students “learned a great deal” in Mitchell Dudley’s or Justin Wolfers’ section of Econ 101.

I understand that professors would be hesitant to support a website that makes it clear that, of all the professors who have taught sections of a class in the past five years, their students ranked them the lowest. But professors can’t stop students from making enrollment decisions based on instructors’ reputations. Given that many students can find information about instructor quality by talking to older students or clicking through Rate My Professors, withholding course evaluation data on teaching performance only lessens the chance that students’ enrollment decisions will be based on a representative sample of student opinion.

By incorporating more qualitative information and aggregating more data than current sources offer, ART 2.0 could prove incredibly useful to students. The University has indicated that user feedback is likely to shape future iterations of the site. I’ve used it, and my main suggestion for its developers is simple: Provide students with the information most likely to help them make the best course decisions possible.

Besides, wasn’t that supposed to be their goal in the first place?

Victoria Noble can be reached at vjnoble@umich.edu.
