M-Write expands to include computer analysis in grading student essays

Sunday, June 4, 2017 - 7:18pm

This fall, students in Statistics 250 will be some of the first to utilize M-Write’s new expansion — an automated text analysis tool that can predict a student’s writing score.

In large courses, M-Write is used to assist students in their understanding of the material. Ginger Schulz, one of the developers of M-Write and the new ATA tool, emphasized the importance of writing in such large classroom settings.

“We began to gather evidence that it was effective in helping students to understand concepts more deeply and to connect them to other concepts,” Schulz said.

However, implementing so much writing in large courses such as Economics 101 and Biology 174 poses a problem: professors don't have the time to thoughtfully critique 300 or more pieces of students' writing. That's where the automated text analysis tool, if effective, would come into play.

The ATA's algorithm applies techniques such as vocabulary matching and topic matching to identify strong and weak components of a student's understanding of the course material.
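The article does not describe the algorithm itself; as a rough illustration, vocabulary matching can be thought of as checking how many expected concept terms appear in a response. The sketch below is a minimal hypothetical, assuming an instructor-supplied set of concept terms; the function names, example terms, and threshold are illustrative and not drawn from M-Write's actual implementation.

```python
# Illustrative sketch of vocabulary matching, not M-Write's actual code.
# A response is scored by the fraction of expected concept terms it contains,
# and low-coverage responses are flagged for instructor follow-up.
import re

def vocabulary_match(response: str, concept_terms: set[str]) -> float:
    """Return the fraction of expected concept terms found in the response."""
    words = set(re.findall(r"[a-z']+", response.lower()))
    if not concept_terms:
        return 0.0
    return len(words & concept_terms) / len(concept_terms)

def flag_for_review(response: str, concept_terms: set[str],
                    threshold: float = 0.5) -> bool:
    """Flag a response whose concept coverage falls below the cutoff."""
    return vocabulary_match(response, concept_terms) < threshold

terms = {"mean", "median", "outlier", "skew"}
essay = "The median resists an outlier better than the mean."
print(vocabulary_match(essay, terms))   # coverage between 0.0 and 1.0
print(flag_for_review(essay, terms))    # whether to route to an instructor
```

A real system would need stemming (so "outliers" matches "outlier"), weighting of terms, and topic modeling on longer passages; this sketch only conveys the basic idea of matching a response against expected vocabulary.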

“We will be using automated text analysis to identify particular features of writing that provide information about what the student does or doesn't understand," Schulz said. "Once we identify a subset of students that have a particular misconception about a topic, we can arrange for an instructor to send them a message that says, 'We noticed you wrote this about this topic — have you considered this?'"

The ATA aims to add another level of review for students' writing. After it generates an automated predicted score, the student's work will be sent to ECoach, a personalized education program. Then a writing fellow, a student who previously did very well in the course and assists in reviewing the assignment, will verify the ATA score.

Anne Gere, director of the Sweetland Center for Writing and one of the developers of the ATA tool, wrote in an email that identification of problem areas was one of the biggest benefits of the ATA.

“Both students and instructors benefit when the ATA can identify gaps in learning before the exam," she wrote. "When the ATA is fully functional it will provide feedback to both students and faculty; students will know what they need to study, and faculty will know what to emphasize in lectures.”

The University of Michigan is one of the first institutions to implement such a program. Michigan State University has a program called Automated Analysis of Constructed Response, which analyzes very short pieces of writing — around one to two sentences — but not longer, more involved works.  

“To my knowledge we are the first to do this with much longer pieces of writing that contain multiple ideas, which are synthesized into a coherent narrative,” Schulz said.

LSA sophomore Elaine Chamberlain, who recently took Statistics 250, was interested in the idea of adding more writing to the class this fall and was especially intrigued by the use of automated text analysis.

“I do think that writing helps to develop deeper and more critical thinking and can definitely provide a more well-rounded education," she said. "I think writing is something that is so subjective and everyone has their own style and voice, so I think that having human review is more accurate when it comes to gauging essays, but I get where the idea is coming from, and I can see how it would be helpful to start developing more advanced writing skills on a rudimentary level.”