Automated text analysis tool will help students in large courses develop writing skills


ANN ARBOR—A University of Michigan program built on the premise that students enrolled in large courses learn more if they write—as opposed to only taking multiple choice tests to show mastery of content—will expand this fall to add automated text analysis to its digital toolkit.

M-Write helps students develop their conceptual learning and writing skills in large-enrollment gateway courses. It's another program in the university's expanding portfolio focused on personalized education.

The first goal of M-Write's use of automated text analysis (ATA) is to identify and prioritize students who need help earlier in the term, said Chris Teplovs, lead developer at the Digital Innovation Greenhouse in the U-M Office of Academic Innovation.

Developed in partnership with the Digital Innovation Greenhouse and funded by the Third Century Initiative, the program is meant to give students in STEM (science, technology, engineering and mathematics) classes in the College of Literature, Science, and the Arts and the College of Engineering a stronger understanding of course concepts that can be applied elsewhere.

Students learn through writing, getting feedback from fellows and faculty on their written pieces, and by conducting peer review of their classmates' papers.

The program uses writing fellows, students who previously excelled in their given classes. The fellows help students develop initial drafts, make revisions, give feedback to others and utilize the peer feedback they receive.

In fall 2017, automated text analysis will be used in Statistics 250 to predict the overall score from a student's piece of writing.

After students write and submit their essays, ATA evaluates each essay, looking for the qualities of good writing that have been built into the algorithm. These qualities are examined using a variety of text analysis techniques, such as vocabulary matching or topic matching. ATA generates a predicted score, which is sent to ECoach, a personalized education program, for a writing fellow to verify. Only after this verification is the score made available to students.
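To give a rough sense of what a technique like vocabulary matching involves, here is a toy sketch in Python. This is an illustration only, not M-Write's actual algorithm: the function name, the scoring rule and the example concept terms are all invented for this example. It simply measures what fraction of a list of expected course-concept terms appears in an essay.

```python
# Illustrative sketch of vocabulary matching -- NOT the actual M-Write code.
# The idea: compare an essay's words against a set of concept terms the
# instructor expects a strong answer to use.
def vocabulary_match_score(essay: str, concept_terms: set[str]) -> float:
    """Return the fraction of expected concept terms found in the essay."""
    # Normalize the essay into a set of lowercase words, stripping punctuation.
    words = {w.strip(".,;:!?").lower() for w in essay.split()}
    matched = concept_terms & words
    return len(matched) / len(concept_terms)

essay = "The sampling distribution of the mean narrows as sample size grows."
terms = {"sampling", "distribution", "mean", "variance"}
print(vocabulary_match_score(essay, terms))  # 3 of 4 terms matched -> 0.75
```

A production system would of course go well beyond exact word matching (stemming, synonyms, topic models), but the output has the same shape: a numeric signal the algorithm can fold into a predicted score.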

"The writing fellow will serve as a checkpoint between [ATA] and the students," said Dave Harlan, principal developer of M-Write at the Digital Innovation Greenhouse.

ECoach will send students messages about what makes a good peer review and what makes a good revision of an essay, and will add those tasks to the to-do list built into the coaching program.

The ATA will identify which components of a given rubric an essay submission is strong or weak in. For each section of the rubric, ATA generates a numeric score, and each of these scores contributes to the overall predicted score.
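The aggregation step described above can be sketched as a simple weighted combination. Again, this is a hypothetical illustration, not M-Write's actual scoring code: the section names, weights and numbers are made up to show how per-section rubric scores could roll up into one overall predicted score.

```python
# Illustrative sketch -- NOT M-Write's actual scoring code.
# Each rubric section gets a numeric score; a weighted average combines
# them into a single overall predicted score.
def overall_predicted_score(section_scores: dict[str, float],
                            weights: dict[str, float]) -> float:
    """Combine per-section rubric scores into one overall score."""
    total_weight = sum(weights.values())
    weighted = sum(section_scores[s] * weights[s] for s in weights)
    return weighted / total_weight

# Hypothetical rubric sections and weights, scored out of 10.
sections = {"concepts": 8.0, "evidence": 6.0, "clarity": 9.0}
weights = {"concepts": 0.5, "evidence": 0.3, "clarity": 0.2}
print(round(overall_predicted_score(sections, weights), 2))  # 7.6
```

Exposing the per-section scores, rather than only the total, is what lets the system tell a student which parts of the rubric the essay was strong or weak in.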

"We're hoping to provide a second rater on the essay, providing that other set of eyes on an essay," Teplovs said. "The verification to us is very interesting because it not only gives the human graders a moment to pause and reconsider their assessments, but more importantly it provides a direct feedback loop to the algorithm development and allows us to create a better one."

Not only does M-Write draw from a writing-to-learn pedagogy, it also gives graders and professors a way to identify areas for improvement in their own teaching.

For one course prompt, the doctoral students analyzing the essays noticed a jump in essay quality between semesters: after reviewing the first set of essays, the professor had modified his approach to teaching those topics.

"Overall our goal is to improve student learning and a corollary of that is improving teaching," Teplovs said.

M-Write was created by Anne Gere, director of the Sweetland Center for Writing and professor of English language and literature and education; and Ginger Shultz, assistant professor of chemistry.
