New School of Information Center for Social Media Responsibility looks to combat trolling and fake news

Tuesday, March 13, 2018 - 7:31pm

Students know it is almost impossible to visit a social media site today without coming across an example of “trolling” or “fake news,” terms that have become prevalent in the wake of the 2016 presidential election. The University of Michigan has positioned itself at the academic front of combating low news literacy with courses on campus and online, and is now making a bigger institutional commitment. The School of Information opened the Center for Social Media Responsibility last week, aiming to develop strategies that help social media makers, consumers and platforms fend off “trolls” and make online news sources more credible.

Information School Dean Thomas Finholt said that when he was a candidate for the dean position in 2016, one of the main planks of his platform was that the faculty had a responsibility to help make social media more productive, an idea that became the principal motivation for the center.

“The principal avenue to improve social media in my mind was to update the quality of public discourse so that it isn’t as corrosive and divisive as it has become,” Finholt said. “We’ve known, for over 25 years in some cases, a number of simple strategies that can be applied to make online conversations more sociable and less antagonistic, and it’s just a question of promoting those strategies and compelling the social media platforms to adopt them.”

University alum Garlin Gilchrist, the center’s executive director, intends to make Finholt’s vision a reality. A Detroit native, Gilchrist has worked for Microsoft and served as one of Barack Obama’s social media managers during the 2008 presidential campaign.

Gilchrist said that experience showed him the change the center is aiming for is not just possible, but important.

“It really showed me what was possible really early on with people using a social network to connect with others, and it showed me the potential for that, and that is informing me today when I look at how information is spreading and all that kind of stuff online right now, and how it's going to change in the future,” Gilchrist said. “So that foundational experience for me showed that it is definitely possible and important to understand how people connect, how they converse, and how they engage.”

Gilchrist and the center are already working on strategies based on algorithms created by U-M researchers that could be used within the year. Gilchrist said the center was a response to an important set of questions about the way people receive information and how that is evolving.

“How reliable is that information?” Gilchrist asked. “How healthy is the environment? How can we measure the level of toxicity or personal attack or aggression in the conversation, and how can we use that research to make tools, to make a set of recommendations for social media makers, social media consumers, and for the social media platforms themselves so we can really make our experience online healthy and productive?”

The algorithms created so far can measure the level of aggression or toxicity in a particular online conversation, and the center intends to share this information with social media platform companies so they can make their websites friendlier. Finholt pointed to a small change The New York Times made that improved its comment platform and related it to what the center is trying to do.

“The New York Times did an experiment where instead of giving people the option of thumbs up or thumbs down, they gave them a third option, which was simply ‘respect,’” Finholt said. “So you didn’t have to say you hated something or you loved something; it could be something that you didn’t agree with, but you liked the way the person had said it. And that simple intervention made a huge difference in the quality of the comment thread, and tended to extinguish some of the trollish behavior that you usually see on those comment threads.”
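The article does not describe the U-M algorithms in technical detail, but the kind of conversation-level measurement Gilchrist and Finholt describe can be sketched in miniature. The Python toy below is a hypothetical illustration only: the keyword lexicon, its weights and the function names (message_toxicity, conversation_toxicity) are invented for this sketch, and a real system like the center’s would rely on trained machine-learning models rather than keyword matching.

# Hypothetical sketch: a toy, keyword-weighted toxicity score for a
# comment thread. Illustrative only; not the center's actual method.
from typing import List

# Invented lexicon of hostile terms and weights, for demonstration.
TOXIC_WEIGHTS = {
    "idiot": 0.8,
    "shut up": 0.7,
    "stupid": 0.6,
    "liar": 0.5,
}

def message_toxicity(text: str) -> float:
    """Score one message in [0, 1] by summing matched keyword weights."""
    lowered = text.lower()
    score = sum(w for term, w in TOXIC_WEIGHTS.items() if term in lowered)
    return min(score, 1.0)

def conversation_toxicity(messages: List[str]) -> float:
    """Average per-message scores to characterize the whole thread."""
    if not messages:
        return 0.0
    return sum(message_toxicity(m) for m in messages) / len(messages)

thread = [
    "I see your point, but the data says otherwise.",
    "You're an idiot, just shut up.",
    "Let's keep this civil, please.",
]
print(f"Thread toxicity: {conversation_toxicity(thread):.2f}")

Run as written, the sample thread scores about 0.33, with the single hostile reply driving the result; averaging per-message scores is the simplest possible aggregation, and a production system would presumably weight factors such as who is being attacked and how a thread escalates.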

Information graduate student Samuel Carton, who is working on machine learning for the new center, said he feels the hate spread on these sites has seriously damaged political discourse in the U.S.

“Societally, we have this huge issue on social media where informal political engagement, among other kinds of interaction, is really hamstrung by a lack of stability, and by the prevalence of different kinds of harassment,” Carton said. “It really contributes to some of the problems with politics in this country, and it really drives people apart when you can’t have a political discussion online without it devolving into various forms of personal insult and other uncivil language.”

Finholt explained that, in the scheme of world problems, this is one where an average student can make a significant impact, which is one of the reasons he is tackling the issue as a responsibility of the School of Information.

“One of the key things is to recognize that there are many problems in the world where it can seem like whatever we do will make no difference,” Finholt said. “Recycling, or driving my car less, or taking one fewer flight, it may seem like that’s sort of a drop in the bucket. But with the behaviors around social media, particularly if we were to create norms around passing around information, there could be within the generation a profound normative shift where that kind of behavior starts to be shunned, kind of the way we feel about smoking in public or getting in the car without buckling the seatbelt. For a large part, those transformations are normative and have to do with campaigns to fix people’s behavior, sometimes very small behaviors.”

Finholt said he believed students could have a large impact on this issue by doing small things.

“Particularly because people are so frustrated with the quality of the public discourse, I think it could be motivating to suddenly discover that there may be very small things you can do that have large consequences,” he said.