On Thursday afternoon, approximately 50 students and faculty gathered in North Quad Residence Hall to hear Tarleton Gillespie, principal researcher at Microsoft Research New England and affiliated associate professor at Cornell University, discuss content moderation in social media. The talk, hosted by the Center for Political Studies, the School of Information and the Communication Studies department, was the final installment of the “Ethics and Politics of AI” speaker series.

Gillespie began the discussion by arguing that content moderation needs to be reimagined. Moderation of content on social platforms, he said, should be understood as central, conditional and constitutional to platforms and their function.

“Rethinking is really difficult because, in part, moderation has appeared to be and has been narrated as something that is peripheral to what platforms do,” Gillespie said. “We spent a long time thinking of moderation as a side project.”

He emphasized that moderation does occur, even though it is invisible to most users. That invisibility, Gillespie said, is sustained by the idea that content is about community and participation.

“The easy point is that moderation is central,” Gillespie said. “All platforms moderate … there is still an existing belief that somehow platforms are unmoderated … That is false. All platforms are moderated, and they always have.”

Speaking about moderation as conditional, Gillespie noted that a platform must be moderated to be considered a platform at all, because a platform is by nature curated, organized, archived and supervised.

Gillespie went on to argue that moderation is unsolvable, in part because of the platforms themselves and in part because of the nature of moderation itself.

“The problem of moderation, as it’s being imagined, is seen as enormous, and that’s in part because these platforms are enormous,” Gillespie said. “The contrast between a kind of data-scale thinking and an intimate scale for how we experience moderation may be unresolvable.”

Gillespie described current efforts to keep pace with the sheer quantity of content, which rely heavily on outsourced moderation work, often performed under unhealthy labor and psychological conditions.

“It’s not the 10 to 60 people sitting at the platform,” Gillespie said. “It’s an enormous set of people who are enlisted in the process of helping to decide what belongs and what doesn’t.”

Gillespie discussed automating content moderation as a solution to the volume of content, but he was skeptical of automation through AI given the current limitations of the technology.

He did say, however, that automation is fundamental to successful moderation, though it would need to be an automation of the people employed as moderators.

“The dream is, if we’re going to have ten thousand people looking at content, we don’t have perfect adjudication, we don’t have justice, the best we can have is consistency,” Gillespie said. “But consistency requires getting all those people to act and think the same way.”

Gillespie said the goal is a procedural, mechanized solution that can be enacted by people or software. The way forward, he said, is deliberation over what has public value and what should be moderated.

“If moderation isn’t just a problem of the big platforms, but is actually in every point in the flow of information,” Gillespie said, “we need a policy of a more networked notion of responsibility for a more networked ecosystem.”

Rackham student Sriram Mohan was drawn to the event because he used Gillespie’s book, “Custodians of the Internet,” as a text in a course for which he was a graduate student instructor last spring. He was interested in Gillespie’s discussion of how to push platforms toward deliberative moderation.

“His point is that, in the U.S., that has traditionally been the carrot and stick sort of model, where the stick is, ‘we will regulate you, the government will regulate you,’” Mohan said. “In post-colonial sort of democracies, the imagination of the state regulating there has a different sort of implication, and the sort of threat model has a very different valence there.”

Rackham student Padma Chirumamilla was interested in Gillespie’s points about the labor that goes into moderation.

“I think this book in particular, and like that name, ‘Custodians of the Internet,’ evokes how much hidden and poorly compensated labor is, like, really necessary to make these platforms as barely functional as they are,” Chirumamilla said. “Even that rests on the backs of underpaid or minority workers, who have to go through some pretty terrible kinds of psychological burdens to keep these platforms running.”
