Virginia Eubanks, author of “Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor,” spoke to students and faculty Thursday afternoon about the role of technology in policing poor communities.
Eubanks is an associate professor of political science at the University at Albany, SUNY. Her writing on technology and social justice has appeared in The American Prospect, The Nation, Harper’s and Wired. The School of Information, Poverty Solutions, and the Science, Technology, and Society program co-sponsored the event as part of the Ford School’s Policy Talks series.
Eubanks began by acknowledging the risk taken by the people whose stories are highlighted in her book, all of whom relied on social services to meet their basic needs.
“Much of this work is made possible by their generosity and courage,” she said.
Eubanks described how handing greater discretion to technology and algorithms in social services has given rise to a “digital poorhouse,” citing the history of poorhouses in the United States during the 19th and 20th centuries. According to Eubanks, poorhouses were government-run facilities for the homeless and needy, and some residents were required to give up their rights in exchange for access to resources. Eubanks said Ann Arbor was home to a poorhouse on the site of what is now County Farm Park on Washtenaw Avenue.
“Dead bodies were given to the University for dissection if their families did not claim them within 24 hours,” Eubanks said. “So, this is one of the great fun things I get to do in every new town I go to: look up where your poorhouse was and what its story was.”
Eubanks extended the metaphor, discussing how poorhouses live on in the legacy systems behind digitized social services, which collect information on the people who use them. Eubanks said these digitized services promote austerity — the idea that there aren’t enough resources to help everyone, so we must choose whom to help — and strip the emotional weight from decisions about which poor people deserve government support.
“At their heart is this decision that we made back in the 1820s that public service programs are moral thermometers separating the deserving from the undeserving, rather than as a universal floor under everyone,” Eubanks said.
Eubanks said the decisions made by these algorithms affect real people, but she also stressed that social workers have become more responsible for maintaining computerized lists than for considering the families who are impacted.
Another problem Eubanks addressed is that the data social workers rely on comes from county and state public service programs. Eubanks said this is especially relevant in Child Protective Services cases, where working-class families are much more likely than their middle-class counterparts to be heavily surveilled.
“(The) limits of data set … sees risk and harm shaped by kinds of experiences poor and working-class people have with the state, (which creates) false positive problems, seeing risk of harm where no harm actually exists,” Eubanks said. “We believe these new digital tools are objective and neutral, but they actually just hide bias.”
To conclude the talk, Eubanks acknowledged that even the best intentions can still result in discrimination against the poor, especially with improper framing.
“The idea that we don’t have enough resources is a political choice,” Eubanks said. “We have to tell a different story about poverty. We tell the story that it is an aberration, but (the) reality is 51 percent of us will experience poverty. It is a majority condition.”
After the event, Public Health student Amanda Richman said the lecture touched on very broad topics but offered a lot of worthwhile information and graphic representations that helped the audience understand Eubanks’ points.
“It was really interesting in general,” Richman said. “I was interested in how to use data and models responsibly and coming to learn Eubanks’ perspective. I think the questions she answered were really broad and I liked that.”
Sean de Four, a lecturer in the School of Social Work, said he came to the event because he was interested in the implications of using technology to track the general population.
“What people should pay attention to (are) the things that are being tested on working-class people and if they actually end up moving to affect everybody, so you can think about how they use data. I think that was really interesting,” de Four said.