These past few months, The Michigan Daily readers have probably become a bit more mindful when using Facebook. Following the Cambridge Analytica scandal, people were outraged. So, we started clicking away at our profiles' privacy settings, congratulating ourselves on our newfound concern for the data collected on us, and maybe we even stopped liking things so that advertisers wouldn't know our interests. Indeed, Facebook did a lot of outrageous things with our data, like allowing housing advertisers to target their ads only to white people and employers to advertise jobs exclusively to young people. We might argue that our beloved social media platforms will slowly wash away their sins as journalists discover and report the misuses of their algorithms, seemingly holding Facebook and the like accountable. Is that the whole story, though?

With every piece of data collected about each one of us, and each algorithm that crosses our individual online trails, we become entangled in a system that takes ownership of our private lives. Algorithms and data collection form sticky webs around our online and offline lives, webs that we both knowingly and unknowingly get stuck in. Once we click the "turn off" button in our privacy settings, it might feel as if we own our data once again, as if we have liberated ourselves from some of the algorithmic webs laid out all around us. But what about people with less education, those who don't know enough about privacy, people who might not even be able to navigate Facebook's privacy settings? They are still caught in that web.

While some of us might be worried about advertisers collecting data on our love for burritos and our favorite series, others, who might not even know what to look out for, are often caught in a stickier, denser region of the web: one in which it's not only Facebook and Netflix collecting their data.

At the beginning of this year, I came across an article published by ProPublica in 2016 about software used across the U.S. to collect data on defendants, process it and produce a score. The score would predict people's futures as criminals, and it was biased against Black defendants. My inner geek, who loved the idea of using math to predict future events, was terrified by that article for a long time. So, I went on a journey via Google to discover what other algorithms were being used by the government. During my search, I came across a website filled with tips on possible algorithms used by the government. Then, I got stuck. I realized I knew nothing about how these algorithms worked. Where did they collect the data from? Did they have data on everyone? How could you even get your hands on these algorithms and study them?

Thankfully, Prof. Virginia Eubanks blessed me (and all of us) with her book, "Automating Inequality," which brings me back to the algorithmic sticky web we are all caught in. Her book shines a light on the denser, hidden parts of the web where people from low-income families might find themselves caught. She delves into three algorithms used in different states: one that decides citizens' eligibility for welfare programs, one that assigns scores to people experiencing homelessness, and one that tries to predict child maltreatment in families. Read that last sentence again. All of them focus on low-income citizens. Why? Because the government can only collect data from those who access welfare programs, adding them forever to a database and creating a score out of their past in order to categorize their future.

Still, as enlightening as books can be, they rarely offer us a step-by-step guide to taking action. We are only college students and, for a lot of us, the closest we have come to algorithms is the recommendations offered by Amazon. Some of us, like myself, are not even American citizens, and so we naturally start wondering whether all this algorithmic bias isn't better suited to a Wired article or a late-night talk in the dorms than to a wake-up call for protests. I have definitely faced these questions, and I am still navigating them, but I am looking at all of us when I say this issue needs to be discussed and solved. Math and engineering students, take a break from your classes and take the time to understand a world where social justice takes precedence over the power of numbers and coding. Humanities and social science students, take a break from writing your papers, because if you don't intervene, concepts like justice and fairness will end up shaped by a score given by an algorithm. School has just started, so don't get so busy with your coursework that you forget to take in the influence your work might have on a society changing at the pace of an algorithm giving you a score.

Anamaria Cuza can be reached at anacuza@umich.edu.