A couple of weeks ago, a friend told me to apply for an internship at Palantir, a company that specializes in big data analytics. I scrolled through the job description, wrote a quick cover letter, and left my application untouched for a couple of weeks.

At the very same time, the Los Angeles police were stopping people in low-income areas without any clear reason. Manuel Rios, a 22-year-old who lives in one of the city’s gang-populated neighborhoods, has been stopped dozens of times since a police officer photographed him standing with a gang member. Rios was never part of a gang, but when the officer took his photograph, he was told, “Welcome to the gang database!” That “gang database” was created by Palantir in collaboration with the L.A. police to automate the targeting of supposed gang members. Once a person’s profile enters the database, it stays there forever, and it gives police officers an additional reason to constantly invade that person’s privacy. For Rios, who has spent most of his life doing his best to stay away from gang activity, it only draws him closer to it.

This is one example of the many collaborations Palantir is part of: from working with J.P. Morgan to spy on its employees, to having one of its employees help Cambridge Analytica exploit Facebook users’ personal data. As Guy Chiarello, J.P. Morgan’s former chief information officer, put it, Palantir turns “data landfills into gold mines.” The kind of “data landfills” he is talking about allow the company to build models for both police and immigration departments that tag citizens with labels such as “Colleague of,” “Lives with,” “Owner of,” and “Lover of.”

Palantir’s blog posts talk about its belief in regulations that govern data privacy and about its Privacy and Civil Liberties Team. Its actions, though, tell a different story. Only later did I find out that in a company managing hundreds of projects and thousands of employees, this particular team consisted of only 10 people. I deleted my internship application.

There is a lot of talk nowadays about algorithmic bias, which makes many people think there is something wrong with the algorithms themselves. In Palantir’s case, though, it’s not the algorithms that are the problem but the data the company uses and the way it chooses to use that information. Can we restrict the manner in which a company uses its data?

Hannah Fry, a mathematician who recently wrote a book on how algorithms affect our society, has been advocating for a regulatory government body to check the kinds of data and algorithms that companies deploy. At the moment, any company can use any data in any manner it wants. It’s the equivalent of letting any company put any substance in a bottle and sell it under any label. A regulator could work much as the FDA does: protecting companies’ intellectual property while ensuring that a product’s benefits outweigh its harms. Is there a way to do this with algorithms? Probably. But as long as Silicon Valley is on track to be the top corporate lobbying spender, companies will likely continue doing whatever they want with our data.

People are increasingly unwilling to share their data with companies. Governments are casting a suspicious eye toward companies that collect data from their customers. Right now, we live in a time when companies like Palantir generate a lot of profit by leveraging data that users and governments unknowingly give up. But what will happen five or 10 years from now? What will happen when citizens understand that their right to privacy is intertwined with the data they share? What will happen when more governments decide to protect their citizens instead of their tech-world benefactors? Will companies like Palantir even be viable?

It was only after a lawsuit that the Los Angeles Police Department released documents about the Palantir surveillance algorithm it was using. The result? The documents revealed that the algorithm perpetuates the disproportionately high number of arrests of Black Angelenos and other minority groups. Palantir understood the data it was using to develop this algorithm. It would have been able to predict this risk. But it didn’t. Until the government takes action and sets restrictions on the data these companies use, it is our responsibility to understand the risks of misusing data. It is our responsibility to keep questioning these companies (maybe even filing lawsuits against them), and to consider the ethical questions when applying to internships.
