Last week we learned that Facebook and Google have been paying people for access to almost all their personal data. Both companies ran research programs that paid participants as young as 13 up to $20 per month for access to their private messages, photos, emails, web history and location data. Facebook also requested screenshots of participants’ Amazon shopping history.
I’m not shocked. That would be naive. Programs like these are the obvious next step in our data-hungry world. A number of startups have emerged promising to empower individuals to “own their own data” and connect them with organizations that want to pay for it.
Proponents of data monetization often justify the practice by portraying it as a choice. Facebook, for example, has defended its program, saying users knew what the company was gathering and were compensated for their participation. Essentially, Facebook argues that participants made an informed choice to hand over intimate details about their lives for $20 per month.
The problem with the choice argument is that without protections for individuals and a clear enumeration of the rights they possess, there may be no real choice at all. Users frequently don’t understand what they are agreeing to when they hand over data, and the fear of leaving money on the table can sway their decisions. Furthermore, who enforces this exchange? What rights do you have to dictate how your data is used and whether it is resold? Should certain types of data never be for sale? We lack the regulatory frameworks necessary to begin answering these questions.
If pay-for-data programs are allowed to run unchecked, there could soon be two worlds of technology use: one for the rich and one for everyone else. The rich will be able to turn the money down; a little additional cash won’t be enough to convince them to let companies into their private lives. The same won’t be true for everyone else. As much of the world loses any sense of privacy, the privileged will enjoy the benefits: better voice assistants, more relevant recommendations and quicker commutes, all built on the personal data of the masses. Companies will profit from unprecedented targeted advertising opportunities. Without steps to ensure privacy is a right rather than a privilege, the gap between rich and poor will widen further.
To avoid this dystopian future, the U.S. should codify people’s data rights and implement a regulatory-enforcement system to uphold them. Only then can individuals represent themselves fairly in any data monetization transaction. Outlawing pay-for-data programs isn’t realistic or necessarily desirable, but we can help ensure the exchange is fair by truly empowering individuals with privacy standards that serve their best interests.
I envision a system in which procedures involving personal data are similar to the current procedures for research on human subjects. In research with human subjects, participants must clearly understand the process and its potential consequences. Additionally, research is audited by an Institutional Review Board charged with protecting the welfare, rights and privacy of human subjects.
If a similar system were implemented for the data economy, we could preserve innovation while protecting individuals. Your level of privacy would not be determined by your level of privilege, and you could make truly informed choices about when to give that privacy up. Companies would still have avenues to collect valuable data, but we could ensure citizens are not exploited for intrusive digital experimentation.
Empowering individuals to control their data is a noble cause, but monetization without a general privacy framework may have dire consequences. Without proper protections, the vulnerable may be exploited under the guise of free will, pressured into transforming their most personal information into fuel for the digital revolution. Monetization may be a useful tool someday, but until we answer basic questions about data privacy, it’s a harmful mirage.
Chand Rajendra-Nicolucci can be reached at chandrn@umich.edu.