David Harris: Legislating math
Of all the concerning things in the world of politics, legislating math has to be one of the wildest stunts politicians have tried to pull.
Most notoriously, a bill penned in Indiana back in 1897, later dubbed the “Indiana Pi Bill,” was introduced to establish by law a certain mathematical method of “squaring the circle.” It’s no accident that “squaring the circle” has become a euphemism for doing the impossible, because it is, in fact, impossible. For whatever reason, someone’s erroneous proof somehow made it into the legislature, and had it passed, the bill would’ve had strange secondary consequences, such as technically setting the value of pi in the state of Indiana at 3.2 by state law. Clearly, the laws of mathematics say otherwise.
It’s a ridiculous idea — math can’t be legislated, as it rests outside the domain of the law. Perhaps you could chalk that bill up to 19th-century naiveté, but even today we see attempts to legislate direct applications of mathematics: State legislatures in California and New York have introduced bills that would ban encryption on personal devices that law enforcement cannot break. Taking it a step further, United Kingdom Prime Minister David Cameron is on record as wishing to ban strong encryption entirely, which elicited responses from the technical community stating that Cameron quite literally “had no idea what he was proposing.” At its core, encryption is simply a widely available application of mathematics — so much so that Wikipedia co-founder Jimmy Wales compared a ban on encryption to banning a form of mathematics itself.
Encryption is like a secret code, used to scramble messages or data in such a way that they cannot be read by any third party who does not have the decryption key. It’s publicly available in many different forms, and when properly designed — using large keyspaces and other cryptographic techniques — the code becomes extremely hard to crack even with millions of dollars of computer hardware. In the realm of technology, it’s used on individual devices to keep data private, usually to prevent access by a thief or other unauthorized party, but it’s also applied to the communications and Internet traffic of everyday users to prevent others from snooping on their activity. In a nutshell, addresses starting with “https” encrypt the transmitted data, while those starting with “http” are unsecured.
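To make the “large keyspaces” point concrete, here is a back-of-the-envelope sketch (the trillion-guesses-per-second attacker is a hypothetical figure, not from the column) of how long exhaustively guessing a standard 128-bit key would take:

```python
# Illustrative arithmetic: why large keyspaces resist brute force.
# Assumes a hypothetical attacker testing one trillion keys per second.
keyspace = 2 ** 128                      # number of possible 128-bit keys
guesses_per_second = 10 ** 12            # assumed attacker speed
seconds_per_year = 60 * 60 * 24 * 365

years_to_exhaust = keyspace // (guesses_per_second * seconds_per_year)
print(f"{years_to_exhaust:.2e} years to try every 128-bit key")
```

Even under these generous assumptions, the search takes on the order of ten quintillion years — which is why attackers target implementation flaws or legally mandated weaknesses rather than the math itself.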
It’s a major defense against tools used to commit crimes like identity theft, which the Department of Justice’s Bureau of Justice Statistics estimates affects 17.6 million Americans a year at a total cost of $15.4 billion. Along with a right to privacy, encryption has become the backbone of safely using devices on the Internet, but it’s this same privacy that has drawn the ire of law enforcement, as it prevents investigators from accessing possible evidence on devices used by perpetrators.
The debate between privacy advocates and law enforcement came to a head this past month when Apple published a customer letter announcing it would fight a court order to weaken the security on its devices so officials could access a phone used by one of the suspects in the San Bernardino shooting case. The FBI argued it would be a one-time deal, while Apple in its letter said such an action would set a dangerous precedent that would undermine the security protecting all of its customers. While one should be sympathetic to the cause of investigators in these cases, Apple is right.
Since October 2015, government authorities have also requested access to 12 other iPhones under the authority of the All Writs Act of 1789. Despite their insistence on needing Apple’s help, 11 of these devices run older versions of Apple’s iOS software with existing public vulnerabilities that would allow investigators to extract the data. This suggests that the FBI and other investigators aren’t just after the data: They are after a precedent that would either weaken encryption or insert a “backdoor” into it, granting them access to any phone.
On the surface, it seems OK: The authorities could seize your phone only with a warrant, and they would be the only ones able to break the encryption. However, such an arrangement carries great risk. It’s already incredibly difficult to write secure software, and writing it so that only the “good guys” can break through the security is something security experts say is somewhere between an enormous risk and outright impossible. If there is to be a solution to this seeming tradeoff between the right to privacy and the needs of investigators, one that weakens the privacy of everyday consumers and exposes them to attack by malicious actors cannot be it. And bills that go to even greater extremes, such as one being pushed in the United Kingdom that would force backdoors into both device and Internet encryption, would have devastating effects. Encryption is what makes so much of using the Internet, computers and smartphones possible. Weaken it or take it away and you’ve turned cyberspace into the Wild West, hurting only the everyday citizens who rely on their technology.
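The core problem with a “good guys only” backdoor can be shown with a deliberately insecure toy cipher (a sketch invented for illustration, not any real system): mathematically, a backdoor key is just another key, and it decrypts the data for whoever holds it.

```python
# Toy XOR "cipher" -- NOT secure, purely to illustrate that a key is a key.
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the repeating key; symmetric, so the
    same function both encrypts and decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

message = b"meet at noon"
backdoor_key = b"escrow"  # hypothetical key held "only" by authorities

ciphertext = xor_cipher(message, backdoor_key)

# The math cannot check a badge: anyone who obtains the key -- an
# investigator, a leaker, or a thief -- recovers the plaintext identically.
assert xor_cipher(ciphertext, backdoor_key) == message
```

The encryption itself has no way to distinguish a warrant-bearing agent from an attacker; whoever possesses the escrowed key gets the same access, which is why experts treat a universal backdoor key as a single catastrophic point of failure.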
One of the few congressmen with a background in computer science, Rep. Ted Lieu (D–Calif.), has introduced a federal bill that would stop states from instituting their own bans on the sale of encrypted devices. Such bans, despite the valid public safety concerns behind them, simply aren’t practical, as it’s impossible to stop the flow of technology across borders — even in places like China with strong laws against it. The technical community, including behemoths like Microsoft and Google along with figures like Bill Gates, has come out almost unanimously in support of Apple in this case. The Department of Justice, in a brief, even suggested that it could compel Apple to turn over its source code to the FBI, presenting a scenario in which the government itself could write in the backdoors (as it has already been accused of doing in many other pieces of software through the National Security Agency). The fact that these tech companies could be forced to become puppets of the justice system to this extreme should be alarming, and the technical community sees that.
For now, the courts have sided with Apple in a similar case, meaning Apple will not have to build exploits into its own systems for law enforcement’s use. While many lawmakers and citizens will look for nuance in the issue and insist there must be some middle ground between privacy and the law, the unfortunate reality is that in cases like these, when it comes to encryption, we face an all-or-nothing choice in which introducing backdoors or weakening standards is not a viable option. The cases are tragic, but using them as emotional catalysts to seize an opportunity to strengthen investigative powers in a way that harms everyday interactions with technology is far more authoritarian than the situation warrants.
David Harris can be reached at firstname.lastname@example.org.