Why the President Is Wrong on Encryption
Last week, President Obama took a position on the general subject of private-party encryption at the South by Southwest festival in Austin. This is relevant to Apple’s dispute with the FBI regarding the contents of a locked iPhone, which has come up regularly in various discussions on these pages. Quoting the President at some length:
The question we now have to ask is, if technologically it is possible to make an impenetrable device or system, where the encryption is so strong there’s no key, there’s no door at all, then how do we apprehend the child pornographer? How do we solve or disrupt a terrorist plot? If in fact you can’t crack that at all, government can’t get in, then everybody’s walking around with a Swiss bank account in their pocket.
If your argument is strong encryption no matter what, and we can and should create black boxes, that I think does not strike the kind of balance we have lived with for 200, 300 years, and it’s fetishizing our phones above every other value.
I suspect the answer is going to come down to, how do we create a system that, encryption is as strong as possible, the key is secure as possible, and it is accessible by the smallest number of people possible for the subset of issues that we agree is important.
The President is wrong across the board. Let me take things one piece at a time.
The question is not whether the technology is possible. The technology already exists. The day after the President’s remarks, the New York Times published a story about WhatsApp. WhatsApp, now owned by Facebook, provides an application that allows its users to send messages and make phone calls over the Internet. (I’m using “phone calls” here in the sense of streaming audio sent in both directions over the Internet; the conventional telephone networks are not involved.) More recently, WhatsApp added an option that allows users to encrypt their text and audio. WhatsApp encryption keys are stored only on the end-user devices. The Justice Department summarizes the situation this way: it has a wiretap order approved by a federal judge, but is unable to implement the order because of WhatsApp’s encryption. The encryption is strong enough to resist brute-force attacks, and the protocol precludes the existence of “back door” keys.
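The end-to-end property described above, keys generated and held only on the devices, can be sketched in a few lines. The toy below (Python standard library only) is not WhatsApp’s actual protocol; it just shows the shape of the idea: Diffie-Hellman agreement over a published group (the 2048-bit prime from RFC 3526), with a throwaway SHA-256 XOR keystream standing in for a real cipher. All names here are illustrative.

```python
import hashlib
import secrets

# Toy end-to-end encryption sketch. Each party generates its key
# material locally; only public values ever transit the server.
# The group is the 2048-bit MODP group from RFC 3526; the XOR
# keystream is a stand-in for a real symmetric cipher, NOT one.

# RFC 3526 group 14 prime (2048-bit), generator 2.
P = int(
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD1"
    "29024E088A67CC74020BBEA63B139B22514A08798E3404DD"
    "EF9519B3CD3A431B302B0A6DF25F14374FE1356D6D51C245"
    "E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7ED"
    "EE386BFB5A899FA5AE9F24117C4B1FE649286651ECE45B3D"
    "C2007CB8A163BF0598DA48361C55D39A69163FA8FD24CF5F"
    "83655D23DCA3AD961C62F356208552BB9ED529077096966D"
    "670C354E4ABC9804F1746C08CA18217C32905E462E36CE3B"
    "E39E772C180E86039B2783A2EC07A28FB5C55DF06F4C52C9"
    "DE2BCBF6955817183995497CEA956AE515D2261898FA0510"
    "15728E5A8AACAA68FFFFFFFFFFFFFFFF", 16)
G = 2

def keygen():
    """Generate a private/public key pair entirely on-device."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(my_priv, their_pub):
    """Derive a symmetric key from the Diffie-Hellman shared secret."""
    secret = pow(their_pub, my_priv, P)
    return hashlib.sha256(secret.to_bytes(256, "big")).digest()

def keystream_xor(key, nonce, data):
    """Toy stream cipher: XOR the data with a SHA-256-derived keystream."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# Alice and Bob keep their private keys on their own devices.
alice_priv, alice_pub = keygen()
bob_priv, bob_pub = keygen()
key = shared_key(alice_priv, bob_pub)
nonce = secrets.token_bytes(16)
ciphertext = keystream_xor(key, nonce, b"meet at noon")
```

A relaying server sees only `alice_pub`, `bob_pub`, `nonce`, and `ciphertext`; without one of the private values it cannot derive `key`, which is exactly why a wiretap order served on the company yields nothing readable.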
Walking around with a Swiss bank account in my pocket? D’uh! This is the promise of the digital age. The device in my pocket contains, or will soon contain, a library, a concert hall, a movie theater, a still camera with a near-endless roll of film, a video camera that records my family moments, and a dozen other virtual devices. It provides access to my bank accounts, my IRA, my medical records. More months than not, you read about another government agency or large corporation whose copies of those things were cracked. I damned well better have a Swiss bank account in my pocket – no one else is taking care of my data.
The past 200 years don’t count. That was the age of paper, when copies were hard to make and security meant file cabinets behind locked doors. The Postal Service was largely responsible for transporting sensitive information. That system made interception difficult, made it hard to conceal that interception had occurred, and came with stiff punishments for even casual violations of the privacy guaranteed by the government. In the contemporary world, government(s) decided more than 20 years ago in favor of open high-speed data networks with no inherent security mechanisms. The market was allowed to decide that enormous amounts of sensitive information would be stored on computers with operating systems for which actual security was an afterthought. Strong encryption is inevitable, because there’s no other way for businesses to protect information.
The government has a very poor record on limiting access to “need to know” agencies and individuals. Ed Snowden ruined his life demonstrating just how untrustworthy the security apparatus can be. There is no particular reason to believe that any backdoor keys will stay within the defined boundaries. That government agents won’t overreach. That giant corporations won’t give in when the feds lean on them. With regard to that last one, Apple is an exception. When the government applied pressure, or in some cases just made a request, AT&T, Verizon, and others rolled over.
I feel obligated to say something about child pornography. Producing it with live children ought to be a crime. Distributing it at second- or third- or fourth-remove from the production is a harder question. Mere possession is still more difficult. Chances are that some of us here have had such images stored on the computers or phones we own, either now or in the past. Most users set their browsers to cache images; most users set their browsers to allow promiscuous downloading of content by code that they don’t control; sh*t slips in. Is the computer on which I’m writing this part of some mad Bulgarian’s botnet? I suspect my computer is not running such software, but I’m not sure. One of Cain’s Laws™ says that your computer isn’t secure unless you know what software is running on it. This particular law has become impractical to follow. There are 225 processes currently running on my Mac and I don’t know what they all do. Who’s guilty of what if the Bulgarian mafia stores child pornography in obscure corners of my hard disk, and malware serves it up? If the Bulgarians use strong encryption to protect it?
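For the curious, the process census mentioned above is easy to reproduce. On a Mac it comes from `ps -ax`; the illustrative snippet below does the equivalent on Linux by reading `/proc`. Either way, the count tells you what is running, not what any of it does.

```python
import os

# Count running processes on Linux by listing numeric entries in /proc.
# (On macOS, where /proc does not exist, `ps -ax | wc -l` gives the
# analogous count.) Knowing the count, or even the names, is a long
# way from knowing what each process actually does.
pids = [name for name in os.listdir("/proc") if name.isdigit()]
print(f"{len(pids)} processes running; can you vouch for every one?")
```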
This is not a subject on which I’m amenable to any sort of compromise. Either I can have strong encryption for my data, or I can’t. Yes, strong encryption makes law enforcement a more difficult task. But there’s no such thing as encryption that’s “a little bit” weak. The President’s position appears to be that I can’t have it, that the government must be provided with back doors so they can listen to my encrypted audio and read my encrypted files. In one very real sense, though, he’s wrong about that, too. He and Congress can make it illegal for Apple to provide me with strong encryption. They can make it illegal for me to use strong encryption. But short of prison, they can’t stop me from writing code to implement it on my own. The genie is out of the bottle.
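That last claim is not bravado. A one-time pad, the textbook case of encryption with no “little bit weak” version, fits in a dozen lines of Python’s standard library; with a truly random pad, used exactly once and as long as the message, there is provably no back door to mandate, because every plaintext of the same length is an equally likely decryption.

```python
import secrets

# A minimal one-time pad: the pad must be truly random, kept secret,
# at least as long as the message, and never reused. Under those
# conditions the ciphertext is information-theoretically secure.

def make_pad(length):
    """Generate a random pad; keep it secret, use it once."""
    return secrets.token_bytes(length)

def xor(pad, data):
    """One-time-pad encryption and decryption are the same XOR."""
    if len(pad) < len(data):
        raise ValueError("pad must be at least as long as the message")
    return bytes(p ^ d for p, d in zip(pad, data))

message = b"attack at dawn"
pad = make_pad(len(message))
ciphertext = xor(pad, message)
recovered = xor(pad, ciphertext)
```

The catch, of course, is distributing and protecting the pad, which is why practical systems use ciphers like AES instead. But the point stands: the math is public and the code is trivial, so a ban on products reaches only products, not the technique.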