In Which I Write Digital Security Lessons Disguised As A Celeb Gossip Post
For those of you not in the know: last weekend a hacker posted pictures of several female celebrities to 4chan, because 4chan is terrible, and the photos were then picked up by Reddit, where they went wild.
The details are still murky; TechCrunch provides a summary here.
What we know for sure is pretty sparse at this point. Some of the photos are confirmed to be genuine. Some are disputed as forgeries by the celebrities that are in them. The majority of the photos were taken with Apple devices (unknown if there’s a correlation between the sort of device used to take the photo and whether or not the photos are genuine).
There are a couple of possible vectors for this attack, some of them more likely than others. Most likely at this point (my estimation) is that someone used a phishing attack (or simply guessed account credentials, like the Sarah Palin email ‘hack’ in 2008), and gained access to the backup copies of the data from Apple’s iCloud service.
From the TechCrunch story:
“Also, Apple’s “Forgot my password” system means that if you know the victim’s birthday and the answers to some security questions, you might gain access to their account. There is a LOT of information out there on celebrities, so coming up with ideas for passwords is entirely possible.
Once inside it’s not possible to see photos or videos which are automatically uploaded from your iPhone to iCloud but you can use software to download it all. Again, voila.”
Lesson One: Security Questions Are Terrible
Why are security questions terrible? In a nutshell: if a question is easy enough for you to remember the answer to, the answer is probably also easy to find on the Internet or social media.
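To put rough numbers on it, here's a back-of-the-envelope sketch. The answer-pool sizes below are my own illustrative assumptions, not measured data, but the orders of magnitude are about right:

```python
import math

# Rough, assumed sizes of the answer pools an attacker must search.
# These are illustrative guesses, not measured statistics.
common_pet_names = 1_000        # "What was your first pet's name?"
us_high_schools = 27_000        # "What high school did you attend?"
birthdays = 366 * 100           # a birth date somewhere in a century

# For comparison: a modest random password,
# 10 characters from a 62-symbol alphabet.
password_space = 62 ** 10

def bits(n):
    """Guessing entropy in bits for a uniform choice among n options."""
    return math.log2(n)

print(f"pet name:    {bits(common_pet_names):5.1f} bits")
print(f"high school: {bits(us_high_schools):5.1f} bits")
print(f"birthday:    {bits(birthdays):5.1f} bits")
print(f"password:    {bits(password_space):5.1f} bits")
```

A security-question answer offers maybe 10 to 15 bits of guessing space against roughly 60 for even a mediocre random password. And that's the *best* case, before anyone looks anything up. For a celebrity, the answer is usually a Google search away, which drops the effective entropy to about zero.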
Lesson Two: Unencrypted Backups Are Terrible
In the IT biz, data security is a serious problem. One of the difficulties in managing user data is that you typically don’t know what the users are storing, you don’t know how important it is (spoiler alert: some of it is really important), and you don’t know if they’re actually following your own organization’s data policies (spoiler alert: they are not).
Some administrative assistant probably has the CEO’s social security number, credit card information, and home address sitting in plain text in their email Inbox. From that one time that they needed their login information for the 401k web site so that they could change their contribution limits before a deadline, or something.
So when you back up your users’ data, you have to assume that it’s all pretty important. It’s not just the HR database or the financial system’s data store that is important.
At an enterprise level, I can make this decision for you. I can plunk all your data on a file server somewhere with some encryption rubbed all over it, and only I and the other tech staff with the right access know the key. So we can lose our backup tapes, and (unless we’ve rubbed terrible encryption on the data), nobody can decrypt the data. Someone can still attack your individual machine, of course, but it’s hard for someone to get *everything* in one attack.
The trouble with this is that I’m trained to keep good track of those sorts of keys. The average Joe or Jane is not.
Cloud-based data backup systems designed for end-users typically are going to be pretty insecure, because the average Joe or Jane doesn't want to deal with encryption keys. They lose the keys, or they lose or forget their passwords. In a really solid encryption system, if you lose your key, the data is gone forever.
There’s no “send me my password”. There’s no way to reverse the process… absent the key, the data sits there encrypted until someone invents a quantum computer capable of sufficient operations to explode modern cryptographic techniques.
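A toy sketch of why this is so unforgiving. This uses a one-time pad purely for illustration (real backup systems use AES or similar, not this), but the property is the same: with the key, decryption is trivial; without it, the ciphertext is indistinguishable from noise, and no amount of cleverness brings the data back.

```python
import secrets

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """XOR the data with a random key of the same length (a one-time pad).
    Returns (key, ciphertext). Lose the key and the data is unrecoverable."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    """XOR again with the same key to get the plaintext back."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

backup = b"CEO's SSN, sitting in an inbox somewhere"
key, blob = encrypt(backup)

assert decrypt(key, blob) == backup        # with the key: fine
wrong_key = secrets.token_bytes(len(blob))
assert decrypt(wrong_key, blob) != backup  # without it: random garbage
```

There is no third branch in that code, and that's the point: a system with a "forgot my key" recovery path is, by definition, a system where someone other than you can get at your data.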
For most folks, “being able to get my data back after I haven’t treated it securely” is a feature, not a bug. Sorry, but… it’s a bug, not a feature.
Not taking the time to weigh future costs against current convenience is pretty human-nature stuff, but those future costs will eventually bite you in the ass.
Don’t use a backup feature that doesn’t encrypt the data. If you don’t know if it does or not, odds are almost certain it doesn’t.
Lesson Three: Privacy Is For Other People
Jezebel’s weigh-in on the story, linked above:
As The Daily Dot reports, Reddit’s own use policy would seem to prohibit posting this sort of thing to Reddit:
Reddit’s site-wide rules forbid the posting of “personal information,” which these photos certainly seem to constitute. Posting “publicly available” information on celebrities is acceptable—but “it is not okay” to post links to “screenshots of Facebook profiles,” or anything potentially “inviting harassment.” If a user’s Facebook photos are a no-go, then it seems implausible in the extreme to suggest that stolen intimate photos could not also be considered “personal information.” Despite this, links to previous leaked photos shared on the site—including of Demi Lovato and Jennette McCurdy—remain live, several months later.
From Reddit’s policy directly: “Posting personal information will get you banned. Posting professional links to contact a congressman or the CEO of some company is probably fine, but don’t post anything inviting harassment, don’t harass, and don’t cheer on or vote up obvious vigilantism.”
Perhaps a more accurate version of this lesson is, “Privacy Is Something Web Sites Will Champion Up Until It Might Cost Them Site Views”.
Although it appears that someone at Reddit has taken some crisis management training and done the right thing:
The Mod team is well aware of the leaked pictures of Jennifer and we’ve decided to remove all of them and continue to keep this sub clean of them.
These pictures are available to view in lots of places just not here.
Sincerely, The Mods
… this isn’t a site-wide phenomenon:
Alright, here’s the deal. Everyone knows about the massive hack that has led to the explosion of private celeb nude photos ending up on reddit.
As of right now, we are still allowing these pics here in /r/Celebs.
Regardless of what you think, it seems pretty scummy (to me, at least) to allow this stuff here since it was obtained without the consent of the women involved.
A lot of the celeb-specific subs (/r/KateUpton, /r/JenniferLawrence, etc) are not allowing these images, and that’s fine. But for right now it is allowed here in /r/Celebs.
Also, there have been a lot of people complaining lately about posts being removed. With today’s insane traffic, the mod team is working their asses off to keep duplicate posts from showing up. Really, there have been at least 100 reposts of each pic/album today. Probably even more than that!
Lesson Four: Practical Advice Is Not Always Victim Blaming, But Making That Advice In A Public Forum, After An Event Like This, Makes You Look Like A Dick On Twitter.
Look, let me say what Ricky Gervais got correct:
Systemically, our consumer-grade information services are horribly insecure. I could link to the stories, here, but I’d probably die of exhaustion before I could even begin to attempt to provide justice to the subject.
Facebook’s security is terrible. iCloud’s security is terrible. Your underlying operating system’s security is terrible, whether you use a Mac or an iDevice or an Android device or a Windows desktop or a Linux distribution.
No, Macs aren’t “more secure”.
Your hardware is insecure. Your phone is insecure. Your telecommunications network is insecure.
Your wifi access point’s security is laughable. Your bank’s is, too. Paypal is better than most, but still bad. eBay is bad. Your passport is hackable. Your trusted ID isn’t trustworthy.
They are all really, really bad.
I CANNOT STRESS ENOUGH HOW BAD IT IS.
There are basically only two things protecting you from direct consequences here.
The first is that there are a lot of you, and so the odds that you’ll be affected by any one particular data breach aren’t absolutely horrible.
The second is that any data breach that’s large enough to affect you is probably going to hit enough other people that it would be a PR nightmare for the company to do nothing about it.
When you trust your information to these systems, you are doing so blindly, probably with no real assessment of the risks involved. Most end-user agreements specifically disclaim any liability when you use these systems. You are agreeing that the company providing you the service can do so without any reprisal except bad publicity. If the publicity is bad enough, or it potentially affects the company’s bottom line, they may go to bat for you.
But they are under no legal obligation to do so, and that’s something that you agreed to when you “clicked here to continue”.
I’m not saying that’s the way things ought to be, but that’s the way they are. Unless and until somebody torts the bejeezus out of one of these services and wins (and the track record on that is not great, as SCOTUS hasn’t expressed much of a desire to hear cases about EULAs)…
…or you see a real data privacy law with teeth in the U.S. (which is probably a real necessary precondition to the tort option getting off the ground)…
… you are swimming with sharks. No, that’s not fair to the sharks, sharks aren’t that dangerous.
The only thing preventing you from being a victim of consequence is pure luck. And if you’re famous, you’ve got a nice, big, fat target painted on your sexy selfies.
I don’t say this to say, “Jennifer Lawrence shouldn’t have a right to privacy”, because she should have the same privacy rights as all the rest of us. I’m a fan of privacy, for whatever that’s worth.
I’m making an empirical observation of the state of digital security systems, the incentives built into the systems to prevent any real liability on the part of the institutions providing the systems, and the lack of incentives directed at the users to even know what they’ve gotten themselves into.
We don’t have privacy, we don’t have security, and the reason why is that we don’t want to pay for it, in either money or time.
And that will not change until we decide that we want to pay for it, in both money and time.
Because instituting even reasonably secure systems for things like this costs users time, and it costs companies money, and they will pass through those costs to the users… and they won’t even offer the services unless there are sufficient users out there who are willing to spend the time to use reasonably secure systems.
There aren’t sufficient users out there.
I was at a security conference once a while back and a CSO of a large financial institution was talking about online banking during a panel discussion.
He was talking about all of the problems with online banking… you can’t trust that the user’s terminal is secure, it’s hard to manage the user’s identity authoritatively, it’s hard to have a sufficient corpus of transactions to do real transaction tracking, etc. Someone from the audience said, “Well, it sounds like securing a banking website is basically impossible, so why do you offer the service at all?”
One of the other panel members was a biz ops guy from a different financial institution, and he piped up, “because if we don’t offer online banking, our customers will leave and go to competing organizations that offer online banking. Period. Whether or not we can make it secure, the customers demand that we provide it,” to which a legal guy added, “and since we can’t make the users use the service securely, we have to protect the financial institution from liability”.
Organizations respond to incentives.
This leads us to…
Lesson Five: There Is No Security
Regardless of whether you’re famous, or wealthy, or neither… somebody out there is trying to hack into some system that has some data in it that could be used to make your life miserable… and theirs moderately-to-immoderately more lucrative.
If you’re famous or wealthy, though, multiple somebodies are trying to hack your stuff right now. If you have digital data that you’d be embarrassed about, you’re going to have to pro-actively learn why it’s hard to protect this data and take real steps to protect yourself, or you’re going to have to develop a thick skin, because that data is going to be out there.
(image credit: via E! Online.)