Cut The Crap, Apple, And Open Syed Farook’s iPhone

Will Truman

Will Truman is the Editor-in-Chief of Ordinary Times. He is also on Twitter.

81 Responses

  1. Road Scholar says:

    As near as I can tell from Tim Cook’s statements, he seems not to have complete confidence in Apple’s own internal security since he expressed concerns about such an update/hack getting loose in the wild.

    Otherwise the objections appear to be centered around corporate PR expressed in abstract philosophical language. The most cogent objection, IMHO, revolves more around the notion of the government ordering Apple to create a piece of software that doesn’t already exist. I can’t point to a principle behind it, but I would be more sympathetic if we were talking about an individual rather than a corporation.Report

    • Glyph in reply to Road Scholar says:

      I would be more sympathetic if we were talking about an individual rather than a corporation.

      I know “corporations are bad” and “exist only at the behest of the people” and all, but look at it this way – being unsympathetic to Apple’s position here allows both the power and know-how of the government, PLUS the power and know-how of a wealthy megacorporation, to be trained against a single criminal defendant.

      Does that seem like a place worth going, just to put corporations in their place? Seems to me like cutting off the nose to spite the face.Report

      • Doctor Jay in reply to Glyph says:

        Not a criminal defendant, but anybody who simply knows how to do something that the government doesn’t know how to do. I note that the criminal in question is dead. This is for a fishing expedition.

        So, do you support the idea that the government can compel a person (or company) to make something to help them with their intelligence gathering fishing expeditions? I sure don’t.Report

    • Damon in reply to Road Scholar says:

      Corporations are considered individuals in many ways, so…..Report

      • Glyph in reply to Damon says:

        I don’t even want to open that can of worms.

        But people who feel corporations get away with screwing the little guy, and thumbing their nose at the government too much, should IMO beware that their reflexive animus against corporations would, in this instance, bring the corporations under the government’s boot (yay!) so as to make that boot all the heavier when it comes down on the little guy (whoops!)Report

        • Damon in reply to Glyph says:

          Totally.

          I would, however, point out to those people that bitch about the corporations screwing them, that the gov’t is screwing them too, and most people, when given enough power, would screw them as well.Report

          • Morat20 in reply to Damon says:

            People concentrate on the power currently irritating them. Which really, in an educated populace, shouldn’t prevent them from remembering that the current irritating power might ALSO be protecting them from a more irritating power.

            Which doesn’t mean to bend over and take it — it means that you should be really careful with “raze it to the ground” rather than “reform it root and branch”. But angry people prefer the former — it’s far more satisfying, at least short term.Report

  2. Autolukos says:

    No, what the FBI wants to do is to create a precedent that requires OS makers to disable security features on demand. If they prevail they will be back, and whatever measures are taken to re-secure devices will be the next target.Report

    • Autolukos in reply to Autolukos says:

      Slate has a discussion of the politics of the FBI’s decision to press on this particular device.Report

    • Michael Cain in reply to Autolukos says:

      The more interesting question is what happens when Apple introduces the model 8-Secure (or whatever) that Apple can’t crack with software. Based on everything that’s been published in the last few days, it appears that the crypto co-processor’s boot loader (ROM that’s part of the IC mask) could detect that its firmware has changed and clear the user-space encryption keys. That’s occasionally a problem for the users — how often does Apple change the co-processor’s code? The questions then are whether the courts are willing to ban certain designs, whether the executive claims that power under authorizations already granted by Congress, whether Congress has to be involved, and whether Congress will do it.Report
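
      To make the boot-loader behavior described above concrete, here is a minimal Python sketch of the idea. The class and method names are invented for illustration and say nothing about Apple’s actual implementation.

      ```python
      # Hypothetical sketch: the co-processor records a digest of its own
      # firmware when the user key is created, and its ROM boot loader wipes
      # that key if the digest ever changes.
      import hashlib


      class SecureCoprocessor:
          def __init__(self, firmware: bytes):
              self.firmware = firmware
              self.trusted_digest = self._digest(firmware)
              self.user_key = b"\x42" * 32  # stand-in for the user-space key

          @staticmethod
          def _digest(blob: bytes) -> str:
              return hashlib.sha256(blob).hexdigest()

          def load_firmware(self, new_firmware: bytes) -> None:
              """Called when new firmware is pushed to the co-processor."""
              self.firmware = new_firmware

          def boot(self) -> None:
              """ROM boot loader: wipe the key if the firmware has changed."""
              if self._digest(self.firmware) != self.trusted_digest:
                  self.user_key = None  # keys cleared; user data unrecoverable
                  print("Firmware changed: user keys erased.")
              else:
                  print("Firmware verified: user keys intact.")


      cp = SecureCoprocessor(firmware=b"signed co-processor firmware v1")
      cp.boot()                                    # keys intact
      cp.load_firmware(b"modified firmware v1.1")  # e.g., a forced update
      cp.boot()                                    # keys erased
      ```

      A legitimate co-processor update would have to re-record the digest while the phone is unlocked, which is why the frequency of those updates matters.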

  3. When some reporter, writing about something in which he is not an expert, says “It’s as simple as this”, it’s rarely as simple as that. When the issue is political, it’s almost never as simple as that. And when he writes for a rag like the Federalist, the odds are astronomical.Report

  4. Mo says:

    Even though I agree with Apple’s stance, it does not appear that the FBI is entirely wrong.

    Apple had asked the F.B.I. to issue its application for the tool under seal. But the government made it public, prompting Mr. Cook to go into bunker mode to draft a response, according to people privy to the discussions, who spoke on condition of anonymity.

    This seems to indicate that Apple would have complied if the FBI didn’t tell anyone that they complied.Report

    • notme in reply to Mo says:

      If that is true, and I doubt it, then all the Apple talk that it really cares about privacy is just a lie. I mean, why would the FBI care if the application was under seal if they could get what they want? That makes no sense.Report

  5. Burt Likko says:

    It may interest you to know that I just completed a jury trial before the District Judge assigned to oversee this case. The magistrate who signed this warrant is in the courtroom right next door. So that’s kind of cool.

    Apple doesn’t have much of an argument here, I think. But they did hire Ted Olson (!) to handle the matter and I expect his argument will deal with something that Con Law geeks like me will think is cool and fun, and that militia and sovereign citizen types will misconstrue horribly.Report

    • Will Truman in reply to Burt Likko says:

      So that’s two lawyers arguing that Apple’s case here is weak. Maybe since you don’t write for The Federalist, Mike will consider your perspective.Report

      • Burt Likko in reply to Will Truman says:

        A big part of why I think Apple is destined to lose is a large number of licensing agreements and interstate communications regulations. My understanding is that there are already significant legal back doors obligating tech companies to cooperate with federal security investigations, as a condition of accessing streams of interstate communications. I might be wrong about that. But if it turns out such clauses are in the licensing agreements, it may turn out that Apple is effectively contractually obligated to cooperate.

        Failing that, the warrant in this case appears to be very narrowly tailored. The basic argument seems to be based upon a claim that Apple needs to come up with new code in order to accomplish the assigned task of rewriting the system in the subject telephone. But I thought that this software update already exists, and it is just a matter of deploying that already existing software. So the search warrant here does not strike me as materially different than a search warrant that has been issued for other kinds of electronic data, like email or databases. A general search of all emails or all databases is one thing, a tailored search is better, and complying with such a more tailored search does require the application of some kind of technical skill.

        If there wasn’t a warrant, we’d be having a very different discussion. But there is a warrant.Report

        • Will Truman in reply to Burt Likko says:

          So you think Apple’s attempts to quash the warrant on appeal don’t have much merit? The narrowness of the warrant, and that it did balance the needs of both parties, did seem to be a stronger part of Malor’s argument. I am a little twitchy on the potential ramifications of expansion if it occurs (though that’s not the magistrate’s problem).Report

        • Owen in reply to Burt Likko says:

          I believe you are mistaken about the compromised software already existing. Not sure whether that is relevant from a legal perspective though.

          The existence of a warrant does not really address the security design problems. The tool that Apple is being asked to develop would be generally applicable to many if not all other iPhones. Users could “encrypt” their data, but the government could gain access to it provided they can convince a judge. That is indistinguishable from every other “backdoor” that has been proposed, and it has the same security problems as those other proposals.Report
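
          A toy illustration of why that matters: the passcode itself provides almost no protection once the escalating delays and the ten-attempt erase are removed. The sketch below is purely illustrative (a real attack would run against the device, not a Python function), but it shows how quickly a four-digit code falls to exhaustive search.

          ```python
          # Brute-force a 4-digit passcode once retry delays and auto-erase
          # are out of the way. "try_passcode" stands in for submitting one
          # attempt to the device; the secret here is invented.
          import itertools
          import time


          def try_passcode(candidate: str, secret: str = "7391") -> bool:
              return candidate == secret


          start = time.time()
          for attempt, digits in enumerate(itertools.product("0123456789", repeat=4), 1):
              code = "".join(digits)
              if try_passcode(code):
                  elapsed = time.time() - start
                  print(f"Passcode {code} found after {attempt} attempts in {elapsed:.2f}s")
                  break
          ```

          With the escalating delays left in place the same search takes vastly longer, and with the auto-erase left in place it never finishes at all; those two features are essentially the whole defense, which is why a signed build that removes them works against any phone it can be loaded onto.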

          • Michael Cain in reply to Owen says:

            Existing or not, we can be pretty darned sure that said software hasn’t been signed. The EFF has a different take on this, one that Bruce Schneier agrees with: the signature process at Apple is very tightly controlled — multiple steps at different locations, safes/vaults, officer-level observers. No one outside Apple, except perhaps an insurer, knows how much of the normal process has to be bypassed in order to sign an OS version that doesn’t come in through the usual chain, and has to be pulled out of the chain at an appropriate point.Report

            • Glyph in reply to Michael Cain says:

              God bless the EFF. Nice piece, thanks for the link. For those who don’t click through, the conclusion:

              Summary

              EFF supports Apple’s stand against creating special software to crack their own devices. As the FBI’s motion concedes, the All Writs Act requires that the technical assistance requested not be “unduly burdensome,” but as outlined above creating this software would indeed be burdensome, risky, and go against modern security engineering practices.

              Report

              • Kazzy in reply to Glyph says:

                Can @burt-likko or others explain the All Writs Act?Report

              • Will Truman in reply to Kazzy says:

                The article itself has a pretty good description.Report

              • Owen in reply to Kazzy says:

                The best article I have seen is this one which outlines some relevant cases where the act was applied or rejected.Report

              • Glyph in reply to Owen says:

                Also a nice piece, which touches on Kazzy’s question about whether or not Apple “owns” the phone:

                However, Bridy, the University of Idaho law professor, told Ars on Wednesday that Apple will likely attack each of the government’s arguments head-on. First, Apple probably will argue that it is, in fact, removed from the case at hand and should not be forced to assist.

                “Long ago when Apple put that phone in the stream of commerce, it gave up any proprietary interest in that phone,” she said.

                (The sidebar images of Lego storm troopers attempting to access iPhones are maybe a bit much, but frankly kind of hilarious).Report

              • Morat20 in reply to Glyph says:

                “Long ago when Apple put that phone in the stream of commerce, it gave up any proprietary interest in that phone,” she said.

                That’s not what their EULA says….Report

              • Glyph in reply to Morat20 says:

                Heh, yeah. And I commented elsewhere that cases like this might compel tech companies to reconsider their EULA position, since it leaves them potentially on the hook.Report

            • No one outside Apple, except perhaps an insurer, knows how much of the normal process has to be bypassed in order to sign an OS version that doesn’t come in through the usual chain, and has to be pulled out of the chain at an appropriate point.

              P J O’Rourke does.Report

            • But the Federalist guy says there are no issues other than Apple being objectively pro-terrorist. And he’s a lawyer and writer in Washington DC, so obviously he must understand low-level digital security forwards and backwards.Report

              • Well, that’s one way to read what he said. Another way is that the magistrate made an appropriate ruling and struck an appropriate balance in this case. Which seems to be Burt’s opinion as well as the magistrate’s.

                All of which could be wrong, but the notion that their views are disqualified from consideration is absurd. Even if one of them contributes to an ungood and unclean site.Report

              • He can have all the opinions on the law he wants. But his repeated insistence that there are no issues with Apple’s doing what the court ordered is simply ignorant.


                Second, the order requires Apple to explicitly restrict its software update so that it can only run on Farook’s iPhone and be both temporary and reversible.

                Software doesn’t come into existence because someone said “make it do that”. Nor does it necessarily do exactly what was originally intended.


                Claims that this order endangers the privacy of iPhone users are simply untrue.

                Creating a procedure to crack an iPhone ipso facto endangers the privacy of other iPhones.
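
                To make that point concrete: a hypothetical sketch of what “restricting” such a build to one handset typically amounts to in practice. The identifier and function name below are invented for illustration and are not Apple’s code.

                ```python
                # Hypothetical illustration: the device-restriction "safeguard"
                # reduces to a comparison like this inside the weakened build.
                # Re-pointing it at a different handset is a one-line edit,
                # which is why the procedure, once it exists, endangers every
                # other iPhone and not just this one.
                AUTHORIZED_DEVICE_ID = "EXAMPLE-UDID-OF-SUBJECT-PHONE"  # invented


                def allow_unlimited_passcode_attempts(device_id: str) -> bool:
                    """Disable retry delays and auto-erase for one device only."""
                    return device_id == AUTHORIZED_DEVICE_ID
                ```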

                Until this week, discussions about removing the auto-erase and delay features of passcodes have never considered such an alteration to be a backdoor. Uses of the term to refer to the order in this case are thus misleading.

                He’s now an expert on security-related jargon too. And, unsurprisingly, wrong. Any built-in way to bypass security is a backdoor.

                You can see from Michael Cain’s links that people who actually understand this stuff have come to very different conclusions.Report

              • His opinions on the law appear to be the court’s as well. Maybe other courts will come to other conclusions. In any event, given that this is a court case I found Malor’s (and Burt’s) explanations to be quite helpful in looking at the issues at stake.

                I think a lot of the counterarguments in the comments section here are quite good. The ones that actually address the statements in the article, I mean.

                On the technical end, I’m not sure. I hope that if it is as difficult as Apple suggests, a higher court finds it so. That Apple apparently seemed prepared to do it if kept under seal does make me wonder a bit, though.Report

              • Glyph in reply to Will Truman says:

                “That Apple apparently seemed prepared to do it if kept under seal does make me wonder a bit, though”

                Two things:

                1. I see nothing wrong with a company deciding that word getting out is bad PR, so now they won’t do it.

                2. As people note, the burden on Apple may increase exponentially once it is widely known they can and did do it. Other governments may now come to them, more cases. Something that was a headache to do once (but doable) can become untenable once the OPEN FOR BUSINESS sign gets turned on. I’ve done a favor for a friend at work, but told them don’t tell anyone, because I cannot do it for everyone or I’d get nothing else done.

                EDIT: I missed that you are talking strictly about the technical difficulty. Strike the above, as they refer to other considerations.Report

              • Will Truman in reply to Glyph says:

                I can completely understand why Apple would want to keep it sealed, for both reasons. But the vibe I’m getting from critics of the ruling* is that the ability to do this does not exist and setting it up would be an unreasonable burden. That’s not the same thing as “Having to do this regularly constitutes an unreasonable burden.”

                As Burt says above, future cases are not this magistrate’s problem. This was a specific ruling on a specific case. A lot of the “this would mean” claims are conjecture** on the basis of what future courts would do. Which is outside the scope of this ruling.

                * – To be clear, I am undecided on this case. I lean towards assuming that the magistrate’s ruling is probably the correct one. The concern I have is the precedent and the slope.

                ** – Not baseless conjecture! As I say in *, this is a concern that does resonate with me. I could easily see the next judge or magistrate using this ruling as a justification for something broader and less reasonable. OTOH, siding against Apple here might help avoid Congress passing a law that more explicitly gives the government more authority. It might be easier, for example, to pass a law saying “There always must be a backdoor” than it is to interpret existing law that way.Report

              • The fun case is still down the road, if/when Apple makes the decision to value security a bit more and, in next-gen versions of the iDevices, makes the on-silicon encryption co-processor boot loader wipe the encryption keys if its firmware is modified while the phone is locked. Yes, users lose data that wasn’t backed up if they forget their passcode. OTOH, software attacks, even with Apple-signed code, are no longer possible.

                See comment below. Bad guys who want strong encryption can already have it. The civil-rights question here is whether consumer devices are allowed to provide it.Report

        • Michael Drew in reply to Burt Likko says:

          Obviously I don’t have the knowledge to agree or disagree with Br. Burt here, but I do want to amplify his point about regulations coming in the eventual settlement of this.

          Which is to say, I suspect if Apple were to prevail on this, the response would be perhaps not swift but certain: Congress would simply update telecommunications law to prevent companies from producing phones whose data the producers wouldn’t be able to access in response to a court order, which would be the precedent set here. I don’t think it would be very complicated to do, and while there would be a fight about it, it’s a fight opponents of the change would certainly lose.

          That’s why I’m more just interested in the outcome than concerned about it.Report

          • Glyph in reply to Michael Drew says:

            “Congress would simply update telecommunications law to prevent companies from producing phones whose data the producers wouldn’t be able to access in response to a court order, which would be the precedent set here”

            And I wonder why we as a society see that legislative outcome as desirable, or inevitable.

            To me, that would be like mandating that producers of paper only manufacture paper which cannot be burned, obviating one of the very basic reasons some people would prefer to keep their records on a medium which IS concealable/perishable.

            Is it only criminals who would ever wish to hide or completely destroy data? Is there a valid argument somewhere that the government must ultimately be able to access every datum ever created or the Republic will fall (and if so, let’s hear it)? We’re OK with me holding a device that can shoot a human dead, but not one that holds data the FBI is unable to read? What the hell is wrong with us?Report

            • Oscar Gordon in reply to Glyph says:

              Is there a valid argument somewhere that the government must ultimately be able to access every datum ever created or the Republic will fall (and if so, let’s hear it)?

              This.

              I’m fine with a warrant meaning law enforcement can gain my data. I can even accept that a court can attempt to compel a person to give up a passcode. But we seem to be heading to a place where government lays claim to all data all the time because there might be some investigative value.

              Remember, the FBI has no idea what is on that phone. They suspect it may contain communications to other terrorists. It might just have a good recipe for falafel and cat videos.Report

              • Glyph in reply to Oscar Gordon says:

                Don’t worry, Oscar. Any concerns that we are heading toward a place where tech companies are compelled to provide access to anything ever for the purposes of omniscient, unblinking information-gathering are simply nervous-nelly slippery-slope alarmist thinking. There’s always a good reason why this case, in particular, has precedent that means it’s A-OK to be decided in the government’s favor; I’m sure the next one, or perhaps the one after that, will halt the slide.

                I’m donating to the EFF today. I encourage anyone who feels similarly to do the same.Report

              • Glyph in reply to Glyph says:

                Which reminds me of something – it always strikes me as slightly funny when people discussing legal cases dismiss slippery-slope concerns as invalid reasoning…when pretty much the entirety of our legal system is based on slippery-slopism.

                What else is the very concept of “precedent”, except something that is pointed to in order to simplify or justify the decision in front of us currently?

                Isn’t it obvious that legal precedent necessarily shapes the subsequent decisions, and therefore the overall vector?

                And as such, being concerned about setting a bad precedent (even if in this specific case, we are OK with the outcome) is therefore a valid concern, and something to be avoided when possible?Report

              • Stillwater in reply to Glyph says:

                On a more political level, I’m reminded of various discussions about Big Gummint data collection and NSA surveillance of communications and TIA and all the rest I’ve had over the years, where the argument (appears to be!) is that government has a right to all the data out there simply because it’s out there (“public” in some sense of that word), or because private companies having access to it implies that gummint has a right to it too. Or something.

                Seems to me that *that* type of logic presumes something that simply isn’t the case: that gummint possesses a prima facie right to engage in these types of activities, a right which requires defeating. Which makes no sense to me.Report

              • notme in reply to Oscar Gordon says:

                Remember, the FBI has no idea what is on that phone. They suspect it may contain communications to other terrorists. It might just have a good recipe for falafel and cat videos.

                True, however determining the contents of the phone is a valid line of investigation for law enforcement. Arguing that the cops shouldn’t be allowed to get the info on the phone is like arguing that the cops shouldn’t be allowed to question someone bc that person might not know anything useful.Report

              • Oscar Gordon in reply to notme says:

                I didn’t say that accessing the phone was not a valid investigative goal. The guy is dead, and we are pretty certain he committed mass murder. I have no problem with the FBI going over all his data with an electron microscope.

                My concern is with the precedent that may be set by requiring that Apple develop an exploit to crack the phone. It’s one thing to ask Apple to reset the iCloud password, something else entirely to ask them to crack their own data security*. Over what, a hope that there is something interesting on the phone? Perhaps if they had evidence that another attack was planned, and the phone had the details, I’d be more inclined to agree that Apple should do a one time exploit. But that does not appear to be the case. The FBI just wants to see who else the guy might have been talking to. I know a judge disagrees, but this is a hell of a lot of work for a fishing trip.

                *Especially when it comes to light that had the government not been massively stupid, the phone would have likely pushed everything of interest up to the iCloud in short order and none of this would be necessary.Report

              • Michael Cain in reply to notme says:

                But of the parties involved, Apple is the only one that seems to be looking ahead to the possibility that leads could be developed, and cases tried in court, where the contents of the phone are necessary evidence. In that event, it is entirely possible that the defense will be successful in forcing Apple to reveal a good deal of their internal methodology, provide source code for outside experts to examine, etc. It’s not like that hasn’t already happened this month.Report

              • Oscar Gordon in reply to Michael Cain says:

                Pretty much everything but the software signature, and if someone can figure out how to spoof or copy that…Report

              • Off on a tangent… I fence with a guy who is a consulting forensic engineer. We can tell when he has a court date coming up because that’s all he thinks about, and his fencing goes to hell.Report

              • Oscar Gordon in reply to Michael Cain says:

                Related:

                https://www.schneier.com/blog/archives/2009/05/software_proble.html

                That’s more than a few years old. I wonder what the legal state for such defenses is these days? Are breathalyzer companies forced to produce their source code?

                Something this talks about:
                http://www.macworld.com/article/3036102/legal/apple-versus-fbi-faq-explainer.html Report

            Congress would simply update telecommunications law to prevent companies from producing phones whose data the producers wouldn’t be able to access in response to a court order

            Of course they will, because law enforcement won’t tell them the truth: back doors in commercial phones might let us catch some stupid potential terrorists.

            In exchange for which, American citizens will have to live with the risk that assorted other bad guys will get their hands on the back-door tools and freely access lots of innocent people’s data. For the record, I use “bad guys” broadly and include overzealous law enforcement and national security personnel.

            This is a technology war that the FBI, even with Congress’ backing, can’t win. The strong-encryption genie is out of the bottle. For about $100, paying quantity-one prices, I can put together a computer that runs a non-commercial OS and provides strong encryption of all user data. For connecting to the network anonymously, that little box can use random spoofed Ethernet addresses on any of the zillion public wifi hot spots. It’s perfectly capable of strongly-encrypted real-time voice communication. Anonymous drop boxes and meet-me exchanges for voice calls are straightforward. Smart terrorists can already communicate securely, and laws requiring back doors in commercial products aren’t going to even slow them down.Report
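
            To illustrate the “genie is out of the bottle” point: strong encryption is already a few lines of freely available open-source code. A minimal sketch using the third-party Python “cryptography” package (the parameter choices and message below are illustrative, not a recommendation):

            ```python
            # Passphrase-derived AES-256-GCM encryption with stock libraries.
            import os
            import hashlib
            from cryptography.hazmat.primitives.ciphers.aead import AESGCM


            def encrypt(passphrase: str, plaintext: bytes) -> bytes:
                salt, nonce = os.urandom(16), os.urandom(12)
                key = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000)
                return salt + nonce + AESGCM(key).encrypt(nonce, plaintext, None)


            def decrypt(passphrase: str, blob: bytes) -> bytes:
                salt, nonce, ciphertext = blob[:16], blob[16:28], blob[28:]
                key = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000)
                return AESGCM(key).decrypt(nonce, ciphertext, None)


            blob = encrypt("correct horse battery staple", b"meet at the usual place")
            result = decrypt("correct horse battery staple", blob)
            assert result == b"meet at the usual place"
            ```

            A mandated backdoor in commercial phones does nothing about code like this, which is the asymmetry being pointed at above.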

            • Oscar Gordon in reply to Michael Cain says:

              This probably already exists, but if I was worried about the government getting my phone, I’d time bomb it. Have an app where if the phone has not been logged into with a valid authentication within 48 hours, it resets all the encryption keys, and, if possible, clears any cloud storage. Combine that with a “10 attempts” bomb.

              That gives the government 48 hours and 10 attempts to try and get into the phone, and after that, it’s all gone (unless the NSA can crack the encryption).Report
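
              Something along those lines is straightforward to sketch. The code below is hypothetical (not an existing app), and the 48-hour and 10-attempt thresholds are just the numbers proposed above:

              ```python
              # Hypothetical "time bomb": wipe the data-encryption key if the
              # device sees no valid login for 48 hours, or ten bad attempts.
              import time


              class TimeBombedVault:
                  MAX_BAD_ATTEMPTS = 10
                  MAX_IDLE_SECONDS = 48 * 3600

                  def __init__(self, passcode: str):
                      self.passcode = passcode
                      self.key = b"\x99" * 32   # stand-in for the encryption key
                      self.bad_attempts = 0
                      self.last_valid_login = time.time()

                  def _wipe(self) -> None:
                      self.key = None           # key gone; ciphertext is useless

                  def heartbeat(self) -> None:
                      """Run periodically by a background task."""
                      idle = time.time() - self.last_valid_login
                      if self.key is not None and idle > self.MAX_IDLE_SECONDS:
                          self._wipe()

                  def unlock(self, attempt: str) -> bool:
                      self.heartbeat()
                      if self.key is None:
                          return False          # already wiped
                      if attempt == self.passcode:
                          self.bad_attempts = 0
                          self.last_valid_login = time.time()
                          return True
                      self.bad_attempts += 1
                      if self.bad_attempts >= self.MAX_BAD_ATTEMPTS:
                          self._wipe()
                      return False
              ```

              Clearing cloud copies is the harder half, since that depends on the provider honoring a remote-delete request rather than on anything the device controls.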

              • Glyph in reply to Oscar Gordon says:

                I can almost guarantee that this is easily-doable/is already being done.

                Which means that all our debates about torturing terrorists, over theoretical “ticking time bombs”, were relevant after all! 😉Report

          • Michael Drew in reply to Michael Drew says:

            I’m not necessarily saying it’s on balance desirable. Part of me would be concerned if they couldn’t; another part of me agrees with the sentiment of apprehension about there being no zone of absolute information security. But that’s a new possibility; we’ve been living in the space where, with a court order (and unfortunately often without), the government has been able to have access to just about everything. Obviously we need to shore up the warrant requirement, but I admit I’m a little concerned about entering the world where they don’t. But as I say, in this particular instance, either Apple is going to ultimately comply, or Congress will respond (crudely and excessively). I’m not necessarily fully pleased about that, but it does mean I’m not hugely concerned about the precedent of their not complying.

            Curious, does anyone dissent from what’s been expressed here – does anyone think that it would be good for there to be a way for investigators to get access to whatever information they need at times when they really need it, and a judge agrees?

            I’m sort of more raising the question of whether you’d rather have Apple set the precedent of complying with this order, or have Congress overreact to the precedent of their not having to?Report

  6. Two questions:

    1. Who pays for the work Apple is being told to do?
    2. What happens if Apple’s end-game is to just say no?

    For 2, I’d guess the CEO is charged with contempt, and either jailed or fined (or both), but that’s purely a guess.Report

    • Michael Drew in reply to Mike Schilling says:

      The speculation is that Apple’s endgame, if necessary, is to change its citizenship.Report

    • Troublesome Frog in reply to Mike Schilling says:

      #1 was my first question. Is there any limit to how much cost we can demand that 3rd parties incur to help the government perform investigations?

      What if the company most qualified to break into the iPhone wasn’t Apple? Could that company be drafted and made to crack the phone in the national interest?Report

      • Marchmaine in reply to Troublesome Frog says:

        Have you learned nothing from the past 8 years? It is not a mandated cost, it is a tax.

        Fortunately it is a tax that you can avoid paying if you are willing to pay a fee based on a revenue calculus, which may or may not be more than the “tax.”

        Personally, I’m waiting for the trickle down effect where I can calculate all my mandated Taxes every April 15, and then see if I can pay “fee” that is less than my Tax. Call it the Alternative Maximum Tax if you prefer.Report

        • Will Truman in reply to Marchmaine says:

          The magistrate required that Apple be compensated for its time and expense (by DoJ, I assume). No idea how that compensation ends up being determined.Report

          • Do they get to bill for “loss of good will” from their customer base?Report

            • Glyph in reply to Michael Cain says:

              And “future damages, if/when this backdoor gets out into the wild”?Report

              • Mike Schilling in reply to Glyph says:

                “Uses of the term [backdoor] to refer to the order in this case are thus misleading” according to the Federalist guy. And he should know: he has no expertise whatsoever.Report

              • Glyph in reply to Mike Schilling says:

                I don’t either, really. But if that’s what sources I trust tell me it would be, then I’m going with it.

                At the end of the day, something built can always be used more than once; and something built to circumvent security is a “backdoor”, so…

                We can’t on the one hand demand that tech companies make our machines and data ever more secure, while simultaneously expecting them to continuously break that security. That doesn’t work. It’s a contradiction in concepts. There’s your burden right there.Report

              • Mike Schilling in reply to Glyph says:

                I have just enough expertise to know that I don’t know much. The Federalist jerk lacks even that.Report

          • PD Shaw in reply to Will Truman says:

            The order instructed Apple to advise the government of its “reasonable” cost of performing the services. It also instructed Apple to file a request for relief from the order some time this week if it believes compliance would be unreasonably burdensome.

            So, I think how this shakes out is that Apple complains that compliance would be burdensome and costly and provides an estimated figure. The ball would then be in the government’s court to agree to pay that amount. If it doesn’t agree, then it sounds burdensome. The government might agree to pay up to that amount after the work is performed, subject to a later judicial hearing on “reasonableness.”Report

          • Marchmaine in reply to Will Truman says:

            Yeah, the cost of the work has no actual bearing on whether they do it or not. I was just running with the flow.

            Though… the fact that the FBI reset the password (according to various reports) does add a weird wrinkle to the request. Not from the main narrative: Phone Locked–>Apple Fix… but hmmn, makes one wonder if we’re not getting all the details of the story here.Report