Crossroads Blog | CYBER SECURITY LAW AND POLICY

AppleVsFBI, encryption

Apple Encryption Debate: What about iCloud?

 Mossberg: The iCloud loophole (The Verge): Walt Mossberg’s article highlights the fact that Apple has the ability to decrypt the bulk of the data uploaded via iCloud backups.  Furthermore, according to the article, Apple has decrypted and provided iCloud backup data to the FBI and other law enforcement agencies on numerous occasions, once a valid warrant has been issued.  The article indicates that Apple views iCloud data differently from data on the iPhone itself for several reasons:

  • Apple claims that the phone’s security policies protect a physical object, which can be lost or misappropriated, and that the physical device therefore requires heightened security protocols;
  • Apple indicates that iCloud also requires strong security; however, Apple retains the ability to access backups and restore them to user devices, since this is a feature users want. Additionally, Apple states that especially sensitive data, such as network passwords and the Apple keychain (which holds passwords), is not decrypted from iCloud backups;
  • Apple’s position is similar to that of other providers, such as Google (Gmail, Drive, Docs, and Calendar) and Dropbox, both of which indicate that they comply with valid, lawful orders to decrypt and provide data to law enforcement.

The full text of the article is here.


Commentary [Editor’s Opinion]

This article raises some interesting questions about exactly what Apple was doing when it launched its media blitz decrying the government’s efforts to compel it to bypass the iOS security features that prevent the FBI from launching a brute-force attack on an iPhone 5c.  If Apple’s primary motivation is data privacy and protection for its customers, then why does it retain the ability to decrypt iPhone backups?  Did Apple choose this battle merely to highlight what it deems a larger privacy issue, or does Apple truly believe that data on an iPhone is more sensitive than data from that same iPhone backed up to iCloud?

Before proceeding, I should say that on a personal level, and irrespective of the positions I may have espoused in previous blog posts, I think that data privacy and encryption in particular are valuable tools that should be available to citizens in the digital realm.  Specifically, I am not in favor of encryption backdoors, master keys, or Clipper-style chips that would allow government intrusion into electronic communications.  I use encryption at the volume and file levels, and I believe that just because the government has a search warrant giving it the right to access information doesn’t mean that it necessarily has (or should have) the ability to access encrypted information.

That being said, it seems a bit disingenuous to argue that modifying the iOS code to remove the escalating delay between successive passcode attempts and to bypass the auto-erase functionality, so that the Government could launch a brute-force attack against an iPhone, somehow places user data in greater jeopardy than putting a bow around a decrypted iCloud backup and delivering it to the Government.  Frankly, it seems shocking that more users aren’t distraught by Apple’s past, and seemingly future, compliance with requests to decrypt iCloud backups.
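To put the technical ask in perspective, consider what actually protects a short numeric passcode. Below is a minimal back-of-the-envelope sketch in Python; the roughly 80-millisecond per-guess key-derivation delay and the passcode lengths are assumptions drawn from Apple’s public descriptions and used here purely for illustration, not a statement of Apple’s actual implementation. The point is that the escalating retry delays and the ten-attempt auto-erase, rather than raw key strength, are what stand between the Government and a successful brute-force attack.

    # Rough worst-case timing for guessing a numeric passcode, assuming the only
    # remaining protection is the per-attempt key derivation, which Apple has put
    # at roughly 80 ms on this class of hardware (treat the figure as illustrative).
    PER_GUESS_SECONDS = 0.08

    def exhaustive_search_hours(digits: int) -> float:
        """Hours needed to try every possible passcode of the given length."""
        attempts = 10 ** digits
        return attempts * PER_GUESS_SECONDS / 3600

    if __name__ == "__main__":
        for digits in (4, 6):
            hours = exhaustive_search_hours(digits)
            print(f"{digits}-digit passcode: about {hours:.1f} hours worst case")
        # With the stock protections intact, the escalating delays slow guessing to a
        # crawl and the tenth failure can erase the encryption keys, so an exhaustive
        # search never gets this far -- hence the request to disable those features.

Under those assumptions, a four-digit passcode falls in well under an hour and a six-digit one in roughly a day, which is precisely why the two features the FBI asked Apple to disable matter so much.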

A number of arguments have been raised with respect to why the Apple vs. FBI issues are so important and far-reaching.  Here are some points that appeared in recent comments to a previous post:

  • There may be valid reasons for the Government to request access to an iPhone, but how is that threshold discerned?
  • The number of phones that could be affected runs into the millions, since presumably any iPhone 5c or earlier could be brute-force attacked had Apple developed this iOS code;
  • Initial compliance will lead to later compliance, and companies such as Apple will be compelled to comply elsewhere, especially in countries such as China;
  • Once Apple writes the software, the government can reverse-engineer it and use it to unlock other phones;
  • The FBI and DOJ have both suffered breaches, so if they hold this iOS software it is likely to attract hackers, who could breach their systems and abscond with the code.

With respect to the first point, that is and will continue to be a matter for the judiciary. Once an application has been made and an order issued by the court, it becomes a lawful mandate and, yes, Apple or any other entity or person is required to comply.  There is no distinction between encrypted and unencrypted data, between levels of encryption, or for the use of bit shifting or steganography; it is simply a lawful order that gives law enforcement the ability to get X from Y.  Additionally, many search warrants contain ex ante restrictions that limit law enforcement’s processes, procedures, and/or the timelines within which a search may be executed. Thus, there are already mechanisms in place to ensure that law enforcement has valid reasons to request access to data.

The second point, while valid, still overlooks the fact that the iOS changes being touted would be purpose-built to load onto this specific iPhone: not just the specific model but, in fact, the specific device associated with a unique device identifier.

The third point is little more than a slippery-slope argument. The mere fact that a company is forced to comply with a lawful order does not render moot any future arguments against a DOJ request.  These inquiries are very fact-specific, and as such one would anticipate that courts are going to make the requisite searching inquiries before compelling any action under the All Writs Act.  Additionally, under the All Writs Act, the following conditions must be examined [1]:

  • Is Apple either a party to the underlying case or, if a non-party, in a position to either thwart or effect the implementation of the court order? Here, Apple does not own the phone; however, it did manufacture the device.  Furthermore, Apple owns the proprietary design elements, including the hardware and software, and is, therefore, a party.  Even if one were to argue that Apple has no possessory interest in the specific device, Apple does own the iOS running on it, and it is the combination of iOS and the underlying hardware that prevents the DOJ’s brute-force attack without modifications by Apple; Apple is thus, at a minimum, a non-party to whom it would be appropriate to direct the writ;
  • Does Apple have a substantial interest in not assisting the government? The stated interest appears to be Apple’s strong belief in privacy rights, and at face value that does seem compelling. However, when taken in the context of Apple’s position on iCloud backup files, which it readily decrypts when presented with lawful mandates, the argument weakens.  The fact that Apple views device security differently from the security of files backed up to iCloud lends credence to the theory that Apple’s desire to “seem” focused on security and privacy is not borne out in its day-to-day operations.  The fact that Apple does decrypt customer data indicates that the idea of doing so is not patently offensive to it, nor does it violate the company’s actual beliefs or policies (irrespective of which beliefs Apple chooses to assert in the media).
  • Is the order burdensome? If this code change requires two weeks of work by a team of developers, then there is certainly an opportunity cost: that team could have spent two weeks working on any number of issues or building the greatest iOS the world has ever seen.  However, since the government is willing to compensate Apple for the time devoted to this endeavor, one can also argue that while the schedule and the products in the software development life cycle (“SDLC”) may be impacted, this is a burden that can be shifted through the allocation of government funds to offset the time and expense.  In reality, this is probably Apple’s strongest point, yet it also seems to be the one it is emphasizing least.  If you think of the SDLC in terms of the butterfly effect, where the flapping of a butterfly’s wings at time N causes the breeze that causes the ripple that cascades by time N+n into a hurricane, you get a sense of the argument Apple might make.  Assume iOS runs on a six-month SDLC: devote two weeks of core development resources to assisting the government and the entire life cycle shifts, software falls out of sync with hardware, the release slated to roll out two development cycles from now gets pushed back, time to market stretches, and a competitor gains an advantage.  Suddenly the potential burden of shifting development resources to an outside project in the midst of the SDLC looks enormous (a rough numerical illustration follows this list).
  • Is there a way for the government to obtain what it needs without Apple’s assistance? Well, up until Sunday this answer seemed pretty straightforward: according to Apple, the iPhone was secure, and according to the DOJ, it could not bypass the lock-code security.  Of course, once an outside party was able to bypass that security and unlock the iPhone, the task was clearly not dependent upon Apple’s acquiescence.
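To make the burden point above a bit more concrete, here is a small, purely hypothetical sketch of the cascade argument: it assumes a fixed-length release cycle and simply propagates a two-week diversion through the next few releases. The cycle length, the diversion, and the idea that every later release inherits the slip are all assumptions for illustration; nothing here reflects Apple’s actual development schedule.

    # Hypothetical schedule cascade: pull core developers for two weeks early in a
    # fixed six-month release cycle and every downstream release ships at least that
    # much later (all figures are invented for illustration only).
    CYCLE_WEEKS = 26       # assumed ~6-month iOS release cycle
    DIVERSION_WEEKS = 2    # assumed time spent building the purpose-built iOS

    def ship_weeks(releases: int, slip: float = 0.0) -> list[float]:
        """Week numbers at which each of the next `releases` versions would ship."""
        return [(n + 1) * CYCLE_WEEKS + slip for n in range(releases)]

    if __name__ == "__main__":
        planned = ship_weeks(3)
        slipped = ship_weeks(3, DIVERSION_WEEKS)
        for n, (p, s) in enumerate(zip(planned, slipped), start=1):
            print(f"Release {n}: planned week {p:.0f}, now week {s:.0f} ({s - p:.0f} weeks late)")

Two weeks is less than a tenth of a single cycle, yet in this toy model it pushes back every subsequent release, which is the sense in which the opportunity cost, rather than the reimbursable expense, is the real burden.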

With respect to the fourth point, this is actually somewhat counterintuitive.  The assertion is that once Apple modifies the iOS to change the timeout value and bypass the auto-delete, then if the government gets the device back it can reverse-engineer that iOS and use it to brute-force other iPhones. The problem is that if the government could reverse-engineer this “special” iOS, why can’t it reverse-engineer the existing iOS?  Does changing a timeout value and circumventing the block of code that performs the auto-delete somehow make the code easier to reverse-engineer?  Apple may be overstating the resources it would take to build this new code, but either way, if the government can reverse-engineer iOS then all bets are off: it can hire some script kiddies to make a few modifications and let the result run amok, accessing iPhones everywhere (with valid court orders, of course).  Unless there is some reason the new iOS would be inherently insecure, or the update would only work with raw source code, the government’s ability to reverse-engineer the new code is no greater than its ability to do the same with current iOS versions.

Finally, the last point, that the FBI and DOJ would become targets for hackers once they have this new iOS, is also rebuttable.  As I write this, iOS presumably sits on servers in Cupertino; if it really is that attractive, why isn’t Apple a target for that reason alone?  Furthermore, while I won’t argue that the government has a handle on cybersecurity, in all fairness the FBI and the DOJ already hold sensitive data on their servers that hackers the world over would love to access, so aren’t they targets already?  Some of this comes back to Apple’s assertion that iCloud and iPhones aren’t really the same.  Even if the FBI or any other three-letter agency were able to obtain this iOS, reverse-engineer it, and keep it waiting and ready for use, it would still need the physical phone to make it work.  So would any hacker: the ability to brute-force a phone into unlocking without erasing its data is predicated on actually having a phone to perform the attack on.  In the absence of a physical device, this would not be a very useful exercise.

Is it scary that the government could access our phone data and our encrypted communications? Yes, of course; it raises a number of concerns with respect to privacy as well as potential freedom-of-speech issues.  However, if you consider the movement toward the Internet of Things and the sheer volume of devices and data, is this really a looming concern for most of us?  Perhaps this is the result of the participation generation, where everyone who shows up gets a trophy.  Perhaps these same people think their dog and cat pictures, or status updates like “OMG, eating a real hot dog from a street vendor while on Spring Break,” are somehow of interest to a federal government that wants nothing more than to break their encrypted communications to find out how many cats they say they have versus how many they actually have.  In the real world, however, and having spent some time with Big Data firms, one realizes that amassing all of that information, decrypting the communications, and then sorting through them (even with automated keyword matching) is an enormous undertaking that would be both technically infeasible and an inordinate drain on government resources.  Sorry to disappoint, but for most of us Big Brother is more akin to Rhett Butler than to George Orwell, and when it comes to most of our information, “frankly, my dear, [the government does not] give a damn.”

[1] United States v. N.Y. Tel. Co., 434 U.S. 159, 174 (1977).
