FBI Chief Concedes Broadness of iPhone Unlock Request (Law360): According to a Law360 article by Daniel Wilson, FBI Director James Comey acknowledged that the order directing Apple to develop software to enable access to an iPhone 5c is broad enough that the software could be put to future uses should it find its way outside of Apple.
As this blog and many others have already covered, Apple was ordered by a federal magistrate judge to assist the FBI under the All Writs Act (28 U.S.C. § 1651); however, between briefs and reply briefs, the hearing on this order will not take place until March 22.
Among other defenses, Apple is asserting that the creation of such a backdoor tool would introduce grave risks to the security of iPhone users and that the tool could be used outside the specific case at hand, according to Wilson’s article. During questioning on Tuesday by Rep. John Conyers (D-Mich.), Comey reiterated that the final ruling in this pending case could set a precedent allowing the FBI to use the All Writs Act in future cases to compel similar development efforts to access encrypted iOS devices, according to Wilson.
Commentary
Apple’s assertions are the typical slippery-slope, descent-into-anarchy, watch-out-as-we-move-toward-Dante’s-Nine-Circles-of-Hell, Chicken-Little-the-sky-is-falling type of argument. Essentially, Apple is saying: yes, we could write software that removes the timing delay between successive passcode entries, and yes, we could remove the auto-delete feature that erases all content on an iOS device after N unsuccessful passcode attempts; but because only Apple developers have the insider expertise and code access to effect such a workaround, someone might obtain that software, and then the world would fall into chaos and eternal damnation. The fiction Apple purports is that it holds a death grip on its proprietary operating system that no one can bypass, so that merely developing code to enable access to iOS data would let the genie out of the bottle and life would never be the same.
First of all, the iPhone 5c is an older-generation iPhone: according to the Apple iOS Security Guide, it incorporates neither an A7 (or later A-series) processor nor Touch ID, which works in conjunction with the Secure Enclave. Consequently, concerns about this software being usable outside the iPhone 5c class of devices are likely unwarranted. Devices based on the A7 or later A-series processors have a coprocessor, the Secure Enclave, fabricated within the processor; the 5c does not. The Secure Enclave uses encrypted memory and is provisioned during manufacturing with a Unique ID (“UID”) that is known neither to Apple nor to other system components. When such a device starts up, a key entangled with the UID is created to encrypt the Secure Enclave’s portion of device memory, and data the Secure Enclave saves to the file system is encrypted with a key entangled with the UID and an anti-replay counter (“nonce”). Additionally, communication between the Secure Enclave and the Touch ID sensor is AES-encrypted; that data is forwarded by the application processor but cannot be read by it, according to the guide.
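To make the entanglement point concrete, here is a minimal Python sketch of the idea, assuming a PBKDF2-style construction; the function name and parameters are illustrative, since Apple’s actual derivation is hardware-bound and its details are not public:

```python
import hashlib

def derive_unlock_key(uid: bytes, passcode: str, iterations: int = 100_000) -> bytes:
    """Sketch of UID/passcode entanglement: the per-device UID acts as a
    secret salt, so the derived key can only be computed on the device
    holding that UID. Illustrative only; not Apple's real construction."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid, iterations)

# A different UID yields a completely different key for the same passcode,
# which is why extracted flash storage cannot be attacked off-device.
uid_a = b"\x01" * 32
uid_b = b"\x02" * 32
assert derive_unlock_key(uid_a, "1234") != derive_unlock_key(uid_b, "1234")
```

The design consequence is the one the commentary relies on: because the UID never leaves the silicon, any brute-force attempt must run on the device itself, through its processor.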
Similarly, since the UID is burned into the silicon and is unknown to Apple, there exists no mechanism by which Apple could discern the UID, and thus the only entry point into iOS encrypted data is brute force. Here too, the issue is that the data-storage model relies on access via the processor, and everything ties back to the UID.
One of the biggest obstacles to gaining access to the encrypted data on an iOS device is that the devices are purpose-built with security in mind. For instance, file access passes keys through the Secure Enclave, so it isn’t as simple as pulling a hard drive out of a server and running brute-force tools against it. Files are accessible only through the Secure Enclave, so decryption must take place at that level, and speed and access are thereby constrained by the processor and the Secure Enclave. Additionally, the Secure Enclave performs the key checks and enforces the delays (for instance, after successive failed passcode attempts).
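As a rough illustration of the gatekeeping described above, this Python sketch models a key checker that enforces escalating delays and an optional wipe after repeated failures; the delay schedule and wipe threshold are hypothetical values for illustration, not Apple’s actual ones:

```python
import time

class PasscodeGate:
    """Sketch of enclave-style gating: escalating delays after failed
    attempts and an optional wipe after too many failures. Thresholds
    here are illustrative, not Apple's actual values."""

    # hypothetical delay schedule (seconds), keyed by prior failure count
    DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600}
    WIPE_AFTER = 10

    def __init__(self, correct_passcode: str, wipe_enabled: bool = True):
        self._correct = correct_passcode
        self._failures = 0
        self.wiped = False
        self.wipe_enabled = wipe_enabled

    def try_passcode(self, attempt: str) -> bool:
        if self.wiped:
            raise RuntimeError("device wiped")
        delay = self.DELAYS.get(self._failures, 0)
        if delay:
            time.sleep(delay)  # enforced by the enclave, not the OS
        if attempt == self._correct:
            self._failures = 0
            return True
        self._failures += 1
        if self.wipe_enabled and self._failures >= self.WIPE_AFTER:
            self.wiped = True  # discard the file-system key
        return False
```

The two features the order asks Apple to disable map directly onto this sketch: zeroing out the delay schedule and setting `wipe_enabled = False`.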
The short version: if Apple modifies its code so that the iPhone 5c’s passcode-entry delays and auto-delete function are disabled, loads that iOS version onto the phone while retaining physical possession of it, and then delivers the phone to the government to attempt a brute-force attack, the implications for other devices are minimal. To be sure, a future government intrusion using this approach would require:
- probable cause sufficient such that a Judge would order the lawful search and seizure of said device;
- physical possession of an iOS device that lacks the secure enclave;
- loading code on the specific iOS device;
- allowing the government to attempt a brute-force attack on the device.
This is not analogous to a backdoor, wherein Apple or any other provider would supply code that actually allows someone to access encrypted data using a master key or something similar. All the government is requesting is that Apple modify its iOS code on this specific iPhone to:
- a) bypass or disable the auto-erase function;
- b) enable the FBI to submit passcodes to the subject device via the physical device port, Bluetooth, Wi-Fi, or another available protocol; and
- c) remove the delay between successive passcode attempts.
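For a sense of scale, the iOS Security Guide notes that the key-derivation iteration count is calibrated so that a single passcode attempt takes roughly 80 ms, a hardware cost the requested changes would not remove; with the software delays and auto-erase gone, worst-case brute-force time is simple arithmetic:

```python
# Back-of-the-envelope brute-force times at ~80 ms per passcode attempt
# (the per-try key-derivation cost cited in the iOS Security Guide).
PER_ATTEMPT_S = 0.08

def worst_case_hours(digits: int) -> float:
    """Worst-case time to try every numeric passcode of the given length."""
    return (10 ** digits) * PER_ATTEMPT_S / 3600

print(f"4-digit passcode: {worst_case_hours(4) * 60:.1f} minutes")  # ~13 minutes
print(f"6-digit passcode: {worst_case_hours(6):.1f} hours")         # ~22 hours
```

In other words, the delays and the auto-erase are what make an in-place brute-force attack impractical, which is why the order targets those features rather than the encryption itself.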
So, it seems the slippery-slope argument is not wholly compelling. In any case, Apple would be writing, deploying, and maintaining the code, and any time a lawful mandate were issued directing Apple to deploy that code, Apple would have legal remedies with which to resist the order.
So how about it, Apple: are you really worried that writing code to bypass the protections on a previous-generation iPhone, allowing a brute-force attack, will expose millions of users’ iPhones to government intrusion into their encrypted data? If so, then perhaps the iOS Security Guide is not as accurate as it is purported to be and your devices aren’t as secure as we believe, which would explain the reluctance to pull back the curtain, even if only a little.
emmitc01
First, the request was initially made for a drug dealer’s phone (Feng), and it was then revealed that there were 12 other similar requests, including the San Bernardino terrorist’s phone. While there are valid reasons for the government to want access to the phone, that does not mean there are valid legal arguments for compelling Apple to develop new software to provide that access. We have no idea what level of crimes the other requests were for. How does the government decide what is worthy of bypassing encryption? I imagine those thresholds would slip as time wore on.
Second, you make it seem as if this affects only one model of phone and thus a small population, which is most likely a fallacy; we can only assume it could be applied to all iPhones prior to the iPhone 6 series. There were 2.6 million iPhone 5c units sold and 10.75 million iPhone 5s units sold (Feng’s model), with a total of 91 million iPhone 5 phones sold overall. We are not talking about a trivial number of devices, even before adding in all previous models.
Third, if they comply with this, they are going to be expected to continue to comply in the future. Despite newer models having better security, they will be expected to continue finding ways to allow the government access to the phones. Compliance would also open the door for other sovereign nations to attempt to compel Apple to provide the technology they provide the US government. Apple would be especially vulnerable to China, where their products are manufactured.
Fourth, you argue that Apple would create and maintain the code and that government agencies would come to them any time they need to unlock the device. How exactly do you propose that would work? Once the software has been installed on the target device, are they expected to remove it? I don’t read that in the government’s arguments at all. They want the bypass software on the phone and then they will brute force the phone on their own time. Do you honestly believe the government would come back to Apple for each phone or would they reverse engineer the software that Apple placed on the original phone?
Last, you make the argument that just because the software is developed doesn’t mean that it will spread and be the equivalent of the genie being let out of the bottle. Once software is used, it is vulnerable to being spread, modified and re-used for potentially nefarious purposes. Stuxnet was used originally in Iran, but has since popped up elsewhere and was modified for other purposes. The FBI and DOJ have suffered breaches in the last few months, so to think that they can protect the software they could possibly receive from Apple is naïve at best. If you think the hacking attempts against the federal government are high now, wait until Apple has complied and the FBI has the Holy Grail sitting on their servers.
In fact, the amici brief filed by the Federal Law Enforcement Officers Association has already referenced iOS 8 and made the gloom-and-doom argument that without access to iPhones and iPads, they can’t solve crimes. While access to devices may make investigations easier, so would free access to a suspect’s house or car or office. I agree with Bill Snyder’s assertion that this particular case is not a Fourth Amendment case. There was a valid search warrant in the Feng case that allowed for his phone to be searched, but the government was not able to do so without Apple’s assistance.
The root of the issue is this: can the government compel a private entity to write software that is counter to its (and its customers’) interests in order to further an investigation? Apple builds, markets, and sells its devices with security as a prime consideration. If it complies with this court order, it is operating against its own self-interest and may end up suffering financially for it. In fact, other companies may choose not to build robust security into their devices if they expect the government to require them to bypass it at some point in the future. Should the government have an absolute right of access to a device at the expense of individuals’ privacy rights? If I have done nothing wrong, should the government’s desire to access a bad actor’s device make my device less secure?