Apple & The FBI

On February 16, 2016, a federal judge ordered Apple to sign a binary update, to be applied to a suspected killer's iPhone 5C, that would bypass the security feature limiting the number of passcode guesses that can be made in a given amount of time, adding further fuel to the debate over encryption.

Behind the scenes of iPhone encryption

The process that turns a passcode into an AES-256 key is quite complex, involving several hardware components in addition to operations in software. Although newer devices use a process in which cryptographic secrets never leave a coprocessor known as the Secure Enclave, the iPhone 5C (the device relevant in this case) is the last iPhone not to include such a coprocessor.

During manufacturing, a random 256-bit secret, known as the UID, is burned into the processor's fuses. When a passcode is entered, it is combined with the UID via PBKDF2, using an iteration count calibrated to take approximately 80 milliseconds on the device. On more recent models this derivation is performed inside the Secure Enclave, but not on the 5C. The resulting key is then used, via several layers of indirection, to encrypt the contents of the device's flash storage.
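A minimal sketch of that derivation, in Python, is below. The UID, passcode, and calibration routine here are stand-ins of my own; on a real device the tangling happens in hardware, the UID never leaves the chip, and the resulting key feeds a much larger key hierarchy.

    import hashlib
    import os
    import time

    # Stand-ins for the real values: the UID is fused into the chip and the
    # passcode is entered by the user; neither is available like this.
    UID = os.urandom(32)      # 256-bit device secret (illustrative)
    PASSCODE = b"1234"        # short numeric passcode (illustrative)

    def calibrate_iterations(target_seconds=0.08):
        """Pick an iteration count so one derivation takes ~80 ms here."""
        probe = 10_000
        start = time.perf_counter()
        hashlib.pbkdf2_hmac("sha256", PASSCODE, UID, probe)
        elapsed = time.perf_counter() - start
        return max(1, int(probe * target_seconds / elapsed))

    iterations = calibrate_iterations()
    key = hashlib.pbkdf2_hmac("sha256", PASSCODE, UID, iterations, dklen=32)
    print(f"{iterations} iterations -> {key.hex()[:16]}...")

The important property is that the derivation is slow and tied to the device: without the UID, the passcode cannot be brute-forced off the phone.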

In the Settings app of any iOS device, one can enable a setting that wipes the device after ten incorrect passcodes. It is not entirely clear what exactly is "wiped"; since the UID is burned into hardware, the wipe presumably destroys the keys derived from it rather than the UID itself, leaving the encrypted data unrecoverable. On the device that is the subject of this court order, this setting is enabled.
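The two protections at issue, the escalating delays between attempts and the ten-attempt wipe, amount to a small piece of policy logic. The sketch below is purely illustrative; the delay schedule and names are mine, and the real enforcement lives in iOS and, on later devices, in the Secure Enclave.

    import time

    MAX_ATTEMPTS = 10
    # Hypothetical escalating delays (in seconds) after successive failures.
    DELAYS = [0, 0, 0, 0, 60, 60, 300, 900, 900, 3600]

    class Device:
        def __init__(self, correct_passcode, erase_after_ten=True):
            self._passcode = correct_passcode
            self._erase_after_ten = erase_after_ten
            self._failures = 0
            self.wiped = False

        def try_passcode(self, guess):
            if self.wiped:
                raise RuntimeError("keys destroyed; data unrecoverable")
            if guess == self._passcode:
                self._failures = 0
                return True
            self._failures += 1
            if self._erase_after_ten and self._failures >= MAX_ATTEMPTS:
                self.wiped = True   # destroy the keys protecting the data
            else:
                # Slow down guessing; later failures wait longer.
                time.sleep(DELAYS[min(self._failures, MAX_ATTEMPTS) - 1])
            return False

In effect, the order asks Apple to ship software in which this logic never wipes and never sleeps, and can be driven electronically rather than by hand.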

Additionally, all iPhone software updates are digitally signed by Apple; the device rejects any update whose signature is invalid.
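To make that mechanism concrete, here is a small sketch of signature-checked installation using the third-party cryptography package. It is not Apple's scheme, which uses its own keys, formats, and per-device personalization; it only shows the core idea that the device refuses any image that the holder of the signing key did not sign.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # Illustrative keys; in reality the vendor's private key is closely held
    # and only the public half is baked into the device.
    vendor_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    trusted_public_key = vendor_key.public_key()

    PSS = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)

    def sign_update(image: bytes) -> bytes:
        """Vendor side: sign the firmware image."""
        return vendor_key.sign(image, PSS, hashes.SHA256())

    def install_update(image: bytes, signature: bytes) -> bool:
        """Device side: install only if the signature verifies."""
        try:
            trusted_public_key.verify(signature, image, PSS, hashes.SHA256())
        except InvalidSignature:
            return False            # reject the update outright
        # ...proceed to flash and boot the image...
        return True

    image = b"hypothetical firmware image"
    assert install_update(image, sign_update(image))
    assert not install_update(image + b" tampered", sign_update(image))

This is why the order can be satisfied by no one but Apple: only the holder of the signing key can produce an image the phone will accept.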

The court order compels Apple to remove these guessing limits via a specialized software update applied to that particular iPhone.

Why this matters

As is stated in section 2 of the court order:

Apple's reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE; and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond that which is incurred by Apple's hardware.

The court suggests the following procedure for compliance (section 3 of the court order):

Apple's reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File ("SIF") that can be loaded onto the SUBJECT DEVICE. The SIF will load and run from Random Access Memory ("RAM") and will not modify the iOS on the actual phone, the user data partition or the system partition on the device's flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE. The SIF will be loaded via Device Firmware Upgrade ("DFU") mode, recovery mode, or other applicable mode available to the FBI. Once active on the SUBJECT DEVICE, the SIF will accomplish the three functions specified in paragraph 2. The SIF will be loaded on the SUBJECT DEVICE either at a government facility, or alternatively, at an Apple facility; if the latter, Apple shall provide the government with remote access to the SUBJECT DEVICE through a computer allowing the government to conduct passcode recovery analysis.

The order does allow for solutions other than the one proposed above. However, in the absence of a security-critical bug in the software (and there appears to be none, as otherwise the FBI would simply have exploited it), Apple would necessarily have to sign a software binary to fulfill the court order; this is effectively compelled speech, and it negates much of the value of digital signatures.

A digital signature is in many ways like a physical signature: it is extremely difficult to forge, and it can only be produced by the owner of the key, just as a physical signature can only be produced by its owner. Why should it not enjoy the same protection from compulsion?

Beyond the issue of digital signatures, it is important to realize that Apple is being compelled to circumvent its own security measures, which undermines the value of digital security as a whole. If I put a lock on my door, it has value only insofar as no one else, not even the manufacturer of the lock, can unlock it. A digital security scheme has value only insofar as no one else, not even the designer of the scheme, can circumvent it.

True security means that no one can break it, not merely that particular individuals cannot. Protection of national security is useless in the absence of strong guarantees of personal security; why should I be required to give up my right to personal security for the fleeting goal of security for a nation that I may leave at any moment?

A year ago, a world devoid of privacy and of adequate protection from unauthorized technological intrusion seemed far-fetched, as though it might never happen; it seemed as faint as a cold winter night feels on a warm summer day. Now it has come into focus: possible, perhaps probable. Is that the world we want to create? For it is the one to which our path leads; how long until it is too late to turn back?

References

  1. iOS 8.1 security guide - only available from Internet Archive.
  2. Order compelling Apple, Inc. to assist agents in search
  3. A Message To Our Customers - This is the original statement from Apple.

I would also like to thank Sundar Pichai of Google and Jan Koum of WhatsApp for backing Apple. Here are their statements (Pichai tweeted his in five parts):

1/5 Important post by @tim_cook. Forcing companies to enable hacking could compromise users privacy.
2/5 We know that law enforcement and intelligence agencies face significant challenges in protecting the public against crime and terrorism
3/5 We build secure products to keep your information safe and we give law enforcement access to data based on valid legal orders
4/5 But thats wholly different than requiring companies to enable hacking of customer devices & data. Could be a troubling precedent
5/5 Looking forward to a thoughtful and open discussion on this important issue

Sundar Pichai, CEO, Google

I have always admired Tim Cook for his stance on privacy and Apple's efforts to protect user data and couldn't agree more with everything said in their Customer Letter today. We must not allow this dangerous precedent to be set. Today our freedom and our liberty is at stake.

Jan Koum, CEO, WhatsApp

Update March 4th

The EFF has filed an amicus curiae brief making the digital-signature speech-compulsion argument I made above. It is an interesting read.