We <3  (Apple supports the right to privacy!)

Last week a federal court ordered Apple, Inc. — the maker of iPhones, iPads, and everything “cool” in technology — to help the FBI decrypt an iPhone 5c used by one of the San Bernardino shooters. Apple contends that it does not have this technology and that it would be dangerous to create it, lest it end up in the wrong hands.

Before diving into the intricacies of the case, here are the facts:

1) The FBI has all of the suspect’s communication records — who they talked to and how. These are stored by the phone service providers (AT&T, Verizon, etc.).

2) The FBI received comprehensive backups of all the suspect’s data up until six weeks before the crime. Automatic backups stopped either because a) the phone was never plugged in while connected to a known wi-fi network, or b) the suspect turned off backups.

3) Copies of the suspect’s contacts with co-workers are available from the co-workers’ phones.

4) The phone is a government-issued work phone, subject to consent-to-monitoring. The phones believed to be hiding incriminating information were physically destroyed before the FBI recovered them.

Apple has helped law enforcement in the past; in fact, it is required to when served a legitimate warrant. Those cases, however, involved iPhone Operating System (iOS) 7.0 or lower. Prior to 2014, an iPhone was encrypted based on its device ID: the encryption key was derived from the device’s ID number, effectively leaving every device vulnerable to hacking. Starting with iOS 8.0 in 2014, Apple changed its encryption process. Instead of encrypting each phone using the device ID number, the software derives the encryption key from the owner’s passcode or Touch ID (the fingerprint). This means the encryption key can be changed at the whim of the owner, but also that Apple no longer holds the key. Only someone who knows the passcode can decrypt the phone.
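The difference between the two schemes can be illustrated with a minimal sketch. The function names and parameters below are hypothetical, and Apple’s real design (which entangles the passcode with a hardware key inside the device) is far more involved; this only shows why a key derived from a device ID is recoverable by anyone who can read that ID, while a key derived from a passcode is not.

```python
import hashlib

# Illustrative sketch only -- not Apple's actual implementation.

def key_from_device_id(device_id: str) -> bytes:
    """Pre-iOS-8 style (simplified): the key depends only on the device ID.
    Anyone who can read the device ID can reproduce this key."""
    return hashlib.sha256(device_id.encode()).digest()

def key_from_passcode(passcode: str, device_salt: bytes) -> bytes:
    """iOS-8+ style (simplified): the key is derived from the passcode
    with a slow key-derivation function, so only someone who knows the
    passcode can derive it, and guessing is deliberately expensive."""
    # 100_000 iterations is an illustrative cost factor, not Apple's.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                               device_salt, 100_000)

# The device-ID key is fixed for the life of the phone; the passcode key
# changes whenever the owner picks a new passcode.
old_key = key_from_passcode("1234", b"per-device-salt")
new_key = key_from_passcode("9999", b"per-device-salt")
assert old_key != new_key
```

Changing the passcode yields a completely different key, which is why, under the newer scheme, Apple cannot hand over a decryption key it never possessed.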

These changes came after whistleblower Edward Snowden came forward about the data mining techniques of the National Security Agency (NSA). Concerned about data collection, Apple added a more secure way of protecting its users’ privacy.

Not satisfied with the information it has, the FBI wants Apple to create a program to hack into this one, single phone. The problem is that this program doesn’t exist, and Apple does not want to make it. Review, again, the data the FBI already has. The only thing the FBI does not have is six weeks’ worth of backups from the non-secret work phone (not the one with all the terrorist information on it) and whatever else is on the phone itself.

At the request of the FBI, San Bernardino County workers reset the Apple ID password on the phone, and — in a twist of fate — that action made it impossible for the phone to back up to iCloud* and for Apple to recover the information. (*If, indeed, the last six weeks of backups were not lost because backups had been turned off.)

So here’s the problem in a nutshell: The FBI wants a court to order Apple to create software that doesn’t already exist to extract information from a phone that probably doesn’t have any new information to aid the Bureau, after the FBI ruined its only chance to extract the information it wants.

In essence, the FBI wants Apple to risk the privacy and security of every American in order to extract likely impertinent information from a work phone.

When the government can bully the world’s most valuable corporation, you know it has grown too big.

Part II will follow, covering the implications and “Pandora’s Box” of creating an iPhone backdoor.