The smartphone security turf war between Apple and the FBI just got way more interesting: The Department of Justice today threw its weight into the matter with a formal motion that would compel Apple to develop a “backdoor” to circumvent an iPhone’s security measures.
If you’re just tuning in, here’s the recap: Officials recovered the iPhone 5c belonging to one of the San Bernardino shooters, and in the interest of examining the device for further information potentially related to acts of terror, the FBI petitioned Apple to develop a means to more conveniently get around the phone’s passcode lock.
Apple staunchly declined to do so, issuing a press release dressed up as an open letter to its customers explaining the importance of encryption and arguing that such an FBI-mandated solution could be abused to gain access to other phones, regardless of wrongdoing.
Now Apple must contend with a new wrinkle in what is undoubtedly a terrible, horrible, no-good, very bad week for CEO Tim Cook. The Department of Justice filed a motion to compel the company to comply with the federal government’s wishes and deliver the circumvention tool. Once you read through the motion’s noxious legalese, it becomes abundantly clear that no “privacy is important” argument will sway the government. It wants inside that phone.
The DoJ’s motion cites assorted past federal cases that point to the legal rights of a court to issue supplemental orders to third parties to facilitate the execution of search warrants, whether those parties like it or not. It argues that Apple is not “far removed” from the matter at all.
The company perhaps unwisely rebuked the FBI’s wishes in public, and the government does not take kindly to a rebuff on anything that can be even indirectly tied to terrorism.
The motion says that Apple’s assistance is absolutely essential, and that it places no unreasonable burden on the company to require it to design a method to break the phone’s passcode security.
The Justice Department interprets Apple’s concerns over ensuring the security of all of its devices as “marketing concerns,” and goes on to argue that “public policy favors enforcing of the order” to get around the iPhone’s passcode.
“A dangerous precedent”
Unsurprisingly, tech and security pundits have some strong feelings on the ordeal. Danny Boice, the CEO of Trustify, a company perhaps best understood as “Uber for private investigators,” throws his hat into Apple’s ring. “Introducing security backdoors into one-way encryption is just asking for trouble. There’s no way to do this just on the terrorist’s phone, based on the way phone security works. Apple would have to do it on every single iPhone. The issue at hand is much bigger: it sets a dangerous precedent for allowing the government to tell tech companies to lessen security at their whim,” he says.
Standing immediately counter to this argument is Dan Guido, the head of New York City-based cybersecurity research firm Trail of Bits. In a blog post published before the DoJ motion was issued, Guido outlines in great detail how Apple can comply with this order without threatening the security of the rest of its devices.
“The FBI does not have the secret keys”
“In plain English, the FBI wants Apple to create a special version of iOS that only works on the one iPhone they have recovered,” he writes. “This customized version of iOS (ahem FBiOS) will ignore passcode entry delays, will not erase the device after any number of incorrect attempts, and will allow the FBI to hook up an external device to facilitate guessing the passcode. The FBI will send Apple the recovered iPhone so that this customized version of iOS never physically leaves the Apple campus. As many jailbreakers are familiar, firmware can be loaded via Device Firmware Upgrade (DFU) Mode. Once an iPhone enters DFU mode, it will accept a new firmware image over a USB cable. Before any firmware image is loaded by an iPhone, the device first checks whether the firmware has a valid signature from Apple. This signature check is why the FBI cannot load new software onto an iPhone on their own — the FBI does not have the secret keys that Apple uses to sign firmware.”
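Guido’s description of what the order asks for can be made concrete with a toy model. Everything below is illustrative: the class, the delay schedule, and the wipe threshold are stand-ins rather than Apple’s implementation (though iOS does offer an erase-after-10-failed-attempts setting). The point is simply that stripping out the delays and the wipe turns an impossible search into a trivial one:

```python
class PasscodeLock:
    """Toy model of the retry policy Guido describes -- NOT Apple's code.
    Stock behavior: escalating delays after repeated failures and an
    optional wipe after 10 wrong guesses."""

    def __init__(self, passcode, enforce_delays=True, wipe_after=10):
        self._passcode = passcode
        self.enforce_delays = enforce_delays
        self.wipe_after = wipe_after
        self.failed = 0
        self.wiped = False

    def delay_seconds(self):
        # Rough stand-in for the real escalation schedule.
        if not self.enforce_delays or self.failed < 5:
            return 0
        return [60, 300, 900, 3600][min(self.failed - 5, 3)]

    def try_code(self, guess):
        if self.wiped:
            return False
        if guess == self._passcode:
            return True
        self.failed += 1
        if self.wipe_after is not None and self.failed >= self.wipe_after:
            self.wiped = True  # key material gone; data unrecoverable
        return False


def brute_force(lock):
    """Try every 4-digit code, tallying the delay the device would impose."""
    waited = 0
    for n in range(10000):
        waited += lock.delay_seconds()
        if lock.try_code("%04d" % n):
            return "%04d" % n, waited
        if lock.wiped:
            return None, waited
    return None, waited


# Stock policy: the device wipes after 10 guesses, long before success.
print(brute_force(PasscodeLock("7391")))
# "FBiOS"-style policy: no delays, no wipe -- every code can be tried.
print(brute_force(PasscodeLock("7391", enforce_delays=False, wipe_after=None)))
```

With the stock policy the search dies after ten guesses; with both protections disabled, all 10,000 four-digit combinations can be run through the FBI’s proposed external connection in short order. That is why the signature check Guido highlights matters: without Apple signing such a firmware image, nobody else can put the device into this weakened state.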
However this ultimately plays out, know that this is the definitive tech story of the week. It points directly to the growing friction between free, unencumbered communication in a technological society and the government’s role in enforcing the law.
Let’s see how it goes.