That computer you carry around in your pocket is a treasure trove of personal data, ripe for a hacker to swoop in and steal your identity. Fortunately, today's smartphones come with encryption by default, which scrambles this data and makes it harder for outsiders to get at it. It's a clever bit of technology, but one that has its detractors.
How It Works
When most people discuss "cell phone encryption," they usually mean full-device encryption, which protects everything stored on the phone. But there are other types to be aware of: WhatsApp and Telegram are just two apps that offer encrypted messaging, meaning eavesdroppers can't read messages as they travel between phones.
With device encryption, stored data is scrambled and unreadable to anyone else; only when the passcode is entered is the data revealed. This is why your phone may not reconnect to previously saved Wi-Fi networks until you enter the passcode. It also means that someone couldn't pull the storage chips out of your switched-off phone and read the data directly.
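To make the idea concrete, here is a toy sketch in Python of how a single key both scrambles data and reveals it again. This is purely illustrative: real phones use hardware-accelerated AES, not a hash-based keystream, and every name below is an assumption for demonstration.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Toy keystream: hash the key together with a counter until we have
    # enough bytes. (Illustration only; not a production cipher.)
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream scrambles plaintext; XORing again reverses it.
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

secret = b"contacts, photos, messages"
key = hashlib.sha256(b"placeholder-device-key").digest()

ciphertext = xor_cipher(key, secret)      # what sits on the flash chips
plaintext = xor_cipher(key, ciphertext)   # recoverable only with the key
```

Without `key`, the `ciphertext` bytes are meaningless, which is exactly why desoldering the chips from a powered-off phone gets an attacker nothing.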
Each iPhone ships with an encryption key that's 256 bits in length. The key itself isn't stored anywhere: the phone combines the entered passcode with data in the Secure Enclave chip to generate it. This chip is also where fingerprint data and Apple Pay credit card information are kept. When a user enters their passcode after restarting the phone, the phone unlocks and decrypts the device data at the same time. The iPhone also throttles repeated guesses (and, if the owner enables the option, wipes the data after too many failed attempts) to stop hackers from brute-forcing their way in.
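A minimal sketch of this kind of key derivation, using Python's standard PBKDF2 as a stand-in for Apple's actual construction (which is not public). `DEVICE_SECRET` and the iteration count are placeholder assumptions; on a real phone the device secret is fused into the Secure Enclave and never leaves it.

```python
import hashlib

# Placeholder for the unique per-device secret held in secure hardware.
DEVICE_SECRET = b"\x8f" * 32

def derive_key(passcode: str, device_secret: bytes = DEVICE_SECRET) -> bytes:
    # Entangle the passcode with the device secret via a slow key
    # derivation function, so guesses must run on this device and each
    # attempt costs real time.
    return hashlib.pbkdf2_hmac(
        "sha256", passcode.encode(), device_secret,
        iterations=200_000, dklen=32,
    )

key = derive_key("1234")
other = derive_key("1235")
```

Because the derivation mixes in a secret that only the phone's hardware knows, even a correct passcode guess made on another machine produces the wrong key, and the deliberately slow function makes on-device brute force painful.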
Android is slightly more complicated. Because Android runs on a huge variety of smartphones, some too slow to encrypt without hurting performance, Google did not require manufacturers to turn on encryption by default until version 6.0 Marshmallow. These days, if a phone ships with that version or later, it must be encrypted out of the box.
Google's implementation varies by manufacturer. Some phones use a key-generation system similar to Apple's, where the phone asks for the passcode during boot before any information can be accessed. Others use something more flexible called file-based encryption, which allows different files to be decrypted at different stages, so essentials like alarms are accessible before the passcode is entered.
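The idea behind file-based encryption can be sketched as two storage classes with separately derived keys. The class names, secrets, and derivation scheme below are illustrative assumptions, not Android's real implementation:

```python
import hashlib

# Illustrative sketch of two storage classes:
# - a "device" class whose key is available as soon as the phone boots,
#   so things like alarms work before the first unlock
# - a "credential" class whose key can only be derived from the passcode
BOOT_SECRET = b"\x11" * 32  # placeholder for a hardware-held boot secret

def device_class_key() -> bytes:
    # Available at boot, no passcode needed.
    return hashlib.sha256(b"device-class" + BOOT_SECRET).digest()

def credential_class_key(passcode: str) -> bytes:
    # Derivable only once the user has entered the passcode.
    return hashlib.pbkdf2_hmac(
        "sha256", passcode.encode(), BOOT_SECRET, 100_000
    )

def file_key(class_key: bytes, filename: str) -> bytes:
    # Each file gets its own key derived from its class key.
    return hashlib.sha256(class_key + filename.encode()).digest()

alarm_key = file_key(device_class_key(), "alarms.db")          # pre-unlock
photos_key = file_key(credential_class_key("1234"), "photos.db")  # post-unlock
```

Splitting keys by class is what lets a rebooted phone ring its morning alarm while your photos and messages stay locked until the passcode is entered.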
The switch to mandatory encryption has been a success: a May report showed that 80 percent of Android Nougat users are running fully encrypted devices.
Why It’s Controversial
Encryption has been a source of contention with law enforcement agencies, who argue that smartphone makers should provide "backdoors" to allow access. Apple's encryption made headlines when the Federal Bureau of Investigation was unable to get into the iPhone 5C used by one of the San Bernardino shooters in December 2015. This month, officials announced they were unable to access the phone of Devin Patrick Kelley, who killed 26 people in a Texas church massacre.
“Law enforcement, whether at the state, local or federal level, is increasingly not able to get into these phones,” Christopher Combs, the FBI special agent in charge of the investigation, told reporters at a press conference.
Tim Cook, Apple’s CEO, has strongly defended the use of cell phone encryption on devices, citing consumer privacy as a big concern.
“In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes,” Cook said regarding the San Bernardino case. “No reasonable person would find that acceptable.”
Is It Perfect?
Apple refused to comply with the agency's demands in the San Bernardino case. The Department of Justice took Apple to court, but dropped the case after the FBI paid an outside party to unlock the phone. That suggests the iPhone 5C is hackable, but without analysis of the tool, it's unclear whether the technique works on other models or brands.
Like any form of security, it’s important to assume that encryption is not foolproof, and to take extra steps to keep data safe.