The shocking and brutal attack by two terrorists on a San Bernardino County health department holiday party last year left 14 people dead, dozens wounded and countless Americans feeling more threatened than they have in years. Now, federal lawmakers are considering a bill that responds to one of the threats exposed by the San Bernardino shooters: the ability of terrorists and criminals to conceal evidence on their smartphones so effectively that even the phone manufacturer can't decrypt it.
Unfortunately, the draft bill by Sens. Dianne Feinstein (D-Calif.) and Richard Burr (R-N.C.), the leaders of the Senate intelligence committee, proposes a solution that would inevitably make Americans more vulnerable to hackers, identity thieves and other online predators. That's not to say that the increasing use of unbreakable encryption techniques isn't a problem for law enforcement — it clearly is. But lawmakers need to stop thinking of encryption as the enemy, and to start looking for other ways to identify threats and gather evidence.
At issue is what role tech companies such as Apple and Google should play in helping law enforcement agents extract information from the devices they build. Over the years, these companies have complied numerous times with court orders demanding suspects' data, including information stored on password-protected devices. In 2014, however, both companies changed the operating systems for their mobile devices to give users more protection against hackers and, in more repressive countries, government surveillance, while limiting their own ability to assist U.S. investigators.
Today, new Apple and Google devices automatically encrypt the data entered into them, and the companies don't hold back-up copies of the encryption keys. Only the person using the device knows how to unlock it. And to make the encryption more difficult to crack, Apple programmed its software to erase all of a device's data if the password is entered incorrectly 10 times.
The FBI has tried in court to force Apple to make it easier for investigators to hack into the devices they've seized, and that legal battle is ongoing. But as Apple updates its software, it is making it increasingly difficult to remove the data protections it's providing. That trend has alarmed top law enforcement officials, who have pressed Congress to ensure that no evidence is beyond the reach of an investigator armed with a warrant.
To that end, the draft bill by Feinstein and Burr would require device makers, software developers and communications companies (including Internet service providers) to make sure they can decrypt any information that their technology encrypts. Apple and Google would also have to ensure that their app stores offered only products that met this requirement.
On the plus side, the bill would not tell companies how to comply. Unlike previous, ill-conceived proposals, it would not mandate that companies store copies of their users' decryption keys, or that they build in back doors for investigators.
That's of little comfort, however, because the measure effectively mandates the use of privacy and security technologies that are weak enough to be defeated. Supporters say there's nothing inherently wrong with that approach; after all, banks, hospitals and other heavily regulated industries have to meet stringent privacy requirements while retaining the ability to disclose user data to investigators. But those businesses store data on closed, private networks that they can gird against attack; Apple and Google can't control how their users connect to the Internet, what sites they visit or even what software they load on their phones.
Barring legitimate tech companies from offering ever more secure products and services will only leave less tech-savvy Americans more vulnerable to the seemingly incessant attacks by malefactors online, while pushing more sophisticated users — and terrorists and criminals — to use strong encryption from foreign companies outside the reach of U.S. courts.
A better approach would be to clarify when courts may compel suspects to unlock devices or turn over decrypted data. More broadly, the government needs to work more closely with tech companies to develop new types of leads from the massive amount of data collected routinely from Internet users. With more devices connecting to the Internet every day, that data trove is growing even as encryption makes some potential evidence inaccessible to investigators. Policymakers should focus on what's available without encryption rather than weakening the security of Americans' devices.