The tussle between Apple and the Federal Bureau of Investigation ratcheted up a notch Thursday, with Apple telling a federal magistrate that she violated the company's constitutional rights by ordering it to write software that would enable the FBI to hack into a terrorist's locked iPhone. The tech Goliath is defending the right principle here: that law enforcement agencies shouldn't be able to compel companies to weaken data protections in a way that could put all their customers at risk. But there are legitimate interests on both sides, and the courts aren't the right venue to balance them. That's Congress' job.
That's why it makes sense for Congress to bring together experts from the worlds of tech and law enforcement to search for an alternative to orders such as the one Apple is fighting, as House Homeland Security Committee Chairman Michael McCaul (R-Texas) and Sen. Mark Warner (D-Va.), a member of the Senate Intelligence Committee, are expected to propose early next week. Their legislation would create a commission to find ways to address technological barriers to law enforcement and national security without making personal and business data more vulnerable to hackers.
Good luck with that. Even now, Apple is reportedly working on a new version of its operating system that would make its iPhones even harder to break into, no matter what a court orders. But just because there's no easy solution doesn't mean Congress shouldn't try to find a better approach. To date, the security issues have seemed maddeningly like a zero-sum game: either create unbreakable data protections and allow terrorists to keep their machinations secret, or let law enforcement agencies decode anything they intercept and make everyone else more vulnerable to identity theft, corporate espionage and foreign hackers. It's well worth the effort to look for another way.
If Congress creates the panel, its members should bear in mind that technology is exposing people to more surveillance than ever before, including people who use encryption routinely. From the metadata collected by wireless companies and the cookies generated by websites to the GPS data tracked by mobile networks and app providers, investigators have access to an unprecedented amount of information that's being collected quietly and constantly. It's also crucial that tech and privacy advocates and law enforcement agents be equally represented on the panel, and that they reach a consensus on any recommendation they might make.
The dispute between Apple and the FBI involves a phone used by Syed Rizwan Farook, one of the two attackers who killed 14 people in San Bernardino in December. The iPhone's owner — San Bernardino County — has given the FBI permission to search it, but it doesn't have Farook's passcode. Last week, U.S. Magistrate Sheri Pym ordered Apple to create and install an update on the phone that would remove the software that keeps the FBI from rapidly entering different passcodes until the device is unlocked. Meanwhile, the FBI has asked other courts to require Apple to help it recover data from at least seven other locked iPhones seized across the country.
Apple sought to vacate Pym's order Thursday, arguing that it violated the company's constitutional rights to free speech and to not be conscripted into undermining a key feature of its own products. The latter argument points to an especially troubling aspect of Pym's ruling: setting a precedent for the government to require companies to alter their products or invent new ones to help it gather evidence. That's far too much power to hand over to the government.
Some of Apple's critics — including Justice Department lawyers — argue that the company is putting its marketing needs and its customers' privacy ahead of national security. But they're missing the bigger picture here: how exposed data has been, and continues to be, to online villainy. If courts force Apple to create a program weakening its security in order to help the FBI unlock Farook's phone, the company will soon have to do so for hundreds of phones that agencies have seized, making it all the more likely that such programs will fall into malicious hands. And the precedent set in Apple's case could soon be used to pull down the safeguards installed on any connected device that might have access to data that investigators want to collect — say, a smart TV with a built-in microphone that can be turned on remotely.
The FBI says it may find clues on Farook's iPhone to other, unknown terrorists. But it's even more likely that forcing companies to defeat their own security measures will lead other governments to seek tech companies' help in identifying dissidents or disfavored minorities, for whom unbreakable encryption may literally be a lifesaver. Worse, it undermines the administration's efforts to get consumers and businesses to do more to guard their data against prying eyes. And if anyone should understand how badly this country is doing when it comes to cybersecurity, it's an administration that had thousands of sensitive and secret personnel records stolen from its own computers.