Editorial: The FBI wants Apple to pry into your iPhone

A judge has ordered Apple to provide the FBI with software to bypass the security in the iPhone 5C used by Syed Rizwan Farook.

(Carolyn Kaster / Associated Press)

Eleven weeks after the terrorist attack that left 14 dead in San Bernardino, the Federal Bureau of Investigation is still trying to answer some nagging questions about the actions and motives of the shooters, Syed Rizwan Farook and his wife, Tashfeen Malik. As it happens, investigators have Farook’s iPhone 5C, which belonged to his employer, the San Bernardino County Department of Public Health, as well as the department’s permission to examine it. What they don’t have is Farook’s passcode, which he took to his grave in a shootout with law enforcement officers on Dec. 2. Nor, for that matter, does Apple, which decided in 2014 to stop storing copies of its customers’ passcodes to make them less accessible to hackers.

Without the passcode, it’s virtually impossible for the feds to decrypt the data stored on the phone: the passcode not only unlocks the device but also serves as a crucial ingredient in the decryption key. So attorneys at the Justice Department asked Apple to invent the technology needed to circumvent the security features on Farook’s device. Apple refused, and on Tuesday a federal magistrate in Riverside ordered the company to do so.
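To make that point concrete, here is a minimal sketch, in Python, of how a passcode-derived encryption key might work. It is an illustration under assumed names and parameters, not Apple’s actual implementation; the real design similarly entangles the passcode with a key fused into the phone’s hardware, which is why guesses can only be tried on the device itself.

import hashlib
import os

# Stand-in for a secret key burned into the phone's silicon (an assumption for
# illustration; the real device-bound key never leaves the hardware).
DEVICE_UID = os.urandom(32)

def derive_data_key(passcode: str) -> bytes:
    # Stretch the passcode together with the device-bound key into a data key.
    # The high iteration count makes each individual guess deliberately slow.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000, 32)

# Without the correct passcode there is no way to reproduce the key,
# and therefore no way to decrypt what it protects.
key = derive_data_key("1234")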

That order, if upheld, would dangerously extend the government’s power over private industry, establishing a precedent for courts to require companies to create features that serve the government’s interests, not the public’s. And it’s hard to see where that new authority might end. Apple has hinted strongly that it will appeal the order, as well it should.

The tussle over Farook’s iPhone is the latest in an escalating series of fights between federal law enforcement agencies and tech companies over the spread of strong encryption features on smartphones, computers and other digital devices. FBI Director James Comey has warned repeatedly that selling devices that scramble data automatically will only help terrorists and criminals go “dark,” masking the digital footprints they used to leave for police and prosecutors. Apple, Google and other tech companies have pushed back, arguing that more people should be encrypting their data routinely to protect themselves against the scourge of hackers and identity thieves.

Much of the debate has revolved around the question of whether the feds should be guaranteed a way to unscramble the data on the encryption-enabled devices they seize. Without that ability, Comey says, plots that might otherwise have been foiled will instead be carried out, and eventually people will die as a result. But in the minds of tech company executives and their allies, there’s no way to give the government a “back door” into locked devices without providing a way in for hackers and foreign governments.

Agents have gathered plenty of evidence in the San Bernardino case, including data from Farook’s phone that he copied to an online account on Apple’s iCloud backup service. But they say Farook apparently turned off the automatic backups about a month and a half before the shooting, and they contend that data hidden on the phone could reveal more about the attackers’ communications (possibly revealing other plotters) and movements during that period.

On older iPhones such as Farook’s, a passcode may be entered incorrectly nine times; after the 10th wrong entry, the device erases all the data the user has stored on it. At the FBI’s request, Magistrate Judge Sheri Pym ordered Apple to create a program that would trick Farook’s phone into accepting what appeared to be an official software update but was in fact code allowing the FBI to rapidly enter as many passcodes as needed to find the correct one.
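A rough calculation shows why stripping away the retry limit matters. The figure below is an assumption for illustration, based on Apple’s statement that the on-device key derivation is tuned to take roughly 80 milliseconds per attempt; the point is simply that a short numeric passcode offers almost no resistance once unlimited rapid guessing is allowed.

# Back-of-the-envelope sketch, not FBI or Apple code. Assumes ~80 ms per guess,
# the commonly cited cost of the on-device key derivation.
GUESS_TIME_SECONDS = 0.08

for digits in (4, 6):
    combinations = 10 ** digits
    worst_case_hours = combinations * GUESS_TIME_SECONDS / 3600
    print(f"{digits}-digit passcode: {combinations:,} guesses, "
          f"about {worst_case_hours:.1f} hours at worst")

On those assumptions, a four-digit code falls within minutes and a six-digit code within a day; only longer alphanumeric passcodes make exhaustive guessing genuinely impractical.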

Such a program would be incredibly valuable to mercenary hackers and foreign governments, which could adapt it to use on any older iPhone. China, in fact, toyed with requiring companies to provide the keys to any encrypted content on the devices they sold, but backed off in the face of international blowback. If the United States compelled Apple to hack an iPhone, it would set a precedent that China and other repressive regimes would surely follow.

And if the government could force the creation of technology to decrypt a device, what other capabilities might a court require companies to provide in the name of law enforcement or national security? Could a court order a computer maker to enable FBI agents to turn on a laptop’s webcam remotely to search for a suspect? Could it demand the creation of software that automatically and surreptitiously sends agents a copy of any text entered into a smartphone? Bear in mind that once those capabilities are created, it’s only a matter of time before hackers find a way to use them for their own malicious ends.

At least one security expert who’s worked on the iPhone says that it’s technically possible to do what the judge has ordered on an iPhone 5C but not on newer models. So as those models proliferate, the courts may not be able to force Apple to do what the FBI is demanding for Farook’s phone. Unless Pym’s ruling is reversed, however, a precedent will be set for governments around the world, and potential back doors opened for the hackers Apple and its rivals are trying to block.
