Apple Inc. refused to give the FBI software the agency desperately wanted. Now Apple is the one that needs the FBI’s assistance.
The FBI announced Monday that it managed to unlock an iPhone 5c belonging to one of the San Bernardino shooters without the help of Apple. And the agency has shown no interest in telling Apple how it skirted the phone’s security features, leaving the tech giant guessing about a vulnerability that could compromise millions of devices.
“One way or another, Apple needs to figure out the details,” said Justin Olsson, product counsel at security software maker AVG Technologies. “The responsible thing for the government to do is privately disclose the vulnerability to Apple so they can continue hardening security on their devices.”
But that’s not how it’s playing out so far. The situation illuminates a process that usually takes place in secret: Governments regularly develop or purchase hacking techniques for law enforcement and counterterrorism efforts, and put them to use without telling affected companies.
Now that the FBI has dropped its case against Apple, there’s a new ethical dilemma: Should tech companies be made aware of flaws in their products, or should law enforcement be able to deploy those bugs as crime-fighting tools?
It’s unclear whether the FBI’s hacking technique will work on other versions of the iPhone, though a law enforcement official who spoke on the condition of anonymity said its applications were limited.
Some news outlets citing anonymous sources have identified Israeli police technology maker Cellebrite as the undisclosed third party helping the government, but neither the company nor the FBI has confirmed those reports.
A source not authorized to discuss the case publicly told The Times the FBI was provided with the ability to incorrectly guess more than 10 passcodes without permanently rendering the phone’s data inaccessible. That allowed the agency to use software to run through potential passcodes until it landed on the correct one. It is not clear what information, if any, was gleaned from the phone.
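The approach described above is a classic brute-force attack: once the retry limit is out of the way, software simply tries every possible passcode in sequence. A minimal illustrative sketch, in which `check_passcode` is a hypothetical stand-in for the device’s verification routine (not any real iPhone interface):

```python
from itertools import product

def crack_passcode(check_passcode, length=4, digits="0123456789"):
    """Try every numeric passcode of the given length until one verifies.

    check_passcode is a hypothetical stand-in for the device's
    verification routine. On a real iPhone, repeated failures trigger
    delays or erase the data -- the safeguard the FBI's technique
    reportedly bypassed, making this exhaustive search feasible.
    """
    for attempt in product(digits, repeat=length):
        candidate = "".join(attempt)
        if check_passcode(candidate):
            return candidate
    return None  # exhausted the search space without a match

# Demo against a simulated lock screen:
secret = "7351"
found = crack_passcode(lambda guess: guess == secret)
print(found)
```

A four-digit code has only 10,000 possibilities, which is why removing the guess limit effectively defeats it in seconds.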
The FBI could argue that the most crucial information is part of a nondisclosure agreement, solely in the hands of the outside party that assisted the agency, or cannot be released until the investigation is complete.
Many experts agree that the government faces no obvious legal obligation to provide information to Apple. But authorities, like professional security researchers, have recognized that a world in which computers are crucial in commerce and communications shouldn’t be riddled with technical security flaws.
Even the White House’s cybersecurity coordinator has acknowledged there are times when more people could be harmed by an unfixed security issue than helped by the government covertly using the loophole as part of an investigation.
A secretive White House-led procedure governs whether companies get notified of potential flaws.
Officials involved in the multi-agency deliberations — called the Vulnerabilities Equities Process — consider the risks and rewards of keeping flaws secret, according to federal records. They weigh whether the government could get the information in some other way and how likely it is someone else will discover the same vulnerability.
Federal officials have maintained that they lean toward private disclosure of a newly discovered vulnerability in the majority of cases.
The National Security Agency, though it denies the claim, reportedly took advantage of a flaw in the way websites transmit sensitive data for two years before private researchers uncovered the issue in 2014. Attorneys in two other cases have accused the FBI of using bugs in the Tor Internet browser to identify suspected criminals.
Apple’s anxiety is understandable. No tech company wants a major security gap in its products — and most are given months of warning to fix issues before they are made public by the researchers who discover them.
That’s why Apple believes the government has a moral obligation to disclose the details of its hacking technique.
“Apple’s best chance is to make a compelling case that the disclosure of this exploit is in the interest of national security, as in, if it remains undisclosed and undiscovered, it potentially puts innocent users at risk of data breach,” AVG’s Olsson said.
Apple stated in court filings that part of the reason its executives feared developing software to circumvent iPhone security features was that once created, it could end up in the wrong hands. That same argument could come into play with the disclosure issue if Apple makes a public plea that the government and the outside group can’t properly safeguard the technique. Last year, an Italian company that bought and sold bugs saw its entire database leaked onto the Internet. The security issue could explain why the FBI and the outside party are being so secretive about the process.
The San Bernardino situation changes the dynamics, providing a reason for “cybercriminals and amateur hackers to come out of the woodwork,” said Peter Tran, a general manager at RSA’s advanced cyber defense group.
Although someone helped the FBI crack the iPhone, probably in exchange for money, other people who stumble upon the same hacking technique could choose to sell it to cyberthieves or other governments. An extensive underground online network, concentrated in Eastern Europe, does just that every day, said Kevin Bocek, a vice president at cybersecurity firm Venafi.
Apple generally doesn’t reward bug-finders with cash. But given the publicity in this instance, experts said Apple could turn to the black market too.
“It proves once again that what you don’t know, you can buy,” said Nikias Bassen, principal mobile security researcher at Zimperium.
Times staff writers James Queally and Richard Winton contributed to this report.