The FBI's declaration Monday that it could hack into an iPhone without Apple's help brought the San Bernardino legal standoff to an abrupt end.
Though a momentary reprieve for Apple and its peers, the tech industry's reaction to the FBI's decision contained more warning than celebration.
Perhaps with good reason: The FBI's move to dismiss legal actions against Apple in the investigation of the San Bernardino attack does little to settle the heated back-and-forth between law enforcement agencies seeking to expand their crime-fighting toolbox and tech firms fearful of being compelled to work at the behest of the government, executives and experts said.
"This entire experience has shown we need to have much broader conversation around the policy, regulation and laws in a digital world and what does it mean to have secure technology," Aaron Levie, chief executive of online storage provider Box Inc., said in an interview.
Denelle Dixon-Thayer, chief legal and business officer at software firm Mozilla, said in a statement that nothing had changed as far as the "need to have the broader discussion of what limits should be placed on law enforcement's ability to compel assistance from tech companies."
Since the fight between Apple and the FBI became public in early February, cybersecurity experts have repeatedly said that the FBI could find a way into the gunman's iPhone 5c on its own given enough resources and time.
"Software is so huge, especially an operating system, you can never make it 100% secure," said Will Strafach, who runs software firm Sudo Security Inc. "Things are just too insecure to argue the only way in is through the manufacturer."
Now, tech officials have a clear-cut example proving that argument, experts said.
"Courts that hear claims that the FBI can't break into a cellphone will receive those claims more skeptically," said Gregory T. Nojeim, senior counsel at the Center for Democracy & Technology. "Congress will be more hesitant than it already was to enact legislation requiring Apple to build in a back door. I think that idea might well be taken off the table now."
Some members of Congress are already seeking to create a National Commission on Digital Security that would include tech executives, privacy advocates, law enforcement officials and academics.
Apple said Monday that it "remains committed to participating" in "a national conversation about our civil liberties, and our collective security and privacy."
Without national consensus on the limits of law enforcement, disputes like the one in San Bernardino "will literally happen thousands of more times over the next many decades," Levie said.
If anything, the experts and executives said, Monday's news raises new questions about how far law enforcement can go to hack into phones.
What's "appropriate" and what "should be out of bounds?" Nojeim said. "What's the FBI going to do with the [hacking technique] that it's just developed?"
Strafach called it problematic for Apple that the FBI holds a way to crack security measures of at least one specific iPhone -- the work device of San Bernardino terrorism attack gunman Syed Rizwan Farook. No company wants its device to be susceptible to hacking. But the existence of that flaw is the "lesser of two evils" when compared with what could have been a potential judicial order forcing Apple to develop and deploy software against its will.
Ordering tech companies into "forced labor" would have set a precedent the FBI could have replicated in case after case, Strafach said. But the agency will have to be more cautious with its new hacking technique because unfettered use could end up exposing the details of the method, he said.
"The nice thing is by nature it has to be carefully used and in situations where it's absolutely needed," Strafach said. "If they don't keep it guarded, it will get out there and Apple will be able to fix it."
It's unclear when, or if, the day will come when most tech products no longer have relatively easy-to-exploit security holes; Strafach put that day at least 20 years away.
But the "broader discussion" needs to confront that insecurity as people demand greater protection, experts said.
"The industry isn't spending money [on improving security] and prefers to simply apologize when its products are hacked or plead ignorance and blame the users," said Vivek Wadhwa, a corporate governance fellow at Stanford University. "This must change and we must hold tech companies liable for their inferior products."