As hackers prove time and again that they can and will invade our digital lives, Apple Inc. has strengthened its security system to make its services nearly impossible to penetrate — even for top cops.
Those seemingly airtight protections are great for the company's millions of customers, and rival device makers have rushed to emulate Apple. But as tech companies build virtual fortresses, authorities are mounting a battle to make sure the tech industry doesn't completely shut them out — as it contends Apple has done by making its iPhone impossible for the FBI to crack.
At the heart of the issue is encryption, a way to secure a digital file by scrambling its contents so that it can be read only by someone who has the key. Tech firms are increasingly encrypting their software, and Apple has been at the forefront.
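As a toy illustration of that scrambling — not Apple's actual cryptography, which is far more sophisticated — a short script can show the basic idea: without the key, the ciphertext is gibberish; with it, the original is recovered exactly.

```python
import secrets

# Toy illustration only: XOR a message with a random one-time key.
# The ciphertext is unreadable without the key; XORing with the
# same key a second time reverses the operation exactly.
def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))   # random key, same length as message

ciphertext = xor_bytes(message, key)      # scrambled contents
recovered = xor_bytes(ciphertext, key)    # readable again, key in hand

assert recovered == message
```

Real systems use vetted ciphers such as AES rather than a one-time XOR, but the principle is the same: the data is only as accessible as the key.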
But sealing off customers' personal information protects everyone — the good guys and the bad guys alike. That's uncharted territory for tech companies, government agencies and consumers, leaving everyone struggling to figure out how far, exactly, encryption protections should extend.
"We need privacy and security, and frankly Apple has done a better job than most," said Mark Mollineaux Pollitt, adjunct professor at Syracuse University and former director of the FBI's Regional Computer Forensic Laboratory Program. "We shouldn't punish them for doing that. We should find a way to broaden that and make that more effective, but we have to realize there are instances where we have to breach that security to protect all of us."
But not everyone sees it as a gray area, where exceptions can be made for extreme cases like terrorism or child pornography.
This week, Apple Chief Executive Tim Cook took a defiant stance, saying his company would fight a court order in the San Bernardino terror investigation that asks the company to develop, for the first time, software that would allow authorities to circumvent the passcode on the encrypted phone.
Apple's decision isn't without critics, who say courts should be the arbiter of where the line is drawn.
"Apple is obstructing the course of law enforcement and effectively aiding terrorists," said Vivek Wadhwa, a corporate governance fellow at Stanford University. "They changed the technology, so they have to keep up with the ability to unlock the device if the government asks them to do it. That's not unreasonable."
As it is, the FBI is publicly admitting that it is locked out, said Jeff Kelley, an iOS developer at software firm Detroit Labs who builds apps for iPhones, iPods, iPads and Mac OS.
"If you're Apple, you couldn't ask for a better ad for iPhone encryption."
That's a nightmare for the FBI.
The agency wants to retrieve whatever resides on the iPhone 5c of Syed Rizwan Farook, one of the two slain shooters who killed 14 people in the Dec. 2 attack.
But the smartphone's iOS operating system is locked by a numeric passcode, likely four digits long. The FBI potentially has only 10 guesses before the phone's contents self-destruct.
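The arithmetic behind the FBI's predicament is simple. Assuming the figures above — a four-digit numeric code and roughly 10 tries before the wipe — a back-of-the-envelope calculation looks like this:

```python
# Back-of-the-envelope numbers based on the figures in the article:
# a 4-digit numeric passcode and about 10 tries before the data is erased.
digits = 4
passcode_space = 10 ** digits          # 10,000 possible codes
tries_before_wipe = 10

# Chance of hitting the right code before the phone erases itself,
# assuming each guess is a distinct code chosen blind.
odds = tries_before_wipe / passcode_space
print(f"{passcode_space} possible codes, {odds:.1%} chance in {tries_before_wipe} tries")
```

That works out to a 0.1% chance of success before the evidence destroys itself, which is why unlimited guessing is the capability the government wants.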
Older versions of the operating system provided ways for Apple and even law enforcement to access at least some contents on the phone, even if it was password-protected. For example, older iPhone models were susceptible to unlimited password guessing. In other cases, Apple held a master key, and authorities could ship the company an iPhone and get a DVD or hard drive back with the data from it.
But Apple, in effect, threw away its master key when it deployed a new version of iOS in 2014. Farook's phone runs one of the newer iOS versions.
Adding to the FBI's problems is that Apple is judge, jury and executioner when deciding what software runs on an iPhone: An app or program won't work without a special signature from Apple.
The set-up is aimed at stopping viruses or malware from infecting iPhones, and it also gives Apple latitude to ban apps it doesn't want to support, including for competitive or cultural reasons.
That's significantly more control than Google exercises over smartphones running its open-source Android operating system. Many Android phones support so-called unsigned programs, providing one of the key doors through which law enforcement has been able to extract data from locked phones.
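The gatekeeping logic can be sketched in simplified form. Apple's real scheme uses asymmetric certificates issued by the company, not a shared secret; the HMAC-based stand-in below is an illustration only, showing why an unsigned or tampered-with program is simply refused.

```python
import hashlib
import hmac

# Simplified analogue of code signing: the platform holds a signing key
# and refuses to run any program whose signature doesn't match.
# (Apple's real system uses asymmetric certificate chains, not an HMAC.)
PLATFORM_KEY = b"platform-signing-secret"

def sign(program: bytes) -> bytes:
    """Produce the platform's signature over a program binary."""
    return hmac.new(PLATFORM_KEY, program, hashlib.sha256).digest()

def will_run(program: bytes, signature: bytes) -> bool:
    """The OS check: run only if the signature verifies."""
    return hmac.compare_digest(sign(program), signature)

app = b"legitimate app code"
good_sig = sign(app)

print(will_run(app, good_sig))            # properly signed: allowed
print(will_run(b"tampered code", good_sig))  # modified binary: refused
```

Because only Apple can produce a valid signature, even the FBI cannot load its own passcode-guessing software onto the phone — which is precisely why the court order compels Apple itself to write it.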
About 80% of the world's smartphones run Android, and about 15% run iOS, according to various estimates.
In general, hardware makers have been making stricter security settings a default on their devices. But while those measures have deflected thieves, they have left room for law enforcement to acquire data when needed.
Not so with Apple. Besides encryption, the company in 2013 introduced Touch ID, a fingerprint scanner that allows users to unlock their phones by pressing their fingers on the device's home button.
This week's court order requires the FBI and Apple to work in tandem to develop a tool that preserves the data on Farook's phone while allowing an app devised by the FBI to input an unlimited number of passcodes until it guesses the right one.
"That has never before been seen," said Kevin Bocek, vice president of threat intelligence and security strategy at cybersecurity company Venafi. "Apple has very aggressively maintained security, and this is the way the government is going to get around it."
Generating the new software for the FBI wouldn't be trivial, but it's certainly doable, experts said.
Apple would need to make the special code run on the iPhone's short-term memory to ensure it doesn't tamper with photos, text messages and other potentially critical evidence, said Dan Guido, chief executive of security start-up Trail of Bits.
The code would get rid of the barriers that normally arise when someone tries too many times to guess a user's passcode. Finally, it would have to include a mechanism for automated guessing, freeing the FBI from manually entering potentially tens of thousands of numeric combinations.
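With the delay and wipe barriers stripped away, the automated guessing itself is trivial — a loop over the whole four-digit space. The sketch below is purely hypothetical: `check_passcode` stands in for the verification iOS performs on-device, and the secret code is invented for illustration.

```python
from itertools import product

# Hypothetical stand-in for the on-device passcode check. In reality iOS
# performs this itself, with escalating delays and a wipe after too many
# failures -- the barriers the court order asks Apple to disable.
SECRET = "7468"

def check_passcode(guess: str) -> bool:
    return guess == SECRET

def brute_force():
    """Walk all 10,000 four-digit codes until one verifies."""
    for combo in product("0123456789", repeat=4):
        guess = "".join(combo)
        if check_passcode(guess):
            return guess
    return None

found = brute_force()
print(found)  # recovers the code in at most 10,000 attempts
```

At even a modest rate of attempts per second, exhausting a four-digit space takes minutes — which is why the artificial delays and the wipe limit, not the passcode itself, are the real defense.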
Even though Apple is technically capable of carrying out the FBI's order, the company and its supporters reject the notion that this would be a one-time thing.
Jonathan Zdziarski, one of the top experts on iPhone security, said the work doesn't end there. If Apple ends up creating a tool, it would need to be tested, including by outside forensic specialists, to stand up to legal scrutiny if evidence retrieved from the phone is ever used in court. That vetting process could drag on for months and risks exposing the tool to people with malicious intent.
There's fear that once a safe-cracking tool is developed, law enforcement agencies from all over the world will repeatedly request its use.
"They've brought a phone that would be easy to justify developing a tool for — a terrorist's — but it will be much easier for a court to compel Apple to use it in the future once it's out there," Zdziarski said.
Apple could update iOS so the tool developed for Farook's iPhone wouldn't work on other devices — for example, by requiring the owner's consent to run it — but the precedent of creating the tool would be irreversible, technologists say.
"You can rationalize it, these are known bad people, this is a known domestic terrorism case and it's one iPhone," said Oren Falkowitz, chief executive of security firm Area 1 Security and a former director of technology and data science programs at U.S. Cyber Command. "But it has implications for all technologies across the globe. We have to be doing more to strengthen the security of the Internet ... or we'll suffer consequences, ... greater than whatever information might be on this one phone."