The FBI wants to hack into that Apple iPhone 5C, which was shooter Syed Rizwan Farook’s work phone from the San Bernardino County Department of Health. Farook and his wife, Tashfeen Malik, died in a firefight with police hours after the attack.
Authorities and Apple Inc. held discussions for about a month about ways to unlock the device but reached no resolution.
It is the tech giant’s policy to require law enforcement to obtain search warrants or subpoenas before aiding in investigations. That prompted the FBI to seek a court order requiring Apple’s assistance, which a U.S. magistrate granted Tuesday.
Apple Chief Executive Tim Cook’s public vow to fight the order set off a worldwide debate about whether technology providers should be able to protect users’ data so securely that law enforcement is effectively shut out from ever getting it. Senior Apple executives underscored Friday that they have no intention of backing down. Federal prosecutors and Apple also separately disclosed new details about what transpired privately in the weeks leading up to their very public legal battle this week.
Here’s a look at some of the key points in the battle.
Apple’s software deters the FBI from attempting to guess Farook’s passcode.
The FBI potentially has only 10 guesses before the phone’s contents self-destruct, and time delays are imposed between incorrect guesses to discourage rapid-fire attempts.
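Those protections can be sketched as a simple state machine. This is an illustrative sketch, not Apple's code, and the specific delay schedule below is an assumption for demonstration:

```python
# Illustrative sketch of iPhone-style passcode throttling; not Apple's code.
# The delay schedule is an assumed example, not iOS's actual values.
DELAYS = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}  # seconds imposed after Nth failure
MAX_ATTEMPTS = 10  # contents become unrecoverable after this many failures (if enabled)

def attempt(passcode, guess, failures):
    """Return (unlocked, failures, wait_seconds, wiped) after one guess."""
    if guess == passcode:
        return True, 0, 0, False
    failures += 1
    if failures >= MAX_ATTEMPTS:
        return False, failures, 0, True   # device erases its encryption keys
    return False, failures, DELAYS.get(failures, 0), False
```

The escalating waits make rapid-fire guessing impractical, and the wipe-after-ten rule is what limits the FBI to a handful of tries.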
Adding to the FBI’s problems is that Apple is judge, jury and executioner when deciding what software runs on an iPhone: An app or program won’t work without a special signature from Apple.
The setup is aimed at stopping viruses or malware from infecting iPhones, and it also gives Apple latitude to ban apps it doesn’t want to support, including for competitive or cultural reasons. This hinders the FBI from developing its own tool to get around the passcode issues. (Infographic: How the iPhone’s security measures work)
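The gatekeeping logic behind that signature requirement can be sketched conceptually. Real iOS code signing uses Apple-issued certificates and asymmetric cryptography; the HMAC below is only a stand-in to show why unsigned code won't run:

```python
# Conceptual sketch of a signed-code check; NOT Apple's actual scheme.
# HMAC stands in for Apple's certificate-based asymmetric signing.
import hashlib
import hmac

APPLE_KEY = b"apple-signing-key"  # hypothetical secret only Apple holds

def sign(binary):
    """Produce the signature only the key holder (Apple) can generate."""
    return hmac.new(APPLE_KEY, binary, hashlib.sha256).hexdigest()

def os_will_run(binary, signature):
    """The OS refuses any program whose signature doesn't verify."""
    return hmac.compare_digest(sign(binary), signature)
```

Because only Apple can produce a valid signature, the FBI cannot simply write and load its own passcode-bypass tool onto the phone.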
The FBI wants a way to easily guess Farook’s passcode.
This week’s court order requires the FBI and Apple to work in tandem to develop software that preserves the data on Farook’s phone while allowing an app devised by the FBI to input an unlimited number of passcodes until it guesses the right one.
It marks the first time Apple has been ordered to develop software that would allow authorities to circumvent the passcode on an encrypted phone. (Read the full story: Apple CEO says helping FBI hack into terrorist’s iPhone would be ‘too dangerous’)
Generating the new software for the FBI wouldn’t be trivial, but it’s certainly doable, experts said.
The code would get rid of the barriers that normally arise when someone tries too many times to guess a user’s passcode. And it would have to include a channel for automated guessing, freeing the FBI from manually entering potentially tens of thousands of numeric combinations.
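The scale of that manual effort is easy to see: a 4-digit numeric passcode has 10,000 possibilities and a 6-digit one has a million. A hypothetical automated guesser, assuming the throttling and auto-wipe were disabled as the order demands, is just a loop:

```python
# Hypothetical brute-force loop; assumes retry limits and delays are disabled.
def brute_force(unlock, digits=4):
    """Try every numeric passcode of the given length; return the match or None."""
    for n in range(10 ** digits):
        guess = str(n).zfill(digits)  # e.g. 42 -> "0042"
        if unlock(guess):
            return guess
    return None
```

With the safeguards off, exhausting every 4-digit combination would take a machine minutes at most; the security comes entirely from the limits Apple is being asked to remove.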
Tashfeen Malik, left, and Syed Rizwan Farook at O’Hare International Airport in Chicago on July 27, 2014. (U.S Customs and Border Protection)
Apple is refusing a court order because it says honoring the warrant sets a dangerous precedent.
“Up to this point, we have done everything that is both within our power and within the law to help [investigators],” Cook wrote this week. “But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create.” (Read the full story: Apple opposes order to help FBI unlock phone belonging to San Bernardino shooter)
There’s fear that once a safe-cracking tool is developed, law enforcement agencies from all over the world will repeatedly request its use.
The magistrate in the San Bernardino case directed Apple to target its software to only Farook’s phone, but swapping the unique identifier of his device for someone else’s could be simple as long as Apple lends its signature, computer forensics experts said.
Apple could update iOS to stop the tool developed for Farook’s iPhone -- if it is eventually created -- from working on other phones, or require consumer consent to run it, but the precedent of creating such a workaround is irreversible, technologists say.
“If this decision is upheld, it would mean the FBI could get a judicially mandated back door into any device to get access to its content, and it would mean a weakening of encryption in all those devices,” said Gregory T. Nojeim, director of the Freedom, Security and Technology Project at the Center for Democracy & Technology.
Apple’s decision isn’t without critics, who say courts should be the arbiter of where the line is drawn.
“Apple is obstructing the course of law enforcement and effectively aiding terrorists,” said Vivek Wadhwa, a corporate governance fellow at Stanford University. “They changed the technology, so they have to keep up with the ability to unlock the device if the government asks them to do it. That’s not unreasonable.”
Others have similar opinions. (Read the full story: In San Bernardino, where terrorists struck, residents debate FBI vs. Apple)
The FBI says it’s about this one phone. Security experts think it’s a bigger deal.
Investigators are hoping the data on the iPhone will help answer several questions that have persisted since the shooting. It remains unclear why Farook left a bag with several pipe bombs in the conference room where he and his wife opened fire, why the bombs were not detonated, or if the couple were plotting other attacks.
Additional evidence also could prove valuable in the case against Enrique Marquez Jr., a friend of Farook’s, who has pleaded not guilty to buying two rifles used in the shootings and providing material support for terrorists and other crimes.
Location data on the phone, among other pieces of information, could also help investigators answer questions about the couple’s movements during an 18-minute gap in the FBI’s timeline of their actions following the shooting.
Oren Falkowitz, chief executive of security firm Area 1 Security and a former director of technology and data science programs at U.S. Cyber Command, is among those who said the order has ramifications beyond this one case.
“We have to be doing more to strengthen the security of the Internet ... or we’ll suffer consequences ... greater than whatever information might be on this one phone,” he said. (Read the full story: Battle lines drawn over encryption as Apple rebuffs FBI)
iPhone encryption has already thwarted many cases. The Manhattan district attorney’s office said it has been locked out of 176 Apple devices, or about 26% of the 670 Apple devices its lab has examined since October 2014, according to a report from the office. The cases included homicide, sexual abuse of a child and sex trafficking.
“The result will be crimes that go unsolved, harms that go unanswered, and victims who are left beyond the protection of the law,” the report said.
Prosecutors have harshly criticized Apple, accusing it of refusing to comply to protect its brand.
Federal prosecutors contended in a motion Friday that the company was “not above the law” and could easily help the government unlock one terrorist’s smartphone without undermining anyone else’s privacy.
“Apple’s current refusal to comply with the court’s order, despite the technical feasibility of doing so, instead appears to be based on its concern for its business model and public brand marketing strategy,” prosecutors wrote in a filing asking the court to compel Apple to immediately aid the FBI.
Prosecutors on Friday also sought to downplay the issue of encryption technology in the terror case, arguing that the software the government is seeking from Apple amounts to an innocuous update. The company regularly issues updates that modify settings, the court filing said. (Read the full story: Feds slam Apple, saying it could easily help unlock iPhone and is ‘not above the law’)
With Farook dead and his employer, which owns the iPhone, consenting to the search, using the requested software would not invade anyone’s privacy and wouldn’t undermine encryption, prosecutors contend. An additional safeguard, according to the company, is that the phone would need to be in Apple’s or authorities’ possession for the proposed tool to work.
“This court should not entertain an argument that fulfilling basic civic responsibilities of any American citizen or company -- complying with a lawful court order -- could be obviated because that company prefers to market itself as providing privacy protections that make it infeasible to comply with court-issued warrants,” prosecutors said.
In addition, the federal government reiterated its stance that Apple has the means to fulfill the court order. As of Wednesday, the company was still weighing how complicated it would be to develop such a tool for the FBI.
“At no point has Apple ever said that it does not have the technical ability to comply with the order, or that the order asks Apple to undertake an unreasonably challenging software development task,” prosecutors wrote Friday. “On this point, Apple’s silence speaks volumes.”
Legal scholars say the FBI’s argument is an ‘unprecedented’ stretch of an old law.
“This is a new frontier,” said Jennifer Granick, director of civil liberties at Stanford Law School’s Center for Internet and Society. “I know of no other statutory provision that would arguably create an obligation for device manufacturers to help out the government.”
UC Irvine School of Law Dean Erwin Chemerinsky said a carefully drafted federal law giving law enforcement the right to get around encryption in certain compelling situations probably would be constitutional.
But he doubted a court could force a company to write software. “You can’t subpoena or get a warrant for something that doesn’t exist,” he said.
The foundation for the order is a 1789 law called the All Writs Act. The act, passed in the judiciary’s infancy, allowed courts to issue orders if other judicial tools were unavailable. Apple has turned over information based on the All Writs Act to law enforcement about 70 times in recent years, according to the government.
Law enforcement has relied on a 1977 Supreme Court ruling that said the All Writs Act could be used to compel New York Telephone Co. to provide technology to enable investigators to track calls being made in a gambling operation. The phone company was a heavily regulated public utility and already had the technology, key differences from the Apple case, experts said. (Read the full story: Apple-FBI fight over iPhone encryption pits privacy against national security)
The legal battle is expected to kick off next month.
The case, which will be heard in the magistrate’s courtroom next month, will then go before a federal district judge. Apple has until Feb. 26 to file its initial arguments in the case.
If appealed, the case will be heard by the U.S. 9th Circuit Court of Appeals and possibly the U.S. Supreme Court.
Apple has hired Ted Olson and Theodore J. Boutrous Jr., two of the lead lawyers who successfully challenged California’s previous ban on same-sex marriage.
They are expected to argue the order violates constitutional provisions as well as the All Writs Act and would create bad public policy. Fulfilling the government’s demand would place an unreasonable burden on the company, they could contend.
Law enforcement had easier access to iPhone data previously.
Older versions of the iPhone’s iOS operating system provided ways for Apple and even law enforcement to access at least some contents on the phone, even if it was password-protected.
For example, older iPhone models were susceptible to unlimited password guessing. In other cases, Apple held a master key, and authorities could ship the company an iPhone and get a DVD or hard drive back with the data from it.
But Apple, in effect, threw away its master key when it deployed iOS 8 in 2014. Farook’s phone runs one of the newer iOS versions.
Alternatively, the government has sometimes been able to get iPhone data through an iCloud backup.
But iCloud doesn’t store all the data on the phone, and in Farook’s case, the FBI argues he intentionally disabled the iCloud function six weeks before the shooting. Any communications during that time that may be linked to the shooting, as well as location data that might help the FBI map the movements of Farook and his wife before and after the attack, are accessible only through the phone itself, the government said.
Also a problem: Within 24 hours of the shooting rampage, the phone’s owner — possibly Farook’s employer, the San Bernardino County public health department — reset the password to Farook’s iCloud account to access data from the backup, according to Apple and federal officials.
That means the iCloud password on the iPhone itself is now wrong, and it won’t back up unless someone can get past the phone’s passcode and change it.
The issue was discovered after Apple engineers sent to Southern California to work with the FBI struggled to trigger an automatic backup, Apple said. When iCloud is enabled, iPhones automatically sync with the cloud if they are charging and are connected to a familiar Wi-Fi network. (Read the full story: Apple and feds reveal San Bernardino shooter’s iCloud password was reset hours after attack)
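Those sync conditions amount to a simple conjunction, which also shows why the password reset mattered. A sketch (function and parameter names are hypothetical; the conditions follow Apple's public description of automatic backups):

```python
# Sketch of when an iPhone automatically backs up to iCloud.
# Names are hypothetical; conditions follow Apple's public description.
def will_auto_backup(icloud_enabled, charging, on_known_wifi, password_valid):
    """An automatic iCloud backup requires all of these conditions to hold."""
    return icloud_enabled and charging and on_known_wifi and password_valid
```

Resetting the iCloud account password made the credential stored on the locked phone stale, so the last condition can no longer be satisfied without first getting past the passcode.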
Apple executives and security experts also are unsure about whether Farook disabled the backup function. Among the possibilities: An iPhone operating system update Oct. 21 could have disrupted iCloud settings; the iCloud storage space could have been full; or Farook may never have returned to a location where the automatic backup would have been activated.
The iPhone 5C does not have a Touch ID fingerprint sensor, removing another possible key to the phone.
Getting data off Android phones is often easier.
Google exercises minimal control over smartphones running its open-source Android operating system. Many Android phones support so-called unsigned programs, providing one of the key doors through which law enforcement has been able to extract data from locked phones. Devices such as Cellebrite and software such as EnCase aid the effort.
FBI Director James Comey testifies before the Senate Intelligence Committee hearing on worldwide threats to America and its allies, on Capitol Hill in Washington, D.C., on Feb. 9. (Molly Riley / AFP/Getty Images)
Lawmakers and the FBI are pushing for a permanent backdoor into all cellphones, but cybersecurity experts say backdoors could create problems for U.S. tech companies that do business in places like China and Europe.
Recently, FBI Director James Comey, Atty. Gen. Loretta Lynch and other national security leaders met with representatives from Google, Apple and Facebook in San Jose to try to find common ground that would help investigators gain crucial information about possible terror plots without compromising the privacy of the companies’ customers.
Although the tech industry says it wants to help, it’s reluctant to give away private information and data to government agencies, arguing that doing so fosters user distrust and raises the risk of hacker attacks.
When it comes to doing business abroad, tech companies are being squeezed on both sides: If they don’t give up access to user data, they risk angering governments. But if they are perceived as selling products that aren’t secure, consumers won’t buy them. And that hurts the all-important bottom line.
If Apple is forced to comply, it could bolster recent efforts by countries such as China to curb its citizens’ privacy in the name of national security.
“This completely undermines privacy overseas and if the administration thinks this precedent wouldn’t be used by China, Russia and others then they are in serious error,” said Nicholas Weaver, a senior researcher at the International Computer Science Institute at UC Berkeley.
Should an iPhone belonging to a suspected terrorist from China’s restive Xinjiang region require decryption, Beijing, backed by popular opinion, would be unlikely to give Apple the chance to argue its case as it is doing in the U.S.
Cook is leading the charge, but several tech CEOs are backing him.
Cook’s hard-line stance on privacy could define his legacy at Apple and set the tone for the way big corporations deal with big government at a time when so much of our lives unfold on the devices we use every day. How far Cook is willing to take the fight is being tested on a national level now. (Read the full story: Tim Cook’s stance on privacy could define his Apple legacy)
Google Chief Executive Sundar Pichai and Twitter Chief Executive Jack Dorsey have tweeted supportive responses. Jan Koum, chief executive and co-founder of mobile messaging app WhatsApp, backed Cook in a Facebook post. Facebook itself issued a statement Thursday afternoon that did not mention Apple or Cook by name.
Several other tech companies and their leaders have said nothing, or have talked about encryption and security in more general terms. Yahoo, Microsoft and AOL each signed joint statements made by larger tech company associations, but spokespeople for those companies said Thursday that the firms would not comment beyond those statements. (Read the full story: Apple vs. the FBI: Facebook, Twitter, Google, John McAfee and more are taking sides)
Encryption could become a presidential election issue.
GOP front-runner Donald Trump said he was floored that Apple had not volunteered to aid the FBI.
“Who do they think they are?” he asked on Fox News this week.
Speaking to reporters in South Carolina, Sen. Marco Rubio said he hoped the tech giant would voluntarily comply with the government’s request, but acknowledged the court order is far from a simple issue.
A Pew Research poll last year found 54% of Americans disapprove of the U.S. government’s collection of telephone and Internet data to help fight terrorism.
Times staff writers Brian Bennett, Paresh Dave, Maura Dolan, Victoria Kim, Tracey Lien, Julie Makinen, David Pierson, James Queally, Joel Rubin and Richard Winton contributed to this report.
6:08 p.m.: This article has been updated to add comments from senior Apple executives about iCloud features, how the tool proposed by the FBI would work and their view on Friday’s court filing by prosecutors.
1:50 p.m.: This article has been updated to add information about a court filing Friday by the federal government that requests a ruling compelling Apple to immediately aid the FBI.
1:02 p.m.: This article has been updated to add that experts said swapping the unique identifier of Farook’s device for someone else’s could be simple as long as Apple lends its signature.
This article was originally published at 11:15 a.m.