Apple lawyer Ted Olson said Sunday that the tech heavyweight has good reason not to help federal investigators hack an iPhone that belonged to one of the San Bernardino shooters. The company, he said, "has to draw the line at re-creating code" and "changing" its product.
To which there's a two-word response: seat belts.
And here are two more: air bags.
In both those cases, the auto industry said federal officials had no right to make them tinker with their products. And in both cases, the feds prevailed, arguing that public safety is a more important consideration than automakers' independence.
There are many other examples of government officials telling businesses to fiddle with their products in the name of safeguarding the public — baby cribs, toys, gasoline, mattresses, prescription drugs, aircraft and so on.
"Apple is arguing that the cellphone is a private space and that the user's privacy would be infringed," said Timothy Lytton, a law professor at Georgia State University who specializes in regulation of consumer products. "The government is saying, no, it's something you bring out into the world."
That's the crux of the dispute: Is an iPhone or any other mobile device representative of your most intimate behavior, and thus protected by privacy laws, or is it a ubiquitous consumer product subject to the same oversight as other goods found in everyday life?
At issue isn't what Apple has misleadingly dubbed a "master key" to the world's iPhones. Rather, it's a security feature that basically makes the device useless if too many attempts are made to unlock it.
"We simply want the chance, with a search warrant, to try to guess the terrorist's pass code without the phone essentially self-destructing and without it taking a decade to guess correctly," FBI Director James Comey said in a statement Sunday.
"Maybe the phone holds the clue to finding more terrorists," he said. "Maybe it doesn't. But we can't look the survivors in the eye, or ourselves in the mirror, if we don't follow this lead."
Comey is trying to stake out the moral high ground. Then again, so is Apple Chief Executive Tim Cook.
"We feel we must speak up in the face of what we see as an overreach by the U.S. government," he wrote in response to a court order requiring that his company play ball.
Overreach, however, is a tricky thing to define.
The government can't seek access to a terrorist's iPhone but it can interfere with your personal freedom and make you wear a helmet when you ride a motorcycle? It can't require Apple to write some new code but it can force cigarette companies to write on packages that their product may kill you?
Privacy is a big deal. And if that were the sole issue here, I don't think anyone would say the government should have free rein to root around in your gadgets.
"The government is asking for a modification of a product that implicates an important right," Lytton said. "Your phone is a private sphere of substance, just like your bedroom."
However, the government isn't talking about getting into everyone's cellphones or everyone's bedroom. It wants access to a single phone.
"The question is whether that one phone represents a substantial hazard to the public," said Bill Kitzes, a product-safety expert and former program manager with the Consumer Product Safety Commission. "You could argue that it does."
Privacy only becomes an issue, he said, if you don't trust the government or Apple to keep the potential backdoor created for this single iPhone under wraps.
"If that software got out and anybody who had it could get into anyone's phone, that could be a real problem," Kitzes said.
The government said in a court filing Friday that its demand for Apple to help unlock the shooter's phone "does not give the government 'the power to reach into anyone's device' without a warrant or court authorization," nor does it "compromise the security of personal information."
It said the software the FBI wants Apple to write would remain under the company's control. "No one outside Apple would have access to the software required by the order unless Apple itself chose to share it," the government said.
Can we trust Apple? Can we trust the government? I suspect many people would say no and no.
Distrust of the government is nothing new, and is arguably well deserved (yes, looking at you, NSA).
As for Apple, it's worth noting that the company has a long track record of ratting out customers to the feds. In the first half of last year, according to the news site Quartz, Apple received 971 government requests for user data. It complied with a hefty 81% of those requests.
I don't buy that Apple has seen the light on users' privacy. I think the company is more concerned that if it caves here, what will happen when any other government — China's, say — also demands security-related product changes?
That's a reasonable worry. And it's the company's most convincing argument in favor of why the feds should back off.
Apple is on less sure ground when it argues, as Olson did, that the government can't make it "re-create code" or "change the iPhone."
It can, when public safety is a factor. Again, seat belts.
We need to protect what little privacy we have left. But in this case, and this case alone, I think the government is right.
Hack the damn phone, Apple.