
Column: Robotics-law expert Ryan Calo weighs in on drone regulations -- and ‘drunk droning’

Ryan Calo, now a law professor at the University of Washington, stands with a robot at Stanford University in 2009. He specializes in the laws governing robotics.
(Paul Sakuma / Associated Press)

Hard cases, said a long-ago Supreme Court justice, make bad law. The startling outliers shouldn’t be the yardstick for crafting routine criminal law. When a tipsy off-duty employee of the National Geospatial-Intelligence Agency lost control of his friend’s drone last month and smashed it onto the White House lawn, the cry went up for more drone regulation. But the incident was an oddity; the real legal questions about drone regulation have to do with privacy, policing, commerce and other uses. Ryan Calo, a law professor at the University of Washington, specializes in robotics law. The White House drone flew right onto his radar.

The White House crash had even the president asking if we were doing enough to regulate drones.

You can’t fly a drone around D.C. — it’s unlawful — so I’ve been confused by calls for additional legislation. What’s been interesting is the response of the company that manufactured the drone. It’s fair to say it overreacted by issuing a firmware update that prevents the drone from flying in the region.
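The firmware “lockout” Calo describes is essentially a geofence: a coordinate check the drone runs before arming its motors. Here is a minimal Python sketch of the idea; the center point and radius are illustrative stand-ins loosely approximating the Washington, D.C. flight restricted zone, not the manufacturer’s actual parameters.

```python
import math

# Illustrative geofence check, sketching how a firmware no-fly lockout
# might work. The center and radius below are assumptions for
# illustration (roughly the D.C. restricted zone centered near Reagan
# National Airport), not any manufacturer's real data.

DC_CENTER = (38.8521, -77.0377)   # approx. lat/lon of Reagan National (DCA)
NO_FLY_RADIUS_KM = 15 * 1.852     # ~15 nautical miles, in kilometers
EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def takeoff_permitted(lat, lon):
    """Return False if the GPS fix falls inside the no-fly circle."""
    return haversine_km(lat, lon, *DC_CENTER) > NO_FLY_RADIUS_KM

# The White House sits well inside the zone, so a geofenced drone
# would refuse to arm its motors there.
print(takeoff_permitted(38.8977, -77.0365))   # White House -> False
print(takeoff_permitted(47.6062, -122.3321))  # Seattle -> True
```

Real products ship far more elaborate zone databases and altitude rules; the point of the sketch is only that the restriction lives in software the owner does not control, which is exactly the precedent Calo objects to.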


Why is it overreacting?

We shouldn’t impose heavy restrictions on what amount to toys, let alone require their firmware to restrict flight. Basically, when you lock down drones for one purpose, you set a bad precedent of taking control away from innovators and owners. I think that the FAA should be more permissive about the commercial use of drones. They want commercial drone operators to hire a professional pilot. That seems like overkill.

In the past, how did the law cope with new technology, like cars, that one person controlled but that affected others?


With cars, a lot of early case law involved people scaring horses, because a new technology has unintended consequences. The law will strike one balance, and then we’ll get comfortable [with the new technology], and the law will strike a different balance.

The three big challenges for robotics law are, one, that software can suddenly touch you; it’s not just your computer losing your homework but [doing someone] physical harm. Two, that these things will behave in ways that are useful but surprising. Three, that there’s a social valence, where we react more viscerally to such technology, and the law has to take that into account.

Technology always seems to outstrip our ability to legislate its consequences.

The pace of change is faster than the pace of legislative or judicial change. These are difficult things for legislators to predict. People put restrictions on Segways, but it ended up that the Segway wasn’t a big deal. The Electronic Communications Privacy Act passed in 1986, based on technology in which storage of digital stuff was very expensive and systems would routinely purge information to make room for new information. Today we store everything indefinitely; this law has been outdated for 10 years.

You regard drones as a catalyst for privacy laws. Where are the boundaries? If I saw one hovering at my window, I’d be inclined to take a baseball bat to it.


Tort law is going to look at you taking a baseball bat to a drone as two different claims: your claim against the drone operator for trespass, and the operator’s claim against you for trespass to chattels. So you’re going to end up potentially suing each other. If the drone were to fly at your face, you’d be excused under the defense-of-self doctrine. But you can’t just take a bat to something.

Privacy law is not as protective as it should be. It’s hard to sue people for spying on you in your back yard. It’s hard to get police to exclude evidence from a drone or aerial photography. And it’s not exclusively about drones; drones are a good example but not the only example.

Aren’t crime and terrorism the big drone fears?

Almost every credible threat you could accomplish with a drone, you could accomplish with a football. There are certain things drones make easier. If the concern is about an explosive too close to the White House, walking up to the gates and throwing a football full of explosives is one way to accomplish that; another is with a drone. Why ban drones but not balls? If the scenario is as horrific as you’re imagining, those people will not be dissuaded by additional laws.

Arguments about gun regulation often go this way: “Why not regulate butter knives? You can kill with those too.” Butter knives are made to spread butter; guns are made to put holes in things. Likewise, balls are made for sports, drones to get into places where humans cannot and maybe should not go.

I take your point. It’s harder to characterize drones as an entirely innocuous tool, the way you would a butter knife, but I don’t think they go so far as being like a gun. Certain activities are made easier by technology, in some cases so easy that we’re made uncomfortable.


[But] artificial intelligence and robotics are interesting precisely because of what they allow people to do. That’s what makes them empowering. Let’s say you want the best surgeon; that surgeon could be in Tokyo. You could get surgery in L.A. from that surgeon through robotics.

What is the answer? New laws? Industry self-regulation?

I think it’s very unwise to disallow a whole platform from doing certain things. You are curtailing the prospects of innovation.

And it’s a slippery slope. If there’s a protest, and police have the ability to affect your drone at a distance, all of a sudden you can’t exercise your 1st Amendment rights to monitor the police. This sort of thing has happened. [San Francisco’s rapid transit system shut down cellphone service on its train platforms to head off a potential protest in 2011.]

A better way to manage this is to throw the book at people who endanger others with their drones. A Pepperdine colleague, Gregory McNeal, says we have these two-ton machines that can veer off and run into each other — and we protect against that with a line of paint.

Driverless cars are another emerging robotic technology; California is still trying to come up with standards for them too.

I worry, and I’m not alone, that there isn’t the expertise in government to deal with robotics, whether it’s drones or driverless cars.


The legal liability issues around driverless cars are going to be pretty easy. If you build a product to get from point A to point B safely, and it doesn’t do that, you’re going to be liable.

It becomes tricky when you have an app-enabled platform, like your smartphone. Imagine you run a third-party app, and that app ends up doing something problematic. It’s no longer the manufacturer or the operator that is responsible. It’s [app] code that literally anybody in their basement, anybody in Russia could have written.

The optics suggest you should sue the manufacturer because its robot ran into you and hurt you. But [the injury] is in part a product of a third-party app. I have proposed immunizing manufacturers of open robotic systems for the apps people run, while also having them be more careful in selecting which apps to sell.

What was your first thought when you heard about the “drunk droning” incident at the White House?

Thanks a lot, one person, for sending us in the wrong direction. We’re now talking about locking down platforms and extra regulation just because of one guy.

On Twitter, some of us had this hashtag going, #drunkdronesongs. Remember that Jimmy Buffett lyric, “Why don’t we get drunk and screw?” My first one was, “Why Don’t We Get Drunk and Screw Up Drone Policy in the United States?”


This interview has been edited and condensed.

patt.morrison@latimes.com

Twitter: @pattmlatimes

Follow the Opinion section on Twitter @latimesopinion and Facebook
