An Uber robot car kills a pedestrian in Arizona. Will it slow driverless tests?

An Uber driverless car is parked in a garage in San Francisco in 2016.
(Eric Risberg / Associated Press)

A woman walking across an Arizona road was hit and killed by a self-driving car from Uber Technologies Inc. on Sunday — believed to be the first pedestrian fatality for the fast-growing movement toward driverless vehicles.

Tempe police said Elaine Herzberg, 49, was struck by an Uber test vehicle operating in autonomous mode. In response, Uber halted its driverless operations in San Francisco, Pittsburgh, Toronto and the Phoenix area and said it is assisting in the investigation. “Our hearts go out to the victim’s family,” the company said in a statement.

The death is likely to slow the deployment of driverless vehicles in Arizona, one of the most welcoming states for the industry, and elsewhere as policymakers monitor the investigation and assess public reaction. Driverless vehicle regulations have been passed by the U.S. House of Representatives but are stalled in the Senate over safety concerns.


“This is going to focus a lot of attention on how companies are approaching their systems, their technologies, and their management of tragedies,” said Bryant Walker Smith, a driverless vehicle expert at the University of South Carolina School of Law.

John Simpson of Consumer Watchdog in Santa Monica, a longtime critic of permissive driverless vehicle regulations, called for a national moratorium on robot car testing on public roads until the accident is analyzed.

“Arizona has been the wild west of robot car testing with virtually no regulations in place,” Simpson said in a statement. “That’s why Uber and Waymo test there. When there’s no sheriff in town, people get killed.”

Arizona has few laws restricting the use of driverless vehicles. It doesn’t require remote operators for the vehicles, for instance, and allows driverless trucks on the road.

Arizona Gov. Doug Ducey welcomed Uber to his state with great enthusiasm in late 2016. At the time, California’s Department of Motor Vehicles had just revoked the registrations of Uber’s cars after the company refused to apply for required state permits for its driverless vehicles. Uber loaded its driverless-equipped Volvos onto trailer trucks bound for Arizona.

“Arizona welcomes Uber self-driving cars with open arms and wide-open roads. While California puts the brakes on innovation and change with more bureaucracy and more regulation, Arizona is paving the way for new technology and new businesses,” Ducey said at the time.


On Monday, the governor’s office expressed condolences for Herzberg’s family and said “public safety is our top priority.”

In February, California issued a set of revised regulations that will allow robot cars with no human driver to operate on public roads through a permit system that begins April 2. The regulations also pave the way for driverless ride-hailing companies to begin picking up paying passengers.

In a prepared statement Monday, the California DMV said it “takes the safe operation of our autonomous vehicle permit holders very seriously. We are aware of the Uber crash in Arizona, but we have not been briefed on the details of the crash at this time. We plan to follow up with Uber to get more information.”

As self-driving cars roll out in pilot programs around the world, the chances of a pedestrian death have increased. Experts have wondered what effect deadly crashes would have on the industry.

“We’re within the phase of autonomous vehicles where we’re still learning how good they are. Whenever you release a new technology, there’s a whole bunch of unanticipated situations,” said Arun Sundararajan, a professor at New York University’s business school.

Driverless cars are supposed to eliminate some of the risks of human error, particularly because their sensors are always paying attention to their surroundings. They don’t drink, do drugs, check text messages, get tired or get distracted by misbehaving children. Annual traffic fatalities in the U.S. are fast approaching 40,000, with driver error to blame in more than 90% of them.


But there isn’t enough experience with driverless vehicles yet to know all their weaknesses and whether they’re safer than human drivers and if so by how much. Nidhi Kalra, co-director of the Center for Decision Making Under Uncertainty at the research group Rand Corp., recently told The Times that robot cars have proved less likely to get into minor crashes than human drivers. But “when it comes to injuries and fatalities, we won’t be able to know until we’ve had hundreds of millions or billions of miles” of driving history.

Whether policymakers allow enough robot cars on the road to accumulate that kind of experience depends in part on how Uber and other driverless vehicle companies handle fatalities, Smith said.

“Today there are going to be 100 people who are going to die on the roads in the United States, principally caused by human error,” he said. The general public is “going to connect with the issue through stories, not through hard evidence.”

Uber’s prime ride-hailing competitor in the U.S., Lyft, has said it plans to seek permits to operate robot taxis in San Francisco, although no date has been set. Lyft did not respond to requests for comment on Sunday’s fatality.

Waymo, the driverless vehicle technology arm of Google parent Alphabet, in January received permits to begin offering robot taxi service in Arizona. Waymo has been running driverless vehicle tests in and around Phoenix for months and had planned to begin a full-scale robot taxi service within the next several weeks. Waymo did not respond to a request for comment.

Uber has had minor incidents in the past. A self-driving Uber car ran a red light in San Francisco while the company operated in the city without regulatory approval. The California DMV eventually forced Uber to pull the cars from the road.


The most widely reported fatality involving self-driving technology occurred in 2016, when a man driving a Tesla Model S in Florida was killed after the car’s Autopilot system failed to detect a semi-truck crossing in front of the vehicle.

Tesla’s Autopilot system, however, is not considered fully driverless. After the Florida crash, the company updated Autopilot to shorten the time a driver can keep hands off the steering wheel before being prompted to retake it.

Tesla plans to issue software to enable driverless operation later this year. Tesla did not respond to a request for comment.

The National Transportation Safety Board is opening an investigation into the Uber death and is sending a small team of investigators to Tempe, spokesman Eric Weiss said.

Bloomberg and Los Angeles Times staff writer Lauren Raab contributed to this report.



3:20 p.m.: This article was updated with reaction and analysis.

10:25 a.m.: This article was updated throughout with additional details.

This article was originally published at 10:15 a.m.