Police late Wednesday released a video that shows an Uber robot car running straight into a woman who was walking her bicycle across a highway in Tempe, Ariz. The woman was taken to a hospital, where she died Sunday night.
The video, shot from the car, is sure to raise debate over who’s to blame for the accident.
In the video, the victim, Elaine Herzberg, 49, appears to be jaywalking from a median strip across two lanes of traffic on a dark road. But she was more than halfway across the street when the car — traveling about 40 mph, according to police — hit her. The car did not appear to brake or take any other evasive action.
Meanwhile, the Uber employee sitting behind the wheel of the self-driving test car is seen gazing down at his lap before looking up in shock around the time of impact.
Experts say any of several pieces of the driverless system may have failed: the lidar and radar “eyes,” the logic system that is supposed to identify road objects, the communication channels that are supposed to trigger the brakes, or the car’s automatic braking system itself.
Driverless car experts from law and academia called on Uber to release technical details of the accident so objective researchers can help figure out what went wrong and relay their findings to other driverless system makers and to the public.
It’s “important ... that we all learn from this accident and we make these technologies even better. To that end Uber must release all of the data leading up to this crash,” said Alain Kornhauser, who heads the autonomous driving program at Princeton University.
Bryant Walker Smith, a law professor and driverless specialist at the University of South Carolina, said: “Although this appalling video isn't the full picture, it strongly suggests a failure by Uber's automated driving system and a lack of due care by Uber's driver as well as by the victim.”
He noted that Waymo and General Motors have joined the federal government’s voluntary safety self-assessment program for driverless cars, but Uber has not.
An Uber spokesperson declined to say whether Uber will release any crash data to the public. Uber, which also tests autonomous semi-trucks in Arizona, has temporarily halted all driverless testing.
In a statement, Uber said: “The video is disturbing and heartbreaking to watch, and our thoughts continue to be with Elaine’s loved ones. Our cars remain grounded, and we're assisting local, state and federal authorities in any way we can.”
On Monday, the San Francisco Chronicle quoted Tempe Police Chief Sylvia Moir as saying: “It’s very clear it would have been difficult to avoid this collision in any kind of mode [autonomous or human-driven] based on how she came from the shadows right into the roadway…. I suspect preliminarily it appears that the Uber would likely not be at fault in this accident, either.”
The Times asked a Tempe police spokesman whether the chief still suspects Uber is not at fault. The department has yet to respond.
Arizona Gov. Doug Ducey has made driverless car deployment a key pillar of the state’s economic development efforts.
The crash is being investigated by the National Highway Traffic Safety Administration and the National Transportation Safety Board.
“What we now need is for the release of the radar and lidar data,” Princeton’s Kornhauser said in an email. (Lidar is a sensing technology that uses light from a laser.) “Obviously, the video of the driver is extremely bad for Uber and probably implies that Uber should suspend all of its ‘self-driving’ efforts for a while if not for a very long while.
“The ‘self-driving’ systems are supposed to have ‘professional’ overseers who are really supposed to be paying attention during these ‘tests’. Apparently Uber didn’t make it clear in this case.”
Kornhauser questioned the police chief’s description of the collision as one that would have been difficult to avoid. He said Uber should reveal what its collision-avoidance software was doing during the couple of seconds before impact.
“The front-facing video suggests that this person was crossing the lane at a slow speed and should have been noticed by the system in time to at least apply the brakes, if not stop the vehicle completely,” he said. “While a human may not have been able to avoid this crash, a well-designed, well-working collision avoidance system should have at least begun to apply the brakes.”