
Uber details why its driverless SUV killed a pedestrian and how it’s working to fix safety problems

The Washington Post

More than seven months after a self-driving Uber SUV killed a pedestrian in Arizona, the company has released safety reports that detail broad technological and management failings and describe efforts since the tragedy to address them.

The findings released Friday reveal shortcomings at the core of Uber’s driverless technology itself, which relies on cameras and sensors to take in the environment and software to process that information and make all the decisions — big and small — needed to drive safely.

A key internal recommendation cited the need for “improving the overall software system design,” which is akin to saying Uber’s robot car needed a better brain with sharper thinking.


In practice, that means that since the fatal crash in Tempe, Ariz., in March, company engineers have worked on “reducing latency,” or the delay between when an initial observation is made and when an action is taken in response, Uber said. “We are now able to detect objects and actors sooner and execute safe reactions faster,” Uber said.

The driverless system also more quickly obtains “accurate velocity measurements for actors moving suddenly or erratically,” it said.

Uber expressed contrition and an eagerness to work with others in the industry to improve safety across the board.

“The competitive pressure to build and market self-driving technology may lead developers to stay silent on remaining development challenges,” Uber Chief Executive Dara Khosrowshahi wrote as part of a new safety assessment released Friday. He said that Uber wants to join competitors in finding ways to “measure and demonstrate” driverless performance and that he hopes to encourage “a culture of transparency rooted in safety.”

Whether Uber can transform itself into a safety leader is far from clear.

“It would be fantastic if they did that. But they’d have to do it seriously,” said Joan Claybrook, who was head of the National Highway Traffic Safety Administration under President Carter. “From when they started, safety was never No. 1 on their list. It was to get the vehicles on the road.”

There are no federal safety standards for driverless vehicles. Claybrook says driverless testing should not be done on public roads; Uber and other major firms say that risk is necessary to provide the diverse, real-world conditions needed to train expert computer drivers and, ultimately, make roads safer.


“I don’t think the public should be guinea pigs,” Claybrook said.

In Arizona, Uber was pushing the limits as it scrambled to catch up with self-driving firms such as Alphabet Inc.’s Waymo, which emerged from Google’s nearly decade-old self-driving car project.

Federal investigators say Uber employees intentionally disabled the automatic-braking features on their specially outfitted Volvo XC90 so that it wouldn’t slow down erratically during the testing in Tempe. The car detected pedestrian Elaine Herzberg six seconds before hitting her but misidentified her as an unknown object, a vehicle and then a bike, according to the National Transportation Safety Board. It also misjudged where she would go.

Uber’s backup safety driver had been streaming “The Voice” on her phone and didn’t start braking until after Herzberg, 49, was struck, the NTSB said.

The San Francisco company pulled its driverless cars from public roads and said they would not return until internal and external safety reviews were completed and the company made necessary improvements.

Investigators say Herzberg was pushing a bike across a darkened boulevard, outside a crosswalk, when Uber’s Volvo hit her. Onboard video shows Herzberg looking back toward the Volvo only a moment before it reached her. Uber reached an undisclosed financial settlement with Herzberg’s family.

Noah Zych, Uber’s head of system safety, said the company is “raising the bar” for what its self-driving cars must prove before returning to public roads. To do so, he said, Uber is putting them up against extremely challenging scenarios on test tracks.


“What happens if a vehicle pulls out in front of us at the very last minute? Or a bicycle runs a stop sign? Or a person comes out from behind a parked car?” Zych said.

“A lot of human drivers, I think, would struggle with consistently passing those tests. We’re working to make sure our system passes those tests as well,” he said.

Uber is also completing fixes that will allow the automatic emergency braking system on its cars to be used during driverless testing, Zych said. That reverses its approach in Tempe.

Zych said the goal is to earn trust by being transparent and making the safety improvements the company promises.

“But we recognize that just saying that isn’t going to necessarily be compelling,” he said. “Public sentiment and trust is also something that doesn’t come back, or come at all, overnight.”

Overall, Uber said it has incorporated a “new approach to handling uncertainty,” such as whether a vehicle will yield the right of way, “enabling the system to reason over many possible outcomes to ultimately come to a safe response.”


There’s also improved “object and actor detection” for ambiguous situations in which visibility is low or views are blocked.

But the company says it still has a long way to go.

As for the management and oversight problems that allowed Uber to deploy flawed or not-fully-formed technology on public roads, the company says it has made progress.

It has upped the “cadence of executive-level reviews,” meaning senior leaders from Khosrowshahi on down are paying closer attention to self-driving safety, according to the company. Uber executives say that safety is now their core value, and that they have established an independent safety team “to provide appropriate checks and balances.”

Uber said it has created an anonymous reporting system so employees can raise safety concerns, and that workers have already been using it.

It also laid off hundreds of backup drivers who were supposed to be the human safety net protecting the public from risks posed by the developing technology. They have been replaced by “mission specialists” who receive more rigorous training and are also subject to a “third-party driver monitoring system,” the company said.

But creating a deeper “safety culture” in an organization is time-consuming, according to an external inquiry commissioned by Uber and led by the law firm LeClairRyan and former NTSB Chairman Christopher Hart.


“A safety culture is not something that springs up ready-made from the organizational equivalent of a near-death experience,” the report concluded, quoting a prominent British expert in human error, James Reason. Instead, it emerges gradually from “practical and down-to-earth measures” and from “a process of collective learning.”

Uber has filed an application with the Pennsylvania Department of Transportation to resume testing its driverless vehicles in Pittsburgh.

Laris writes for The Washington Post. Material from the Associated Press was used in compiling this report.
