
U.S. ends investigation of fatal Tesla crash and finds ‘no safety defects’ in car’s Autopilot


Drivers need to pay attention while driving, even as technology gradually takes over the task.

That’s the message U.S. safety regulators delivered Thursday after closing an investigation into a fatal Tesla crash in Florida last year involving the vehicle’s Autopilot system. The National Highway Traffic Safety Administration concluded that the driver-assist software in the Tesla Model S had no safety defects and declined to issue a recall.

The agency also studied less-serious Tesla crashes and said it didn’t find “any incidents in which the systems did not perform as designed.”


But while completely self-driving vehicles may be on the way, automatic braking, adaptive cruise control and other driver-assist technologies now on the market still require “continual and full attention of a driver,” NHTSA spokesman Bryan Thomas said in a conference call.

NHTSA said Tesla “fully cooperated” with the investigation and its requests for data. Tesla, in a prepared statement, said: “The safety of our customers comes first, and we appreciate the thoroughness of NHTSA’s report and its conclusion.”

Hod Lipson, professor of mechanical engineering at Columbia University, said NHTSA’s findings were “a vindication not only of Tesla, but of the entire self-driving car industry.”


“Yes, driverless cars are going to have accidents. But they’re going to have fewer accidents than humans,” he said. “And unlike humans, driverless cars are going to keep getting better, halving the number of accidents per mile every so many months. The sooner we get on that exponential trajectory, the better.”

The Florida crash in May drew worldwide attention. The Model S electric sedan, with its Autopilot engaged, drove under the trailer of a big-rig truck making a left-hand turn across a highway, killing the car’s driver, 40-year-old Joshua Brown of Canton, Ohio. The truck driver told police he heard a Harry Potter movie playing in the crushed automobile after the crash.

The incident led critics to question whether automated driving technology is ready for highway deployment. However, it remains the only fatality involving Tesla’s Autopilot to date, and Tesla Chief Executive Elon Musk has repeatedly insisted that there are fewer crashes per mile driven on Autopilot than per mile driven by a human.


The federal agency also investigated a Pennsylvania crash involving Autopilot that caused injuries, as well as “dozens” of other Tesla crashes, and found no system defects.

The agency pored over data on Tesla crashes in which air bags were deployed while Autopilot was engaged. Many of the crashes, NHTSA said, involved “driver behavior factors,” including distraction, driving too fast for conditions and “mode confusion” — uncertainty over which driving tasks the car is handling and which remain with the driver.

The agency said Tesla owner’s manuals and on-screen instructions make clear that the human driver alone is responsible for driving the car.

But, the agency said, manufacturers need to pay attention to how drivers actually use the technology, not just how they’re supposed to use it, and to design vehicles “with the inattentive driver in mind.” Companies also need to do a better job of educating drivers about system limitations, for example through training sessions at dealerships. “It’s not enough to just put it in the owner’s manual,” Thomas said.

For example, not all Tesla drivers may be aware that the company’s automated braking system is intended to help avoid rear-end collisions, not collisions with trucks crossing a highway.

The closure of the investigation without a recall “helps clarify that cars are still supposed to be driven by attentive people, and if people behind the wheel aren’t attentive, it’s not the technology’s fault,” said Karl Brauer, auto analyst at Kelley Blue Book. That will help avoid the stigma that the technology causes accidents, he said.


However, one group criticized the agency’s findings. Consumer Watchdog, based in Santa Monica, said that “NHTSA has wrongly accepted Tesla’s line and blamed the human, rather than the ‘Autopilot’ technology and Tesla’s aggressive marketing.”

Officials at NHTSA believe that automated driver-assist technologies and ultimately self-driving cars will lead to fewer crashes and traffic fatalities. The agency’s report noted a 40% drop in crash rates after an automatic steering feature was added to Tesla cars.

Bryant Walker Smith, an autonomous vehicle law expert at the University of South Carolina, said “the decision shows that data matter, and that those data are increasingly available. I’d expect companies to be increasingly attentive to the data they collect and how they present those data publicly and to NHTSA.”

He added, though, that the investigation is hardly the final word in determining fault when humans and robots share driving duties. The investigation “would have looked different if the vehicle in question had [an] automated driving system that promised to replace the human driver rather than … a system that merely promises to supplement that driver.”

The Associated Press contributed to this report.



UPDATES:

3:10 p.m.: This article was updated with additional quotes and background.

1 p.m.: This article was updated throughout with Times staff reporting.

10:25 a.m.: This article was updated with confirmation from NHTSA and with comments from an NHTSA spokesman and from Tesla.

9:50 a.m.: This article was updated with additional background information.

This article was originally published at 9 a.m.
