Federal investigators looking into Tesla Motors Inc.'s autopilot system after a fatal crash in Florida are zeroing in on the limitations of the feature and how it reacts when obstacles cross its path.
The National Highway Traffic Safety Administration on Tuesday posted a nine-page letter seeking information from the electric car maker about autopilot — which Tesla has said is an “assist feature,” not a substitute for an alert driver — and about why the feature failed to detect a tractor-trailer that crossed in front of a Model S sedan May 7 in Florida.
The crash in Williston, Fla., killed the Tesla car’s driver, former Navy SEAL Joshua Brown, 40, of Canton, Ohio. Tesla, which collects data from its cars wirelessly, says the cameras on Brown’s Model S sedan failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky, and the car didn’t automatically brake.
The safety agency also asked Tesla for its reconstruction of the Brown crash, and for details of all known crashes, consumer complaints and lawsuits filed or settled because the autopilot system didn’t brake as expected.
NHTSA said Tesla must comply with its request by Aug. 26 or face penalties of up to $21,000 per day, to a maximum of $105 million.
Agency spokesman Bryan Thomas said NHTSA hasn’t determined whether a safety defect exists with autopilot, and that the information request is a routine step in an investigation.
Tesla released autopilot last fall. Some safety advocates have questioned whether the company — which says the system is still in “beta” phase, a computer industry term for software testing by customers — and NHTSA allowed the public access to the system too soon.
“No safety-significant system should ever use consumers as test drivers on the highways,” said Clarence Ditlow, head of the nonprofit Center for Automotive Safety. He said NHTSA lacks the electronic engineers and laboratories needed to keep up with advanced technology.
Tesla says that autopilot has been safely used in more than 100 million miles of driving by customers and that data show drivers who use autopilot are safer than those who don’t.
The investigation, opened June 28, could have broad implications for the auto industry and its path toward self-driving cars. If the probe finds defects with Tesla’s system, the agency could seek a recall. Other automakers have developed or are developing similar systems that may need to be changed because of the probe, which also could affect self-driving car regulations to be unveiled this summer.
In the letter, NHTSA also asked Tesla for details of any modifications it has made to the autopilot system.
4:44 p.m.: This article was updated throughout with additional information.
This article was originally published at 9:40 a.m.