Federal regulators opened a preliminary probe into the autopilot feature on a Tesla Model S electric car after a fatal crash involving the technology, Tesla said Thursday.
The fatality – thought to be the first in the auto industry related to an autopilot feature – sparked questions about the limitations of the technology and its place in what is seen as an inevitable march toward self-driving vehicles. It followed other recent incidents in which drivers reported collisions while using such technology.
The male driver died in a May 7 crash in Williston, Fla., when a big rig made a left turn in front of his Tesla.
In a blog post, Tesla Motors Inc. said the 2015 car passed under the trailer, with the bottom of the trailer hitting the Model S’ windshield.
“Neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied,” Tesla said.
“Autopilot is getting better all the time but it is not perfect and still requires the driver to remain alert,” the Palo Alto-based company said in the post.
Tesla noted that the autopilot technology comes disabled and requires “explicit acknowledgement” from the driver that the system is still in a public beta phase before it can be activated.
When autopilot is activated, “the car reminds the driver to ‘Always keep your hands on the wheel. Be prepared to take over at any time.’”
Analysts generally agreed that the Tesla fatality would be more a wake-up call to motorists that autopilot features are fallible than a major hit to Tesla’s brand reputation.
“We do not yet have fully autonomous cars,” said Karl Brauer, a senior analyst at Kelley Blue Book. “It might be this tragic event starts a kind of movement of educating consumers.”
The National Highway Traffic Safety Administration said it would evaluate “the design and performance of automated driving systems in the Tesla Model S” along with investigating the vehicle and the crash scene.
The agency said it also would “gather additional data regarding this incident and other information regarding the automated driving systems.”
Scott Galloway, founder of the brand research firm L2, said the momentum of self-driving technology likely will continue despite the tragedy.
“Unless statistics show this is not only as dangerous but more dangerous than traditional modes of driving, I don’t think you’re going to see a slowdown” in using the technology, he said.
George Peterson, president of the research firm AutoPacific Inc. in Tustin, said the failure of the Model S’ sensors to pick up the white truck on a bright day is similar to the problems autonomous vehicles face in snow, when it can be difficult to detect the center line of the road or lane markers.
“Those are technology issues that the manufacturers are trying to sort out right now,” Peterson said. “They’re making a lot of progress really quickly, but it’s not a foolproof system, as this shows.”
He said adding beta options in cars such as Tesla’s autopilot feature is “pretty unusual” but that Tesla adds disclaimers before use. Peterson also said that he didn’t think the crash would deter Tesla customers.
“Tesla seems to be a Teflon company,” he said. “Even though Tesla doesn’t sell a tremendous amount of cars, their stock price is hugely high, and there’s a huge group of Tesla fans out there. I don’t think it’ll put them off – I think it’ll make them more cautious when they’re using autopilot.”
Tesla’s stock price, which closed at $212.28 on Thursday, fell 2.5% in after-hours trading.
The Model S is Tesla’s mainstay vehicle so far, and the sedan accounted for most of the 50,580 vehicles Tesla delivered last year.
Rob Enderle, president and principal analyst at technology strategy firm Enderle Group, said the fatality was “a reminder that cars on the road haven’t been deployed as self-driving cars, and people shouldn’t be driving them hands-off at freeway speeds on the open road.”
“Autopilot as it’s positioned is the next generation of cruise control where the driver is supposed to stay engaged, where you can’t depend on this to drive the car for you,” he said. “This could’ve been easily driver error. This isn’t a self-driving car.”
One motorist, Alex Roy, said he and two friends drove a Tesla Model S from Los Angeles to New York in 2015 and spent 96% of the drive on autopilot. Roy, a New York City editor at large at TheDrive, a car website, said the only time he felt unsafe was when he took his hands off the wheel for too long. He blamed himself for that.
“I found myself getting overconfident in the system,” said Roy, adding that Tesla makes it clear that the car cannot operate completely autonomously but that it’s tempting to let go of the wheel because the system generally works so well.
“I took my hands off the wheel [at one point] and thought the car was still on autopilot and it wasn’t, and I almost hit something,” he said.
The need to maintain control became clear to Aaron Souppouris when he test-drove a Model S in April. Souppouris, a senior editor at the blog Engadget, said Tesla loaned him the car for an article about the autopilot feature and he drove it about 500 miles around England.
There were times at night, he said, when the car drifted back and forth within a lane and seemed “skittish.”
Once, while on autopilot, the car tried to change lanes but then abruptly reverted, and another time it disengaged autopilot in the middle of a lane change, Souppouris said. The car performed better during the day than at night, he said.
Times staff writers Natalie Kitroeff and Paresh Dave contributed to this report.
3:40 p.m.: This article was updated with comments from George Peterson, Alex Roy and Aaron Souppouris.
2:55 p.m.: This article was updated with additional details.
This article was originally published at 2:04 p.m.