A Florida man sued Tesla Inc. on Tuesday, saying his car's semiautonomous Autopilot feature failed to detect a disabled vehicle on a highway, leading to a crash that left him with permanent injuries.
Shawn Hudson said in the lawsuit filed in state court in Orlando, Fla., that Tesla made false statements about the safety of the Autopilot feature on his Tesla Model S. It’s the second such lawsuit in as many months: A Utah driver filed a similar complaint last month.
Hudson and his attorney told reporters that Tesla lulls drivers into a false sense of security, causing them to believe the cars can drive themselves when the Autopilot function is used. But when there is a danger on the road, they said, drivers have no time to react.
The Palo Alto automaker has repeatedly called Autopilot an “assist feature.” It has said that while using Autopilot, drivers must keep their hands on the wheel at all times and be prepared to take over if necessary.
According to attorney Mike Morgan, the company’s message is: “We told you, we’re going to drive you. Don’t worry about the road, watch it, but we’re also going to put this giant 20-inch screen right here with web-browsing capabilities so you can be distracted the entire time, but if you crash, that’s your fault.”
Hudson said he suffers pain from fractured vertebrae and has some cognitive problems since the crash two weeks ago on the Florida Turnpike.
Hudson, who lives in Orlando and has a two-hour commute to Fort Pierce for his job as the general manager of a Nissan dealership, said the Autopilot feature appealed to him because he could get some work done during his commute. Hudson had his hands on the wheel as the car traveled 80 mph, but he also was looking at his phone in the moments before his Model S slammed into an unoccupied Ford Fiesta, he said.
“I was looking up, looking down, looking up, looking down, and I look up and the car is disabled in the passing lane,” Hudson said. “When you’re traveling that fast, it’s like hitting a wall.”
A Tesla spokeswoman said in an email that there’s no reason to believe the Autopilot feature malfunctioned and that drivers should always maintain control of the vehicle when using Autopilot. The spokeswoman, who didn’t want her name used, said the car was incapable of transmitting log data to Tesla, which prevented the company from reviewing what happened in the accident.
“Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents, and Tesla goes to great lengths to provide clear instructions about what Autopilot is and is not,” the spokeswoman said.
Hudson’s attorneys said there is a disconnect between the official company policy and what salespeople tell customers in showrooms.
“What they say to federal regulators is very different from what you hear on the Tesla lot,” Morgan said. “It’s very different from what Mr. Hudson was told when he was buying his car.”
Tuesday’s lawsuit puts Tesla’s driver-assistance system back in the headlines for legal reasons, rather than for the development that Chief Executive Elon Musk would prefer to see get the attention: a major Autopilot software update.
The company announced last week what it called “our most advanced Autopilot feature ever,” dubbed Navigate on Autopilot, which guides Tesla cars from highway on-ramps to off-ramps.
Scrutiny of Tesla Autopilot was at a fever pitch earlier this year when multiple crashes involving the system — including that of a Model X SUV whose driver died after the vehicle hit a highway barrier in March — drew investigations by U.S. safety agencies.
Bloomberg was used in compiling this report.
This article was originally published at 10:50 a.m.