The fatal crash of a Tesla electric car using an autopilot feature still in beta testing -- and never reviewed by regulators -- highlighted what some say is a gaping pothole on the road to self-driving vehicles: the lack of federal rules.
Automakers do not need to get the technology approved by the National Highway Traffic Safety Administration before rolling it out to the public. They need only attest that their vehicles meet federal safety standards -- but no such standards exist for autonomous driving features.
That enables carmakers to, at their own discretion, roll out features when they deem them to be ready to hit the road. In a lengthy statement Thursday, Tesla acknowledged that its autopilot mode was "new technology and still in a public beta phase," and said that "as more real-world miles accumulate … the probability of injury will keep decreasing."
But critics, lawmakers and safety advocates say carmakers should not be using their customers as guinea pigs. They're questioning whether the companies are moving too fast and say regulators should put the brakes on a nascent technology that might have been rolled out too hastily.
"The Tesla vehicles with autopilots are vehicles waiting for a crash to happen -- and it did in Florida," said Clarence Ditlow, executive director of the Center for Auto Safety. He is calling for the company to issue a recall and disable the autopilot function until NHTSA issues safety guidelines.
"If you don't have a radar system on a car that can tell you there's a truck ahead of you, there's a problem," he said.
On the afternoon of May 7, in clear, dry conditions, a Tesla Model S driven by 40-year-old Joshua Brown slammed into a tractor-trailer attempting to turn in front of it in Williston, Fla.
The accident sheared the roof off the car, which skidded under the truck and off the road, plowing through two wire fences before crashing into a utility pole, the accident report said. Brown, a former Navy SEAL from Canton, Ohio, was pronounced dead at the scene.
Tesla and other automakers say that ultimately, self-driving cars and autonomous features will make driving safer and that eventually cars will be able to drive themselves better than humans can.
As it is now, the cars are able to manage normal driving conditions well, but have trouble in inclement weather or in unfamiliar driving situations, such as when bicyclists zoom by. In the case of the Tesla fatality, the company blamed the white side of the tractor trailer against a brightly lighted sky, which it said confused the car.
NHTSA is investigating the accident, thought to be the auto industry's first fatality involving an autonomous driving feature.
"This tragic incident raises some serious concern about whether current autonomous vehicle technology is ready for use on public roads," said one U.S. senator.
Transportation Secretary Anthony Foxx, who oversees NHTSA, is expected to issue guidelines for autonomous vehicles this month. He and NHTSA chief Mark Rosekind have touted the potential for the technology to reduce traffic fatalities.
Those guidelines were expected to focus on fully self-driving cars, which are different from the Tesla Model S that was involved in the fatal crash; that car was simply equipped with an autopilot mode. But the Florida accident could lead transportation officials to add guidelines for autonomous features such as autopilot, said Karl Brauer, senior analyst at Kelley Blue Book.
"The manufacturer is kind of able to roll it out with whatever limitations and guidance options they want," Brauer said. "We might see a shift to having more aggressive guidelines for those driver-assistance functions as well."
Tesla's autopilot feature -- which must be turned on by the driver -- uses cameras, radar and sensors to steer the vehicle, change lanes and adjust speed based on traffic, the company said.
On Friday, the Palo Alto company said autopilot hardware is included in all its vehicles and that it was "continually advancing the software through robust analysis of hundreds of millions of miles of driving data."
"We do not and will not release any features or capabilities that have not been robustly validated in-house," a spokesperson said.
In owners manuals, corporate filings and other documents, Tesla makes clear that it believes drivers are responsible for maintaining control of their vehicles when using autopilot features.
In the owners manual for the Tesla Model X, for instance, the company notes that it is the driver's responsibility "to stay alert, drive safely and be in control of the vehicle at all times." It goes on to say drivers should "be prepared to take corrective action" and that "failure to do so can result in serious injury or death."
Owners manuals also say drivers should always keep their hands on the steering wheel, even when using autopilot. The question, though, is whether those warnings will provide enough protection to shield Tesla from liability when autopilot causes or fails to avoid collisions.
Greg Keating, a law professor at USC, said they might not.
If a Tesla owner sued, he said, the owner might argue that he or she had a reasonable expectation that the autopilot system would live up to its name, regardless of the warnings in the manual.
Product liability claims could be costly for Tesla, which notes in Securities and Exchange Commission filings that the company -- not an insurer -- would be on the hook to pay any such claims.
"Any product liability claims will have to be paid from company funds, not by insurance," the company notes in the "risk factors" section of its latest annual report.
In the absence of federal guidelines, several states have developed their own, leading to an inconsistent set of rules.
Since 2014, California has required manufacturers testing autonomous vehicles to submit detailed information, including the number of self-driving cars the companies are testing and the drivers who are testing them, Department of Motor Vehicles spokeswoman Jessica Gonzalez said.
So far, 111 autonomous vehicles from 14 manufacturers have been approved for testing on California roads.
Under California regulations, Tesla's autopilot feature is not considered autonomous technology because it does not have the ability to drive without the "active physical control or monitoring" by a human operator, according to the DMV.
Brown, who called his car Tessy, had posted YouTube videos praising the autopilot function. Tesla Chief Executive Elon Musk touted one video on Twitter that was posted a month before the accident. In several of the videos, Brown's hands were not on the wheel when the car was in motion.
"I have always been impressed with the car, but I had not tested the car's side collision avoidance," Brown wrote in comments posted April 5 with the video, which showed the technology saving him from being sideswiped by a truck. "I am VERY impressed. Excellent job Elon!"
The technology has been used in 130 million miles of driving without a fatality, Tesla said.
The driver of the tractor-trailer in the accident, Frank Baressi of Palm Harbor, Fla., told the Associated Press that Brown was "playing 'Harry Potter' on the TV screen" in the car when the crash took place. He acknowledged he couldn't see the movie playing and that he only heard the audio.
Baressi did not respond to a voicemail requesting comment.
Sgt. Kim Montes, a spokeswoman for the Florida Highway Patrol, told The Times that a portable DVD player was found in the Tesla but said she did not know whether it had been in use.
"At the time of the impact, we don't know what the status of that DVD player was. Investigators are looking into it," she said. "We may never know."
In a February conference call, executives touted the autopilot feature, saying early Tesla customers whose leases were about to expire were interested in leasing new models equipped with autopilot.
"Autopilot is certainly one of the core stories of what's going on here at Tesla," said Jon McNeill, the automaker's president of global sales. "It's really exciting."
Times staff writers James Rufus Koren, Samantha Masunaga and Tracey Lien contributed to this report.