Editorial: It’s time to tap the brakes on self-driving cars

Carmakers and tech companies are in a race to put autonomous vehicles on the road, and it’s time for regulators to tap the brakes.

This month the National Highway Traffic Safety Administration revealed that it is investigating two crashes involving Tesla vehicles allegedly operating on autopilot. Tesla’s autopilot feature is a semi-autonomous system that uses cameras, radar and sensors to steer the car, change lanes, adjust speed and even find a parking space and parallel park. It’s not supposed to turn a Tesla sedan into a self-driving car, but there’s ample evidence on YouTube of people driving with their hands off the steering wheel, playing games and even climbing into the back seat while their car is hurtling down a freeway.

In May, a driver died in Florida when his Tesla Model S sedan on autopilot slammed into a tractor-trailer that had turned across the road in front of him. The collision sheared the top off the car, which didn't stop driving until it had plowed through two wire fences and hit a utility pole. Tesla said that neither the autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, and so the brakes were never applied. The second accident happened when a Tesla sport utility vehicle hit a guardrail on the Pennsylvania Turnpike, crossed traffic and rolled over. The driver told state police that the vehicle was in autopilot mode; the cause is still under investigation.

Although Tesla has been far more aggressive than its rivals in making cutting-edge driverless technology readily available to its customers, other automakers aren’t far behind in rolling out advanced “driver assist” systems. Yet there are still no rules governing the use of this sort of technology — whether partially autonomous, like autopilot, or fully self-driving, like Google’s steering-wheel-less prototype. And at this point, there are no standardized tests the cars are required to pass before regular folks take them on the road. Who gets to decide when an autonomous car is ready for the public? Current policies let the car’s manufacturer make that call, restrained only by the fear of liability.

Regulators must intervene. The technology is already being deployed, and it’s time to set standards for when an autonomous-driving feature has been tested enough and is considered safe enough for widespread use. Public roads shouldn’t be uncontrolled laboratories for vehicle safety experiments.

But this is no easy job. There is immense pressure from driverless-car supporters and safety advocates to get more autonomous technology on the road as soon as possible because, ultimately, self-driving cars will probably be much safer than cars driven by erratic, distracted humans. (More than 90% of crashes are caused by human error.) Transportation safety regulators, as well as manufacturers, have to figure out how to do more real-world, independently verified stress-testing to hone the technology without people dying in the process. If that means slowing the rush to roll out driverless cars, that's OK.

This month NHTSA is supposed to release guidelines to manufacturers for the safe operation of fully autonomous vehicles. The agency has said rigorous testing and ample performance data are necessary, but its guidelines are expected to be suggestions, not mandates, because NHTSA wants the flexibility to respond to a rapidly evolving industry. Until the federal government sets testing and performance standards for driverless technology, states will be left to come up with their own policies on when and how to allow autonomous vehicles, potentially resulting in a patchwork of laws that confuses consumers and confounds carmakers.

California lawmakers directed the state in 2012 to develop rules to allow the testing and eventual use of driverless cars, but because of the issue's complexity and the shortage of precedents, the state is already a year and a half behind schedule. Draft regulations issued late last year sounded logical at the time: because autonomous vehicles are still so new, the state would require licensed drivers to stay behind the wheel, ready to take over if the system failed. The problem, as the fatal Tesla autopilot crash demonstrates, is that drivers are not a reliable backup. They learn to trust the car, perhaps too quickly and too much; they let their guard down and may not be prepared to act in a split second to prevent a crash. California ought to reconsider whether requiring a driver behind the wheel makes an autonomous vehicle safe enough for public roadways.
