Editorial: It’s time to tap the brakes on self-driving cars
Carmakers and tech companies are in a race to put autonomous vehicles on the road, and it’s time for regulators to tap the brakes.
This month the National Highway Traffic Safety Administration revealed that it is investigating two crashes involving Tesla vehicles allegedly operating on Autopilot. Tesla's Autopilot feature is a semi-autonomous system that uses cameras, radar and sensors to steer the car, change lanes, adjust speed and even find a parking space and parallel park. It's not supposed to turn a Tesla sedan into a self-driving car, but there's ample evidence on YouTube of people driving with their hands off the steering wheel, playing games and even climbing into the back seat while their car is hurtling down a freeway.
Although Tesla has been far more aggressive than its rivals in making cutting-edge driverless technology readily available to its customers, other automakers aren't far behind in rolling out advanced "driver assist" systems. Yet there are still no rules governing the use of this sort of technology — whether partially autonomous, like Autopilot, or fully self-driving, like Google's steering-wheel-less prototype. And at this point, there are no standardized tests the cars are required to pass before regular folks take them on the road. Who gets to decide when an autonomous car is ready for the public? Current policies let the car's manufacturer make that call, restrained only by the fear of liability.
Regulators must intervene. The technology is already being deployed, and it’s time to set standards for when an autonomous-driving feature has been tested enough and is considered safe enough for widespread use. Public roads shouldn’t be uncontrolled laboratories for vehicle safety experiments.
But this is no easy job. There is immense pressure from driverless-car supporters and safety advocates to get more autonomous technology on the road as soon as possible because self-driving cars will probably prove much safer than cars driven by erratic, distracted humans. (More than 90% of crashes are caused by human error.) Transportation safety regulators, as well as manufacturers, have to figure out how to do more real-world, independently verified stress-testing to hone the technology without people dying in the process. If that means slowing the rush to roll out driverless cars, that's OK.
California lawmakers directed the state in 2012 to develop rules to allow the testing and eventual use of driverless cars, but because of the issue's complexity and the shortage of precedents, the state is already a year and a half behind schedule. Draft regulations issued late last year sounded logical at the time — because autonomous vehicles are still so new, the state would require licensed drivers to stay behind the wheel, ready to take over if the system failed. The problem, as the fatal Tesla Autopilot crash demonstrates, is that drivers are not a reliable backup. They learn to trust the car, perhaps too quickly and too much; they let their guard down and may not be prepared to act in a split second to prevent a crash. California ought to reconsider whether requiring a driver behind the wheel makes an autonomous vehicle safe enough for the public roadways.