Column: Report: Tesla’s new Autopilot feature is dumber than human drivers
The Holy Grail of self-driving technology, so its promoters tell us, is to replace the error-prone idiot behind the wheel with a flawlessly discerning and safer system of sensors and software.
By that standard, a new version of Tesla’s Navigate on Autopilot self-driving system is a big step back. That’s the opinion of Consumer Reports, which road-tested a new optional Navigate on Autopilot feature allowing Tesla vehicles to make lane changes automatically, without the participation of the driver.
Tesla says the feature makes for “a more seamless active guidance experience.” Consumer Reports begs to disagree. In a review posted Wednesday, it said it observed the opposite in its own tests, finding that the feature “doesn’t work very well and could create potential safety risks for drivers.”
All told, CR says, the feature is “far less competent than a human driver.” Said Jake Fisher, the magazine’s senior director of auto testing: “The system’s role should be to help the driver, but the way this technology is deployed, it’s the other way around.”
The feature aims to give drivers one more bit of respite from managing their cars’ operations by allowing the cars to initiate a lane change without requiring driver confirmation. But in CR’s experience, “the feature cut off cars without leaving enough space and even passed other cars in ways that violate state laws.… As a result, the driver often had to prevent the system from making poor decisions.”
Tesla has defended the system by pointing to its 66 million miles of real-world experience and 9 million lane changes. Statistically speaking, however, that’s a flawed argument: neither figure is remotely enough to validate any safety claim that purports to compare the system with human drivers. As the Rand Corp. observed in 2016, American motorists drive an average of 3 trillion miles per year, and fatalities are relatively rare: the 32,800 deaths annually on U.S. roads amount to only 1.09 per 100 million miles.
Establishing to a statistical near-certainty that driverless cars would reduce vehicular fatalities by even 20% would require 5 billion miles of road testing — a record that would take a fleet of 100 test vehicles 225 years to complete if they operated at an average of 25 miles per hour, 24 hours a day and 365 days a year. In other words, a sample of 66 million miles proves nothing.
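The arithmetic behind Rand’s fleet estimate can be checked directly. A quick sketch, using only the figures cited above (3 trillion annual miles, 32,800 annual deaths, a 100-car fleet at 25 mph around the clock, 5 billion test miles):

```python
# Fatality rate implied by the national figures cited above.
annual_miles = 3e12            # ~3 trillion miles driven in the U.S. per year
annual_deaths = 32_800
rate_per_100m = annual_deaths / (annual_miles / 100e6)
print(f"{rate_per_100m:.2f} deaths per 100 million miles")  # ~1.09

# Years a 100-car test fleet would need to log 5 billion miles
# at an average 25 mph, running 24 hours a day, 365 days a year.
fleet_miles_per_year = 100 * 25 * 24 * 365   # 21.9 million miles per year
years = 5e9 / fleet_miles_per_year
print(f"about {years:.0f} years")            # ~228, in line with the ~225 cited
```

The calculation makes the scale problem concrete: at roughly 22 million fleet miles per year, 66 million miles is a few years of data at most, orders of magnitude short of what a rigorous safety comparison requires.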
CR’s experience is another data point suggesting that the gap between today’s driver-assistance features (such as automatic braking, adaptive cruise control and assisted lane changing) and fully automated driving without driver participation may be much greater than promoters of autonomous vehicles estimate.
There are signs that some Autopilot users may vest the system with more control over their vehicles than is wise: Autopilot was engaged during three fatal Tesla crashes, including on March 1, when a driver was killed in a collision with a semitrailer. There are no indications that either the driver or the Autopilot system took action to avoid the trailer, according to the National Transportation Safety Board.
Tesla cautions that “drivers should always be attentive when using Autopilot”; its standard system has required drivers to confirm lane changes via the turn signal stalk and in some cases requires drivers to have their hands on the steering wheel.
Navigate on Autopilot lets drivers choose among settings that govern how aggressively the car changes lanes to overtake slower traffic, ranging from “Mild” to “Mad Max.” The Mild setting allows a lane change only when the car is traveling significantly slower than the cruise control setting; “Mad Max” allows one when the car is just a bit slower than the cruise control speed. That’s not necessarily imprudently fast, but calling it “Mad Max” conjures up the wild-eyed, dune-buggy-riding speed demon of the eponymous movie series, not a character one would wish to share a highway with.
CR’s findings indicate that giving Teslas the authority to make lane changes without driver participation — that is, without confirmation via the turn-signal stalk — may be premature. Its testers reported that their vehicles “often changed lanes in ways that a safe human driver would not — cutting too closely in front of other cars, and passing on the right.”
The magazine specifically challenged Tesla’s assertion that its vehicles’ three rear-facing cameras could detect fast-approaching objects from the rear better than the average driver. In practice, the system had trouble detecting vehicles approaching from behind at high speed: “Because of this, the system will often cut off a vehicle that is going a much faster speed since it doesn’t seem to sense the oncoming car until it’s relatively close.”
In several cases, CR says, the Teslas passed cars on the right on a two-lane divided highway. In Connecticut, where the testing took place, that’s illegal and would get the driver ticketed.
In short, while Tesla says its Navigate system aims to make highway driving “more relaxing, enjoyable and fun,” CR found that the lane-change function made the driving experience more stressful.
“Tesla is showing what not to do on the path toward self-driving cars,” said David Friedman, CR’s vice president of advocacy: “release increasingly automated driving systems that aren’t vetted properly.”