The Nissan Leaf is cruising at 35 mph when a pedestrian jumps into the roadway.
But there’s no one at the controls. Instead, radar, lasers and cameras recognize the pedestrian — actually a dummy shoved into the road by an engineer. Computers order the car to slam the brakes and swerve, avoiding a collision.
The recent demonstration, at a former military base in Irvine, underscored just how far automakers have come in developing cars that drive themselves. Car companies including Nissan, General Motors and Mercedes have logged thousands of miles of successful tests, with an eye toward selling autonomous vehicles by 2020.
Nissan’s test provided a vivid display of what’s already possible: The Leaf dropped an occupant at the “store,” then proceeded to drive itself down a parking row, stop for an SUV driven by a human, and back into a space.
But perfecting the technology is just one of many challenges. Convincing consumers, regulators, insurers and lawyers that autonomous vehicles are safe — and determining who pays when they crash — could wrap their future in a Gordian knot.
“It is uncharted waters,” said James Yukevich, a Los Angeles attorney who defends the auto industry from product liability lawsuits. “I don’t think this is an area very many people have thought much about.”
Coddled by robotic chauffeurs, would people retain the driving skills to take over in emergencies? Who would be liable if an autopiloted car runs through a crowd of pedestrians: the owner or the automaker? Would insurance premiums go up or down? Would cyberterrorists figure out how to make Fords blast through school zones at 100 mph?
Are human drivers really ready to give up control?
Such thorny questions cast doubt on automakers’ ambitious timelines, said Bryan Reimer, a scientist and transportation expert at MIT.
“Humans can deal relatively well with humans making mistakes, but we don’t deal as well with robots making mistakes,” Reimer said. “How many of us are willing to get on an airplane with no pilot — even though half the time the pilots are just sitting around watching the automation?”
It may seem inevitable that machines will one day pilot cars more safely than humans. But that will have to be proved beyond doubt before legislators and regulators give them free rein. The engineering will have to be fail-safe, said David Strickland, administrator of the National Highway Traffic Safety Administration.
For now, just three states — California, Florida and Nevada — allow self-driving cars on the road, and only for testing. Six states have rejected testing, and seven others are considering regulations, according to the Center for Internet and Society at Stanford University.
California has directed its Department of Motor Vehicles to craft regulations by the start of 2015, said Bernard Soriano, the agency's deputy director. The state is working with the NHTSA, the California Highway Patrol and the state's departments of insurance and transportation to draft the rules.
Automakers are moving in stages toward fully autonomous cars. They started more than a decade ago with features such as electronic stability control, which assists with braking to help drivers control the car.
Some 2014 Mercedes-Benz and Acura models combine adaptive cruise control, which keeps a vehicle at a safe distance from cars ahead, with lane keeping, which automatically adjusts steering. Another system slams the brakes before an impending crash.
The next level of development: cars that assume full control under favorable traffic or weather conditions. These are the self-drivers Nissan expects to sell by 2020, said Maarten Sierhuis, director of the automaker’s Silicon Valley research center. The NHTSA is working on a four-year timetable to issue regulations for such cars.
All this leads to the final frontier: vehicles that can operate completely on autopilot — even without passengers, as in automated taxis or delivery vans.
The challenge for automakers will be programming cars to navigate complicated and unexpected conditions, Sierhuis said.
“It is always the outliers — the very complex traffic situations that are hard to imagine and hard to test — that are difficult,” he said.
Sierhuis plans to test the autonomous Leaf at a section of road near his Sunnyvale, Calif., office that he calls “the monster” — a high-volume stretch with a highway crossing and three intersections in quick succession, where human drivers cause an accident a week.
As it formulates rules, California’s DMV is looking for guidance from regulations governing robotic surgery and the potential use of commercial aerial drones, which are seeing limited use for photography in Canada.
Beyond regulatory hurdles, car companies could be taking on a huge liability in selling robotic cars.
“If the driver has no control of the vehicle at all, how is it possible for that person to be negligent?” Yukevich said. “You can sit there and read the newspaper. If there is an accident, you can’t be at fault.”
Liability questions surrounding automated cars may be simpler than they appear, said Ryan Calo, a professor who teaches robotics law at the University of Washington. If automakers build every system in the car, then presumably the automaker will be on the hook in lawsuits over accidents or injuries — just as human drivers are now.
Moreover, by their nature, the cars would come with an array of sensors that feed into a black-box data recorder, making it easier to unravel whether man or machine is to blame in crashes. Nonetheless, lawyers and judges will no doubt shape the law surrounding self-driving cars — after they hit the road.
“The legal issues don’t need to be 100% worked out from Day 1,” Calo said. “I do think there will be enough structure in place so that this won’t be the Wild West.”
Less complicated is the issue of insurance. Already, automated driving technology on today’s cars is reducing property and injury claims. That could mean lower premiums as automation advances, insurance officials say.
Although cars could be the first mass-marketed robots, the insurance industry has covered automated machines in many sectors.
“A modern airliner is in effect an autonomous vehicle,” said Robert Hartwig, president of the Insurance Information Institute.
Consumers remain skeptical. Only 18% said they would buy autonomous vehicles, and just 33% said they would feel safe on roads filled with them, according to a survey of 1,000 adults conducted in September for the Chubb Group of Insurance Companies.
In time, Sierhuis believes, people will take to autonomous cars just as eagerly as they have other technologies.
“People were surprised when we landed on Mars,” he said. “But now we have robots walking around on the surface, and no one thinks twice about it.”
As with many technologies we now take for granted, autopiloted cars have a long history in science fiction. “Sally,” an Isaac Asimov short story first published in 1953, portrays a futuristic world where people no longer drive themselves.
“I can remember when there wasn’t an automobile in the world with brains enough to find its own way home,” says the main character, Jacob Folkers, who runs a retirement home for intelligent vehicles. “I chauffeured dead lumps of machines that needed a man’s hand at their controls every minute. Every year, machines like that used to kill tens of thousands of people.”