
Op-Ed: Arizona’s revolt against self-driving cars should be a wake-up call to the companies that make them


People are fighting back against robot cars.

Self-driving cars being tested in Arizona by Waymo, a Google sister company, have been attacked by residents in at least 21 separate incidents, according to the Arizona Republic. People in the Chandler area have thrown rocks at the cars, slashed their tires and run them off the road. One man even pulled a gun on a Waymo test driver.

The attacks come less than a year after a pedestrian was killed by a self-driving Uber car in Tempe — the first pedestrian fatality involving an autonomous vehicle. Uber had moved its self-driving car trial to Arizona from San Francisco after California revoked the registration of the vehicles.

Like Arizonans, many Americans don’t trust that self-driving cars are safe. An April poll conducted by the American Automobile Association found that more than seven in 10 respondents were too afraid to ride in a self-driving car, and a recent Gallup poll found that 52% of Americans said they would never want to use a driverless car.



Carmakers assume they can overcome this distrust eventually. After all, farmers with pitchforks once chased early motorized vehicles, and there was great public outcry when airplanes first took to the skies. The problem is that carmakers have rushed their cars onto public roads for testing with little regard for public opinion.

Lawmakers from both parties have been largely on the carmakers’ side. Last year, a bipartisan coalition tried to slip legislation that would have overridden state safety standards into a federal spending bill. Only a handful of Senate Democrats, including Sen. Dianne Feinstein, objected to the so-called AV START Act.


“If self-driving cars are going to revolutionize transportation, states and cities shouldn’t be deprived of their ability to keep people safe during that transformation, especially since it could be decades before new federal safety standards are written,” Feinstein said at the time. Thankfully, the bill wasn’t passed.

The revolt against robot cars in Arizona shows that carmakers need to ask the public’s permission, not its forgiveness, before introducing autonomous vehicles onto public roads.

Boosters of highly automated vehicles, known as “HAVs,” argue that the cars can give mobility to the blind and disabled, reduce accidents caused by human error and save lives. Self-driving cars, they note, cannot be impaired by alcohol or drugs or distracted by incoming texts.


But these arguments ignore the problems that arise when artificial intelligence, and the life-and-death decisions it makes, is controlled by profit-driven corporations.

Many carmakers rely on a constant digital connection to monitor and control their cars. That connection creates cybersecurity risks, since any connected device can be hacked.

There are also ethical quandaries with corporate algorithms making life-and-death decisions. When a child chases a ball into the street, will the autonomous car be programmed to swerve, even if there is a chance that its passengers will be injured? Mercedes, for instance, has said its HAVs will be programmed to prioritize the lives of their occupants.

There is no good plan for how human drivers will share the road with autonomous cars. Many advocates consider separate roads the best solution, but the construction costs would be enormous.

Perhaps most importantly, according to our analysis of test results out of California, many self-driving cars are simply not ready to drive themselves. A number of them cannot go more than a few hundred miles without needing a human driver to take over. Some have difficulty handling potholes, extreme weather, ambulances and bicyclists, among other things.

Uber moved its car trials to Arizona in part to avoid strict regulations in California. But the company, and others like it, will need to accept that regulation is the only way for humans to safely share the roads with autonomous cars.


The public shouldn’t be guinea pigs in this Silicon Valley experiment. We should set new rules for the road:

Humans’ ability to drive should not be restricted. Implicit in the sales pitch for autonomous cars is the notion that they will eventually be safer than human-controlled cars. Our right to drive should not be limited in the push to make way for self-driving vehicles.


The programming needs to be transparent. Carmakers should be required to disclose all the ethical decisions reflected in their programming. Will their technology prioritize pedestrians? Or, like Mercedes, will it always protect the occupants of the vehicle? These decisions should be made through a public process, not behind closed doors.

Cybersecurity must be a priority. Researchers have established that some cars connected to the internet can be taken over. The industry must be required to use Pentagon-level security standards to safeguard the public.

When in doubt, the carmakers should be liable. Companies that make smart cars should be required to accept legal responsibility when their technology fails and not be permitted to foist the responsibility for such failures onto individuals, as Tesla recently tried to do when one of its Autopilot functions failed.


Arizona’s revolt ought to serve as a wake-up call to both Silicon Valley and Detroit: The public will not allow autonomous vehicles to take control of our roads.

Jamie Court is president of Consumer Watchdog, a nonprofit nonpartisan public interest group.

