Feds tweak driverless-car guidelines, seek to balance safety and tech development

Transportation Secretary Elaine Chao.
(Carolyn Kaster / AP)

Driverless cars and trucks will be hitting the highways in increasing numbers over the next few years. The U.S. Department of Transportation doesn’t want to get in the way.

That’s the message in a new set of guidelines the department released Wednesday. The intent is to spur further development while emphasizing safety, the department said.

“Safety is always No. 1 at the U.S. Department of Transportation,” Transportation Secretary Elaine Chao said in a short speech at CES, the big consumer electronics show in Las Vegas, where the new guidelines were announced. But “remaining technology neutral” is a department commitment and “protecting American innovation and creativity” is another top priority.

Chao also noted the department has proposed rules requiring “remote IDs” for drones weighing more than half a pound. That would allow the FAA, law enforcement and federal security agencies to identify drones flying in their jurisdiction, she said.

“Recent news reports out of Colorado and Nebraska of mystery drones flying in formations at night are a timely illustration of why remote IDs are needed,” she said. The public comment period for the new drone rules extends through March.

Federal driverless vehicle guidelines have been issued on a roughly annual basis since 2016, with a strong emphasis on “voluntary guidance.” The federal government sets safety standards and the states are in charge of licensing. In California the Department of Motor Vehicles has established rules for driverless deployment that include insurance requirements as well as requirements that local safety officials be informed when robot cars are operating in their area.

Some safety advocates say regulators haven’t caught up with the technology. But rather than push new regulations, the Transportation Department has been issuing suggestions and encouraging cooperation on a uniform approach to driverless technology development among federal, state and local government officials and industry.

The biggest change in the new set of guidelines, called Automated Vehicles 4.0, is a streamlined system of federal oversight. Without offering specifics, Chao said the new guidelines “unified AV efforts across 38 federal departments, independent agencies, commissions and executive offices of the president.”

Driverless car technology is developing more slowly than Silicon Valley companies were predicting several years ago. Deployment plans have been delayed, including GM Cruise’s original intent to have robotaxis operating commercially on the streets of San Francisco by now.

But development inches forward. Waymo, the driverless car offshoot of Google, is already operating a commercial robotaxi service in the Phoenix area and is offering driverless rides to employees and guests on public roads in Silicon Valley. Waymo wants to offer a small-scale robotaxi service there, but so far, the state’s Public Utilities Commission, which regulates ride-hailing services, won’t allow it to charge for rides.

Companies such as Beep and Voyage are experimenting with driverless shuttles in retirement communities in Florida. Ford is testing a robotaxi service in Miami. Waymo, TuSimple and Starsky Robotics are operating driverless trucks on public highways in Arizona and Florida.

One reason that development has slowed: the March 2018 tragedy in Arizona in which a woman walking a bicycle across a highway was struck and killed by a driverless Uber vehicle when the safety backup driver at the wheel did not react in time.

Although crashes and deaths are inevitable whether humans or robots are driving motor vehicles, most manufacturers say they’re striving to be as responsible about deployment as possible. A driverless-vehicle industry and consumer coalition called PAVE was formed last year to educate the public and policymakers on driverless technology and to address safety concerns.

Some driverless technology advocates assert that current systems are already safer than humans, but statistics don’t yet exist to prove it. About 40,000 people were killed in roadway crashes in 2018, with 95% of those caused by human error, according to the Transportation Department.

“Realizing the vast potential of AVs will require collaboration and information sharing” among industry, government and institutions involved in auto safety, Chao said.

But the sharing and publicizing of data will remain a hot-button policy issue. “Protecting intellectual property” of the tech companies is key to innovation, Chao said. But how to separate proprietary data from safety data, and whether to enforce corporate disclosure of safety data, has drawn little public discussion.

The new Transportation Department guidelines focus on development of driverless cars, not cars equipped with technology that allows some robotic capabilities but still requires a human driver’s attention. Such “Level 2” automation is already commonplace, with driver-assist options such as adaptive cruise control, blind spot detection and lane-keeping offered by all auto manufacturers. Some Level 2 systems, such as Tesla’s Autopilot and Cadillac’s Super Cruise, allow automatic lane changing.

Safety groups and politicians such as Sen. Edward J. Markey (D-Mass.) have expressed concern that Tesla is confusing drivers and potential customers by referring to some of its optional Level 2 technologies as “Full Self Driving” and by promoting videos of Chief Executive Elon Musk riding on the highway in a Tesla with his hands in the air.

The National Highway Traffic Safety Administration, part of the Transportation Department, is investigating several fatal Tesla crashes that involved Autopilot. The agency is looking at a crash that killed two people in a Honda Civic on Dec. 29 in Gardena, although whether Autopilot was involved in that crash has not yet been determined.