Self-Driving Cars Keep Rolling Despite Uber Crash
By Ryan Beene and Alan Levin, BLOOMBERG
March 21, 2018
The death of a woman who was struck by a self-driving Uber in Arizona on Sunday has auto-safety advocates demanding that U.S. regulators and lawmakers slow down the rush to bring autonomous vehicles to the nation’s roadways.
Don’t count on it.
Efforts to streamline regulations to accommodate the emerging technology have been underway since the Obama administration with strong bipartisan support. And the Trump administration’s aversion to restrictions and regulations makes it even more unlikely that the accident in Tempe, Arizona, in which an autonomous Uber sport utility vehicle struck and killed a pedestrian, will result in significant new barriers, according to former U.S. officials and some safety advocates.
“Honestly, the last thing under this administration that car companies and self-driving vehicle developers have to worry about is heavy regulation,” said David Friedman, a former National Highway Traffic Safety Administration administrator under President Barack Obama who’s now director of cars and product policy for Consumers Union.
In January, the safety administration began soliciting comment from industry and the public on how to rewrite auto safety rules — many from decades ago — to accommodate autonomous vehicles.
Transportation Secretary Elaine Chao’s policy guidance for self-driving vehicle safety, released in September, used the word “voluntary” 57 times.
Supporters say the technology could help reduce the 35,000 highway deaths in the U.S. each year, the vast majority of which, research shows, are caused by human error. Safety advocates, however, have criticized the voluntary approach as too light, lacking oversight of a technology that is still mostly in the development phase.
“They’ve decided to sit this one out and simply let the industry develop the technology as the industry wishes,” said Peter Kurdock, director of regulatory affairs at Advocates for Highway and Auto Safety. “When you have no federal oversight, unfortunately, we are concerned that we are going to see more deaths as a result of this technology being placed on the roads before this technology has been properly tested and vetted.”
In a statement, NHTSA declined to discuss the Uber crash and how it may affect policy. The agency said its Special Crash Investigation team has been dispatched to Tempe. The National Transportation Safety Board has also sent a team to investigate.
After a Tesla Inc. car struck the side of a truck while operating autonomously in 2016, the NTSB found that the design of the system — which allowed drivers to keep their hands off the wheel for prolonged periods — contributed to the crash, in which the driver died.
Robust Safeguards Sought
As part of its inquiry, the NTSB issued seven recommendations to U.S. regulators and the auto manufacturing industry seeking more robust safeguards and better data to be used during crash investigations.
So far, no rules have been changed.
In a Feb. 7 response to the transportation safety board by NHTSA Deputy Administrator Heidi King, the agency said it preferred “voluntary guidance” instead of mandatory new requirements.
Legislation championed by self-driving car developers, including General Motors Co. and Alphabet Inc.’s Waymo unit, sailed through the House with unanimous support last year. A similar bill in the Senate advanced out of committee but has since stalled amid calls by some Democrats for stronger oversight, cybersecurity and safety protections.
Sunday’s crash amplified those calls.
“This tragic incident makes clear that autonomous vehicle technology has a long way to go before it is truly safe for the passengers, pedestrians, and drivers who share America’s roads,” Senator Richard Blumenthal, a Connecticut Democrat, said at a hearing Tuesday on defective automobile air bags.
“My hope is that we will take a lesson from the experience with airbags and their defects with the more complex technology of autonomous driving vehicles and look carefully, prepare meticulously, take care of safety before we leap into an unknown future technology,” he said.
Emil Frankel, a former assistant secretary at the Department of Transportation, said the accident is unlikely to reshape the government’s policy debate. But it highlights the broader risk of undermining the public’s acceptance of self-driving, particularly if investigators find fault with the technology, he said.
If public opinion gels against self-driving vehicles, it could harm business plans, tamp down investment and slow consumer acceptance of the technology.
“That’s what Uber and, frankly, everyone else involved in these vehicles is going to be very concerned about,” said Frankel, who is now a senior fellow at the Eno Center for Transportation in Washington.
At least some of the response by government depends on the outcome of the multiple investigations currently underway.
If, as the Tempe police chief suggested in an interview published Monday, the pedestrian, Elaine Herzberg, 49, suddenly moved in front of the Uber car and a collision couldn’t be avoided, that makes it easier for the autonomous industry to move forward.
On the other hand, if faults with the car’s sensors or computer logic emerge, that could change the debate. In such a case, the NTSB, which is independent from other U.S. regulatory agencies, may hold a public hearing or take other steps to highlight safety shortfalls. That and a public outcry could increase pressure for more regulation and a more cautious approach.