Uber Crash Sparks Talk Of Tighter Rules For Self-Driving Cars
By Carolyn Said, THE SAN FRANCISCO CHRONICLE
March 31, 2018
U.S. lawmakers have applied a light touch in regulating robot cars.
At the national level, the Trump administration has proclaimed that driverless-car guidelines should be “entirely voluntary” for automakers, and bills pending in Congress would clear the way to putting tens of thousands of autonomous cars on the road — orders of magnitude more than the few hundred in the country today — before federal safety regulations are set. Forty states have passed various rules, with California’s being the most well defined, but the pending federal laws would void states’ ability to regulate driverless cars.
All that may change soon.
The industry’s first pedestrian fatality, in which a self-driving Uber SUV hit and killed Elaine Herzberg in Tempe, Ariz., last month, has focused attention on the nascent industry and its safety rules — or lack thereof.
“The one positive thing that may come from Ms. Herzberg’s death is that regulators at all levels will start to ask the questions they should have asked before (automated vehicles) were tested in public,” said Jim McPherson, a Benicia attorney who runs SafeSelfDrive to consult on driverless cars.
Consumer advocates have long warned that lax regulations let companies play fast and loose with public safety.
“It’s crazy that we’re letting these things on the road right now, using you and me as human guinea pigs, and letting companies use public roads as private laboratories,” said John Simpson from Consumer Watchdog, which has called for a nationwide moratorium on public autonomous testing until there’s a report on the Arizona crash. “We’re getting too far ahead of ourselves.”
The industry counters that self-driving cars — which don’t text, drink or get distracted — could end the nation’s 40,000 annual traffic fatalities, making it a moral imperative to get them on the roads sooner rather than later.
Tesla, which said Friday that its Autopilot system was engaged in a deadly crash last month on Highway 101, struck a similar tone in defending the driver-assistance technology. The company said drivers using it were about one-fourth as likely to have an accident. Though driver-assistance features aren’t the same as driverless tech, incidents involving them may also shape public opinion.
A study from the Rand Corp. says widespread use of autonomous cars before they’re perfected would save lives, even if they’d still cause crashes, injuries and fatalities.
Testing autonomy on public roads doesn’t mean using the public as lab rats, the argument goes. It is a way to bring this potentially life-saving technology to the public more quickly, the industry says.
[Photo caption: Uber’s lidar system and front cameras on top of a Volvo XC90 outside the Uber Advanced Technologies Group headquarters at Pier 70 in San Francisco. Uber has halted testing in S.F. since the fatality in Arizona last month.]
“It’s a catch-22,” said Stephen Beck, founder of management consulting firm CG42. “For the technology to grow, learn and get better, you have to put it in real-world situations. There’s only so far you can go in testing environments” such as closed courses and simulations.
Uber suspended its autonomous-car testing program after the crash; a backup driver was at the wheel but appeared to be looking down just before impact. Most other developers have continued trials on public roads, generally with backup drivers.
In Washington, lawmakers expressed concerns after the fatality, which could lead to stricter oversight.
“This tragic incident makes clear that autonomous vehicle technology has a long way to go before it is truly safe for the passengers, pedestrians, and drivers who share America’s roads,” Sen. Richard Blumenthal, D-Conn., said in a statement.
The Uber crash “underscores the need to adopt laws and policies tailored for self-driving vehicles,” Sen. John Thune, R-S.D., co-sponsor of a pending bill in the Senate, said in a statement. “Congress should act to update rules, direct manufacturers to address safety requirements and enhance the technical expertise of regulators.”
With human-driven cars, the federal government regulates vehicle design and safety, while states regulate vehicle operations and operators — issuing car registrations and driver’s licenses, for instance.
But autonomous vehicles contain both the car and the driver in one robot package. “That blurs the lines between those state-federal responsibilities,” said Bryant Walker Smith, a law professor at the University of South Carolina who studies autonomous-car rule-making.
The problem, he and others said, is that the pending federal laws quash states’ abilities to address autonomous-vehicle safety concerns — even though it could take two years for the National Highway Traffic Safety Administration to hammer out federal safety standards for autonomous vehicles.
Both a bill passed by the House and one pending in the Senate “are insufficiently protective of safety,” said Sarah Light, an assistant professor of legal studies and business ethics at the Wharton School, who studies the subject. “They would immediately preempt state safety rules before there are federal safety standards for autonomous driving systems. That would create a safety gap.”
Smith thinks someone should be required to attest that the cars are safe, whether it’s a company, a fleet owner or a manufacturer.
His thoughts on how to beef up federal regulations: Establish mandatory and substantial safety-evaluation reports and make them publicly available. Give the NHTSA more resources to assess manufacturers’ claims and authority to act on those reports — pressing for more information, preventing companies from deploying if their claims aren’t credible, and enabling the creation of tests and standards for autonomous vehicles. And don’t stop states from crafting their own rules, at least for now.
“If states want to exercise more authority, let them,” he said. “If one state doesn’t want autonomy, let them. That may be worse for their citizens, but people don’t want to feel this is being forced on them.”
Simpson, from Consumer Watchdog, has some simple advice for regulations.
“You have to pass an eye exam and a driving test before you can get a driver’s license,” he said. “We should have analogous requirements for autonomous vehicles that they can sense and differentiate objects. Can they distinguish between overhanging tree branches and a human waving their arms, for instance?”
California, the epicenter of self-driving car testing because of the intense interest in it among Silicon Valley companies, has crafted more detailed rules than any other state. These include requirements that all autonomous cars register with the state, and that companies report on crashes and on how often humans must take over from the machine.
The Golden State has recently added rules that allow carmakers to apply to test cars with no drivers — such testing could start later this spring — and eventually to carry paying passengers. Companies must certify that their cars are ready, and promise that remote operators are available to steer the vehicles around obstacles such as construction.
Other states, such as Arizona and Florida, already allow no-driver cars and have few, if any, rules for them.
“California’s rules are more comprehensive and more protective of safety than the federal rules,” Light said. “It should be allowed to say how it wants to protect its citizens.”
The U.S. has long trusted automakers to guarantee their cars’ safety, and it hasn’t always ended well.
“Transportation safety regulations and the rule-making process in the U.S. is very reactive, slow and usually, unfortunately, written in blood of past victims,” said Najmedin Meshkati, a professor of civil engineering at USC.
Federal autonomous vehicle bills
SELF DRIVE (Safely Ensuring Lives Future Deployment and Research In Vehicle Evolution) Act, HR 3388
Asks the National Highway Traffic Safety Administration to figure out within one year which features need specific performance standards, and to require carmakers to explain within two years how they’re handling safety for autonomous vehicles. Empowers the NHTSA to grant exemptions to Federal Motor Vehicle Safety Standards, since cars without steering wheels, accelerators or brake pedals do not meet those standards; each carmaker could apply for up to 25,000 vehicles the first year, ratcheting up to 100,000 per year within four years.
Asks car companies to write privacy plans regarding the use of customer data.
Prevents states from regulating autonomous vehicle design and operation, even if there isn’t a federal standard for them. Lets states maintain oversight of licensing, insurance and public-safety transportation laws.
Status: Passed by House of Representatives, September 2017. Would need reconciliation with Senate bill before becoming law.
AV START (American Vision for Safer Transportation through Advancement of Revolutionary Technologies) Act, S 1885
Asks the Department of Transportation to review federal safety standards to pinpoint which ones might create barriers for autonomous vehicles, such as rules requiring steering wheels, and to suggest alternative language within six months. Gives the DOT one year to update regulations.
Allows for safety standard exemptions of up to 100,000 cars per manufacturer within several years.
Requires autonomous vehicle makers to submit safety evaluation reports at least 90 days before making the vehicles commercially available. Creates a technical advisory body and working group on consumer education. Asks carmakers to develop cybersecurity plans.
Bars states from writing stricter safety regulations than the federal ones (which haven’t been written yet), and from regulating autonomous vehicles on safety, data recording, cybersecurity, human-machine interface, crashworthiness, post-crash behavior and automation function.
Status: Passed by Commerce Committee; awaiting vote by full Senate. Even before the Uber self-driving fatality, several senators said they thought the bill needed to be stricter.