Blame Game: Self-Driving Car Crash Highlights Tricky Legal Question
By Ethan Baron, MERCURY NEWS
January 23, 2018
General Motors is in a race to be the first company to mass-produce self-driving cars, but a recent crash involving a San Francisco motorcyclist has illustrated the tricky challenge of assigning blame when an autonomous vehicle gets into an accident.
As self-driving cars take to the roads in increasing numbers, collisions with standard vehicles are inevitable, experts say, as are lawsuits.
San Francisco commercial photographer Oscar Nilsson sued GM on Monday, after a Dec. 7 collision with a Chevrolet Bolt that aborted a lane change while driving autonomously.
The crash highlights an important issue raised by autonomous technology: Self-driving vehicles may not behave like those driven by humans, and that may complicate investigations into who’s at fault.
“That’s going to continue to be a huge area where we’re going to have problems,” said John Simpson, spokesman for nonprofit Consumer Watchdog, a frequent critic of speedy deployment of autonomous vehicles.
GM’s subsidiary, Cruise, has since August been testing a self-driving car service in San Francisco with human backup drivers behind the wheel, as required by the state.
Nilsson’s lawsuit claims he was riding behind one of GM’s autonomous Bolts on Oak Street when the car, with its backup driver, changed lanes to the left. When he rode forward, the Bolt suddenly veered back into his lane and knocked him to the ground, according to the lawsuit, filed in U.S. District Court in San Francisco.
The San Francisco Police Department’s report on the incident blamed Nilsson for passing a vehicle on the right when it wasn’t safe, but Nilsson’s lawyer Sergei Lemberg disputed that finding.
“I don’t know what a police officer can tell, after the fact,” Lemberg said Tuesday. “I don’t know that it’s fair to blame this act on the completely innocent person who’s just driving down the road and gets hit.”
The police report, said Lemberg, actually supported holding GM responsible. It noted that after the Bolt determined it couldn’t make the lane change and began moving back while Nilsson was passing on the right, the Bolt’s backup driver tried to grab the wheel and steer away, but the collision occurred at that same moment.
“Why don’t these folks just take some responsibility?” Lemberg said.
A crash report filed with the California Department of Motor Vehicles by GM provided a much different view of the accident. The company acknowledged that the car, in autonomous-driving mode in heavy traffic, had aborted a lane change. But GM said that as its car was “re-centering itself” in the lane, Nilsson, who had been riding between two lanes in a legal-in-California practice known as lane-splitting, “moved into the center lane, glanced the side of the Cruise … wobbled, and fell over.”
In an emailed statement Tuesday, GM noted that the police report concluded Nilsson was responsible for the accident.
“Safety is our primary focus when it comes to developing and testing our self-driving technology,” GM said.
The firm has been running a “Cruise Anywhere” program since August for employees, which allows them to hail automated Cruise vehicles and be driven anywhere in San Francisco. It was unclear whether the vehicle involved in the accident was part of this program.
It was also unclear if the Bolt in question was one of the “third-generation” automated vehicles described last fall by Cruise CEO Kyle Vogt as “the world’s first mass-producible car designed to operate without a driver.” Those vehicles were intended to be used in the “Cruise Anywhere” program, Vogt wrote in a Medium post.
Companies operating autonomous vehicles are likely to settle quickly in crash-related lawsuits when the technology appears to be at fault, and fight mightily when they believe the driver of the ordinary vehicle to be responsible, said Stanford researcher and University of South Carolina School of Law professor Bryant Walker Smith.
“There might be data that might tend to show fault or no fault,” Smith said.
Such data — which may include video and other driving information from the autonomous vehicle — could aid investigators. Simpson of Consumer Watchdog said it should be publicly disclosed whenever a self-driving car crashes.
GM’s testing in San Francisco highlights the firm’s progress in the race to mass-produce self-driving cars. While Google took an early lead in autonomous driving with a program since spun off into its own company, called Waymo, GM’s manufacturing capabilities and other advantages have allowed it to catch up, according to a January report by market-research firm Navigant Research.
Autonomous-vehicle firms competing for success in a new market are bound to face legal thickets when accidents happen.
In the San Francisco accident, GM’s crash report said the Bolt was traveling at 12 mph, while Nilsson had been riding at 17 mph. After the collision, Nilsson “got up and walked his vehicle to the side of the road” and “reported shoulder pain and was taken to receive medical care.”
Nilsson claimed in his lawsuit that he suffered neck and shoulder injuries, which will require “lengthy treatment,” and that he had to go on disability leave. He’s seeking unspecified damages.
In California, autonomous vehicle test drivers must have good driving records and successfully complete a test-driver training program administered by the carmaker, DMV spokeswoman Jessica Gonzalez said.
Self-driving test cars could turn up almost anywhere on California’s roads.
“Companies with permits are allowed to test on any California public roadway — they don’t tell us which ones they are testing on,” Gonzalez said.
General Motors and its Cruise subsidiary have had a permit to test autonomous vehicles on California roads since June 2015, and have 110 vehicles and 300 test drivers approved for testing, according to the DMV.
Ethan Baron is a business reporter at The Mercury News, and a native of Silicon Valley before it was Silicon Valley. Baron has worked as a reporter, columnist, editor and photographer in newspapers and magazines for 25 years, covering business, politics, social issues, crime, the environment, outdoor sports, war and humanitarian crises.