Self-Driving Cars Raise Legal Issues

Crash in San Francisco has led motorcyclist to sue General Motors. Collisions with standard vehicles called inevitable

By Ethan Baron, DAYTON DAILY NEWS (OHIO)

March 9, 2018

Mary Barra, chairman and CEO of General Motors, talks with workers and members of the media at the Orion Assembly Plant in Orion Township, Mich., last summer, announcing that the company completed production of 130 Chevrolet Bolt EV test vehicles.

General Motors is in a race to be the first company to mass-produce self-driving cars, but a recent crash involving a San Francisco motorcyclist has illustrated the challenge of assigning blame when an autonomous vehicle gets in an accident.

As self-driving cars take to the roads in increasing numbers, collisions with standard vehicles are inevitable, experts say, as are lawsuits.

San Francisco commercial photographer Oscar Nilsson sued GM after a Dec. 7 collision with a Chevrolet Bolt that aborted a lane change while driving autonomously.

The crash highlights an important issue raised by autonomous technology: Self-driving vehicles may not behave like those driven by humans, and that may complicate investigations into who’s at fault.

“That’s going to continue to be a huge area where we’re going to have problems,” said John Simpson, spokesman for nonprofit Consumer Watchdog, a frequent critic of speedy deployment of autonomous vehicles.

GM’s subsidiary, Cruise, has since August been testing a self-driving car service in San Francisco with human backup drivers behind the wheel, as required by the state.

Nilsson’s lawsuit claims he was riding on his motorcycle behind one of GM’s autonomous Bolts on Oak Street when the car, with its backup driver, changed lanes to the left. When he rode forward, the Bolt suddenly veered back into his lane and knocked him to the ground, according to the lawsuit, filed in U.S. District Court in San Francisco.

The San Francisco Police Department’s report on the incident blamed Nilsson for passing a vehicle on the right when it wasn’t safe, but Nilsson’s lawyer, Sergei Lemberg, disputed that finding.

“I don’t know what a police officer can tell, after the fact,” Lemberg said. “I don’t know that it’s fair to blame this act on the completely innocent person who’s just driving down the road and gets hit.”

The police report, said Lemberg, supported holding GM responsible. It noted that after the Bolt determined it couldn’t make the lane change, and began moving back while Nilsson was passing on the right, the Bolt’s backup driver tried to grab the wheel and steer away, but the collision occurred at the same moment.

“Why don’t these folks just take some responsibility?” Lemberg said.

A crash report filed with the California Department of Motor Vehicles by GM provided a much different view of the accident. The company acknowledged that the car, in autonomous-driving mode in heavy traffic, had aborted a lane change. But GM said that as its car was “re-centering itself” in the lane, Nilsson, who had been riding between two lanes in a legal-in-California practice known as lane-splitting, “moved into the center lane, glanced the side of the Cruise … wobbled, and fell over.”

In an email statement, GM noted that the police report concluded Nilsson was responsible for the accident.

“Safety is our primary focus when it comes to developing and testing our self-driving technology,” GM said.

The company has been running a “Cruise Anywhere” program since August for employees, which allows them to hail automated Cruise vehicles and be driven anywhere in San Francisco. It was unclear whether the vehicle involved in the accident was part of this program.

It was also unclear if the Bolt in question was one of the “third-generation” automated vehicles described last fall by Cruise CEO Kyle Vogt as “the world’s first mass-producible car designed to operate without a driver.” Those vehicles were intended to be used in the “Cruise Anywhere” program, Vogt wrote in a Medium post.

Companies operating autonomous vehicles are likely to settle quickly in crash-related lawsuits when the technology appears to be at fault, and fight mightily when they believe the driver of the ordinary vehicle to be responsible, said Stanford researcher and University of South Carolina School of Law professor Bryant Walker Smith.

“There might be data that might tend to show fault or no fault,” Smith said.

Such data – which may include video and other driving information from the autonomous vehicle – could help investigators. Simpson of Consumer Watchdog said it should be publicly disclosed whenever a self-driving car crashes.

GM’s testing in San Francisco highlights the firm’s progress in the race to mass-produce autonomous cars. While Google took an early lead in autonomous driving with a program now spun off into its own company, called Waymo, GM’s manufacturing capabilities and other advantages have allowed it to catch up, according to a report by market-research firm Navigant Research in January.

Autonomous-vehicle firms competing for success in a new market are bound to face legal thickets when accidents happen.

In the San Francisco accident, GM’s crash report said the Bolt was traveling at 12 mph, while Nilsson had been driving at 17 mph. After the collision, Nilsson “got up and walked his vehicle to the side of the road” and “reported shoulder pain and was taken to receive medical care.”

Nilsson claimed in his lawsuit that he suffered neck and shoulder injuries, which will require “lengthy treatment,” and that he had to go on disability leave. He’s seeking unspecified damages.
