Witness Says Self-Driving Uber Ran Red Light On Its Own, Disputing Uber’s Claims

Company insists traffic violations in San Francisco are the result of ‘human error’ by drivers who can take control if needed, but a witness’s account contradicts this

An autonomous Uber malfunctioned while in “self-driving mode” and caused a near collision in San Francisco, according to a business owner whose account raises new safety concerns about the unregulated technology launch.

The self-driving car – which Uber introduced without permits, as part of a testing program that California has deemed illegal – accelerated into an intersection while the light was still red and while the automation technology was clearly controlling the car, said Christopher Koff, owner of local cafe AK Subs.

“It looked like the car ran the red light on its own,” Koff, 49, said of the self-driving Uber Volvo, which has a driver in the front seat who can take control when needed. Another car that had the green light had to “slam the brakes” to avoid a crash, he said.

Koff’s story, which advocacy group Consumer Watchdog shared with state officials on Tuesday, directly contradicts Uber’s public claims that red-light violations have been the result of “human error” and that the drivers, not the technology, have failed to follow traffic laws.

The new allegations – which Uber denied and which concern an incident three weeks ago – have come to light days after the corporation openly refused to adhere to California regulations, claiming that its defiance of the government was an “issue of principle”.

Uber’s autonomous cars were first spotted on San Francisco streets in September, but the company formally launched a pilot program for riders last week. California officials have repeatedly said the ride-sharing corporation, which is headquartered in San Francisco, needs testing permits, noting that 20 other companies have followed the permitting protocols.

But Uber has ignored attorney general Kamala Harris’s threat of legal action, claiming it does not need permits since the vehicles have drivers monitoring them, and citing the cars’ “state-of-the-art” technology and “core safety capabilities”.

Koff’s account, however, suggests that the products may not be ready for the road and that safety mechanisms are insufficient.

It was around 5am local time, and Koff said he was standing 10ft from the vehicle when he saw it stopped at a light. While the driver was talking to a passenger, who had a laptop out, the car suddenly drove forward through the red light, according to Koff. The driver’s hands were not on the wheel, he added.

“He was not driving. It was in self-driving mode,” said Koff. He noted that it was foggy at the time and that construction trucks nearby were shining yellow lights that might have interfered with the technology.

It would not be the first time the computer in a self-driving vehicle made a basic error with potentially life-threatening consequences.

In May, the “autopilot sensors” on a Tesla Motors car failed to distinguish a white tractor-trailer crossing the highway against a bright sky, leading to the first known death caused by a self-driving car.

Uber also admitted to the Guardian on Monday that its San Francisco cars have a “problem” with the way they cross bike lanes, and the company’s self-driving cars in Pittsburgh have reportedly collided with other cars and driven the wrong way on a street.

Spokeswoman Chelsea Kohler declined to provide details about Koff’s claims and sent the Guardian a statement identical to the one she provided last week, citing “human error”, adding: “This is why we believe so much in making the roads safer by building self-driving Ubers.”

Kohler did not respond to questions about how the company knows the driver was at fault and whether he faced consequences. Last week, she said two drivers had been suspended after the self-driving vehicles had been recorded running red lights.

Critics have argued that regardless of whether violations occur in self-driving mode or while a human is in control, Uber needs to be responsible for dangers posed by its cars – and should be embracing regulators, not shunning them.

“Someone could be hurt or maimed or paralyzed for the rest of their life because we’re trying to rush something out there,” said Koff, noting that he also recently saw a driver in an autonomous Uber scramble to take control when it was trying to navigate around a nearby bus and an approaching ambulance.

John M Simpson, privacy project director for Consumer Watchdog, who filed a report based on Koff’s account, said he suspects Uber does not want to follow regulations that would require it to disclose details about errors to the government.

“Being able to understand the traffic signal and respond appropriately is a key requirement of any so-called self-driving technology,” said Simpson, who has called for criminal charges against Uber. “It obviously failed that test.”
