California disengagement reports detail the reasons safety drivers needed to take over.

By Keith Shaw, ROBOTICS BUSINESS REVIEW

February 13, 2019

https://www.roboticsbusinessreview.com/unmanned/consumer-group-says-sel…

In the same week in which self-driving companies received some major investments, a consumer advocacy group said autonomous vehicle technology “is not ready to operate without a human who can take control of the car” on public roads.

Consumer Watchdog based its statements on new reports required by California’s Department of Motor Vehicles from companies that test their autonomous vehicles on the state’s public roads. The so-called disengagement reports show how many times a human driver needed to take control of the car during public-road testing. While companies such as Waymo and GM’s Cruise division were able to drive thousands of miles before intervention was needed, companies like Mercedes-Benz and Uber “needed human intervention at least once per every mile driven,” the group said.

“Despite all of the hype and promises, these reports show that robot cars aren’t safe without human drivers ready to take over,” said Adam Scow, a senior advocate for Consumer Watchdog. “While some companies are gradually improving, others are crawling out of the gates. Much more testing and improvement is needed before regulators can consider approving driverless cars for our roads.”

Miles driven show improvement

In 2018, self-driving cars were tested for more than 2 million miles on California public roads, a big increase from the roughly 500,000 miles driven in 2017. Waymo, formerly Google’s autonomous vehicle unit, logged the majority of those miles, with approximately 1.25 million. It reported that a test driver took control 76 times, or once every 16,447 miles. That rate is significantly better than in 2017, when Waymo’s cars drove 352,544 miles on California’s roads and reported 63 disengagements, or one every 5,596 miles. Last October, Waymo became the only company to receive a permit to test without a human driver in the vehicle.

In 2018, General Motors’ Cruise division, which previously claimed it would put robot cars on the road in 2019, drove 447,621 miles in San Francisco and had 86 human interventions, or one every 5,205 miles.

Reports from Uber and Mercedes-Benz showed much higher rates of intervention, the advocacy group said. Uber reported 70,165 interventions over just 26,899 autonomous miles tested, or roughly 2.6 human interventions per mile driven. Mercedes reported 1,194 interventions over only 1,749 miles tested, or about one intervention every 1.46 miles driven.
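The per-mile figures above are simple ratios of reported disengagements to autonomous miles driven. As a sketch, the rates quoted in this article can be recomputed from the reported numbers (the figures come from the article; the function names are our own):

```python
# Recompute the disengagement rates quoted above from the reported figures.
# The numbers come from the article; the helper names are illustrative.

def miles_per_disengagement(miles: float, disengagements: int) -> float:
    """Average autonomous miles driven between human takeovers."""
    return miles / disengagements

def disengagements_per_mile(miles: float, disengagements: int) -> float:
    """Average human takeovers per autonomous mile driven."""
    return disengagements / miles

# (company, 2018 autonomous miles, 2018 disengagements), per the article
reports_2018 = [
    ("Waymo", 1_250_000, 76),
    ("Cruise", 447_621, 86),
    ("Uber", 26_899, 70_165),
    ("Mercedes-Benz", 1_749, 1_194),
]

for company, miles, count in reports_2018:
    print(f"{company}: one disengagement every "
          f"{miles_per_disengagement(miles, count):,.2f} miles")
```

Note that the same ratio can be read in either direction: Mercedes-Benz’s one intervention every 1.46 miles is equivalent to about 0.68 interventions per mile.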

Details about the interventions cite precautionary takeovers, localization errors, and software and perception problems arising from a variety of scenarios.

Precaution, planning, mapping

Most of the reasons given for the disengagements were precautionary or part of planned testing. For example, descriptions of the reasons for disengagement included:

  • “Safety Driver took control to ensure proper distance between parked vehicle partially blocking lane of travel.”
  • “Automatic disengagement caused by perception fault.”
  • “Safety Driver decided extra braking should be applied in order to maintain gap in front of car.”
  • “Disengage for a recklessly behaving road user.”
  • “Disengage for incorrect behavior prediction of other traffic participants.”
  • “Disengage for a perception discrepancy for which a component of the vehicle’s perception system failed to detect an object correctly.”
  • “Disengage for adverse weather conditions experienced during testing.”
  • “Other road user behaving poorly.”
  • “Precautionary takeover to address planning.”
  • “The vehicle’s estimate of location drifted out of the current travel lane while stopped. When resuming motion, it attempted to correct for this, turning into an incorrect lane. The test driver took control.”
  • “The map geometry for a merge lane was incorrect, causing the vehicle to temporarily plan to enter the incorrect lane. The test driver took control upon observing the plan.”
  • “Disengage due to operator discomfort.”
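The free-text reasons above cluster roughly into the categories the subheading names. As a hypothetical sketch (the keyword lists are our own illustrative guesses, not taken from the DMV reports), descriptions like these could be bucketed as follows:

```python
# Hypothetical sketch: bucket the free-text disengagement descriptions above
# into the broad categories the article names. The keyword lists are
# illustrative guesses, not taken from the DMV reports.

CATEGORIES = {
    "perception": ("perception", "detect"),
    "mapping": ("map", "location"),
    "planning": ("planning", "plan", "prediction"),
    "precaution": ("precaution", "safety driver", "discomfort", "braking"),
}

def categorize(description: str) -> str:
    """Return the first category whose keywords match, else 'other'."""
    text = description.lower()
    for category, keywords in CATEGORIES.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "other"

print(categorize("Automatic disengagement caused by perception fault."))
print(categorize("Disengage for adverse weather conditions experienced during testing."))
```

Real reports mix free text with company-specific phrasing, so any such classifier would need tuning per company; the point is only that the published reasons are coarse categories rather than standardized data.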

In addition to human interventions, state reports showed an increase in the number of crashes involving the robot cars, which were reported to the DMV and posted on its website. Companies reported 75 collisions in 2018, up from 29 in 2017. Cruise alone reported 22 crashes in 2017 and 36 in 2018.

While 62 companies are licensed to test autonomous vehicles in California, only those that actually tested on public roads reported disengagement numbers for 2018. Tesla said it tests on public roads around the world, but did not report any tests in California.

Consumer Watchdog praised the Department of Motor Vehicles for requiring and posting the disengagement reports and the crash reports. Other states where testing is being done, including Arizona, Washington, Michigan and Pennsylvania, have no such disclosure requirement.

“Besides the occasional tragedy, the public is in the dark about what’s happening in other states,” Scow said. “It’s only because of California’s rules that the public can find out what’s happening when companies use public roads as their private laboratories. The next step is to require that companies testing robot cars that are involved in a crash should be required to make public video and technical data about the incident.”

The California DMV posts the 2018 disengagement reports, broken out by company, on its website.

Keith Shaw is the Editor-in-chief for Robotics Business Review. Prior to joining EH Media, he worked as an editor for Network World, Computerworld and various newspapers across Massachusetts, New York, and Florida. He holds a degree in journalism from Syracuse University.