Look What’s Making Self-Driving Cars Freak Out


New report details all the triggers that have caused the human behind the wheel to take over

By Patrick May, BAY AREA NEWS GROUP

February 27, 2018


Ready . . . set . . . drive (driverlessly)!

It’s almost here: drive time with nobody behind the wheel. That will soon become legal on California roads, first for testing and eventually for transporting the public commercially.

And the Golden State’s brave new foray into autonomous vehicles starts in just 34 days.

Here’s what’s different this time: Driverless cars have been allowed on public roads for testing purposes since the fall of 2014, but rules required that a human safety monitor be seated behind the wheel, just in case.

Starting April 2, that human will be at a remote location, not physically in the driver’s seat but able to virtually take over the steering when and if things go south. At some point in the future, that requirement is expected to go away.

Not everyone’s thrilled with this fast-moving trend to test out driverless cars on our Bay Area roadways. As my colleague Ethan Baron points out, California’s move was immediately attacked by Consumer Watchdog, which said the “disengagement reports” companies file with the DMV when human backup drivers have to take over show the technology isn’t ready for remote control.

Operation of the vehicles from afar would transform the testing of autonomous cars into “a deadly video game that threatens highway safety,” the consumer advocacy group said.

So what do these “disengagement reports” show? We decided to have a look.

First, from Car and Driver magazine, a caveat about these statistics that purportedly show how many close calls various car companies have had with their autonomous fleets. As writer Pete Bigelow explained in a story earlier this month, “the annual reports on autonomous testing in California required by the state’s Department of Motor Vehicles are far from a perfect measure of any company’s self-driving competence.”

While they do provide a few new details on what led a concerned human operator to shut off the self-driving system, the reports paint a somewhat flawed picture when used to compare one company’s track record with another’s.

The problem, says Bigelow, is that it’s apples and oranges: “One company’s low number of disengagements may occur during testing on empty highways, while another company’s high number may have occurred during testing in busy urban areas.”

Also, comparisons are misleading “because some companies place more value on testing in real-world scenarios while others put more emphasis on simulation, and sometimes engineers might be purposely disengaging to validate their systems.”

Here’s a look at the latest disengagement reports, which cover late 2016 through late 2017. There are 20 companies listed, though some did not offer any new data. Honda, for example, did not test any vehicles on California roads in 2017, so obviously it had no problems to report.

The reports are difficult to parse for several reasons. Different companies are testing different models of autonomous vehicles over different time periods, so it’s hard to compare one outfit with another. Baidu, for example, tested four models. It had a total of 48 disengagements during the 1,971 miles those four cars drove between October 2016 and November 2017.

But compare that with, say, Alphabet’s Waymo, which reported 63 incidents over the same time period. While Baidu’s cars logged 1,971 miles, Waymo’s vehicles covered a whopping 352,545 miles in autonomous mode. How in the world can you accurately compare these two operators’ performance?
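One rough way to put the two operators on the same scale is miles per disengagement, the same metric Drive.ai cites later in this story. Here is a minimal Python sketch using only the figures above (the numbers come from the reports; the calculation is a common yardstick, not anything the DMV prescribes):

    # Miles per disengagement (MPD) implied by the figures cited above.
    # The DMV does not prescribe this metric; it is one common yardstick.
    reports = {
        "Baidu": {"miles": 1_971, "disengagements": 48},
        "Waymo": {"miles": 352_545, "disengagements": 63},
    }

    for company, r in reports.items():
        mpd = r["miles"] / r["disengagements"]
        print(f"{company}: {mpd:,.0f} autonomous miles per disengagement")

    # Baidu: 41 autonomous miles per disengagement
    # Waymo: 5,596 autonomous miles per disengagement

Even normalized this way, the ratio says nothing about where those miles were driven, which is exactly Bigelow’s caveat.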

For the casual reader, the reports are perhaps more interesting when you look instead at the reasons behind the disengagements. These details offer an intriguing look at this cutting-edge technology, and at the sorts of challenges the engineers behind it are facing.

Consider these recurring issues experienced by many of these companies’ vehicles:

  • Disengage for a recklessly behaving road user
  • Disengage for hardware discrepancy
  • Disengage for unwanted maneuver of the vehicle
  • Disengage for a perception discrepancy
  • Disengage for incorrect behavior prediction of other traffic participants
  • Heavy pedestrian traffic
  • Cyclist
  • Traffic light detection
  • Construction
  • Localization divergence
  • Poor lane markings
  • Vehicle cut in
  • Cyclist riding from sidewalk into crosswalk
  • Vehicle in cross-traffic ran red light
  • Software crash
  • Unexpected steering due to path change
  • Strong unexpected braking

Finally, the reports provide a window into how the companies are learning from their “disengagements”:

Waymo: “To help evaluate the safety significance of disengagements, Waymo employs a powerful simulator program. In Waymo’s simulation, our team can ‘replay’ each incident and predict the behavior of our self-driving car if the driver had not taken control of it, as well as the behavior and positions of other road users in the vicinity (such as pedestrians, cyclists, and other vehicles). Our engineers use this data to refine and improve the software to ensure the self-driving car performs safely.”

Drive.ai gave itself a stellar report card when reporting on its disengagements:

  • As of 30 November 2017, we have seven vehicles licensed for autonomous operation in California.
  • Total autonomous miles: 6,572. We drove 557 autonomous miles in 2016 and 6,015 autonomous miles in 2017.
  • Total disengagements: We had a total of 151 disengagements where either a failure of the autonomous technology was detected or safe operation of the vehicle required that the test driver take manual control.
  • Miles driven per disengagement (MPD): We have improved from 3 MPD in August 2016 to 110 MPD in November 2017.
  • Most importantly, we are pleased to report that we completed our first reporting period without any collisions or safety incidents of any kind.
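Setting aside the month-to-month improvement, Drive.ai’s own totals imply an average of roughly 44 miles per disengagement over the full period. The arithmetic, in a quick Python sketch that uses only the figures quoted above:

    # Average MPD implied by Drive.ai's quoted totals.
    total_miles = 557 + 6_015        # 2016 + 2017 autonomous miles = 6,572
    total_disengagements = 151
    print(f"Average MPD: {total_miles / total_disengagements:.1f}")  # ~43.5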

GM Cruise remarked in its report to regulators that it was using San Francisco as a test lab for good reason:

“Last year we drove over 125,000 miles on San Francisco’s complex city streets,” wrote Albert Boniske, GM Cruise’s Director of Product Integrity. “All the attached data is from this urban driving. We drive in San Francisco because it allows us to improve more quickly. Cities like San Francisco contain significantly more people, cars, and cyclists that our self-driving vehicles must be aware of at any given time.

“That makes San Francisco one of the hardest places to test a self-driving vehicle, and creates a rich environment for testing our object detection, prediction, and response functions. It also helps us validate our vehicles’ self-driving skills faster than testing in a suburban location alone. So, we drive here because by doing so we get better faster.”

Zoox, a robotics company based in Menlo Park, also talked about San Francisco in glowing terms when it comes to putting autonomous vehicles through their paces. (Its CEO, Tim Kentley-Klay, has referred to the city as a “black diamond” ski slope for such things.) In its report, Zoox also offered a self-serving marketing pitch while providing some very cool details on the number of bikes and people its vehicles have come across during the past year or so.

“In San Francisco, every year about 30 people lose their lives and over 200 more are seriously injured while traveling on city streets, as a result of these human factors. Autonomous mobility offers an opportunity to save lives and prevent injuries and crashes on our roadways.

“Cities are complex and dynamic. Zoox is creating an autonomous system with novel vehicles designed to safely share the road with pedestrians, cyclists, public transit, emergency vehicles, and other road users.

“On the streets of San Francisco, our test system often sees more in 100 feet than a vehicle might experience over 100 miles on a freeway. During a recent 30-minute drive in downtown San Francisco, our system detected 503 pedestrians, 188 bicyclists and 2,741 cars.”
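Taken at face value, those 30-minute figures work out to a striking detection rate. A quick back-of-the-envelope check in Python, using only Zoox’s own counts:

    # Detection rate implied by Zoox's 30-minute downtown drive figures.
    pedestrians, bicyclists, cars = 503, 188, 2_741
    total = pedestrians + bicyclists + cars      # 3,432 road users
    print(f"{total / 30:.0f} detections per minute")  # ~114

That is roughly 114 road users detected per minute, which helps explain why the company treats the city as an accelerated test course.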

Finally, NVIDIA in its report included this rather reassuring detail:

“The average period of time for the driver to assume manual control of the vehicle was (less than) one second.”
