
Consumer Watchdog Calls On Elon Musk To Disable Tesla’s Autopilot; Says Company Must Pledge To Be Liable For Self-Driving Failures If Feature Returns


Santa Monica, CA – In the wake of a fatal crash in Florida, Consumer Watchdog called on Tesla Chairman and CEO Elon Musk to disable the company’s “autopilot” feature until it is shown to be safe, and said that if the feature is offered in the future, Tesla must pledge to be liable if something goes wrong with its self-driving technology.

Consumer Watchdog also expressed concern that Tesla was developing a pattern of blaming victims in crashes when the autopilot feature was engaged.

In a letter to Musk, signed by President Jamie Court, Executive Director Carmen Balber and Privacy Project Director John M. Simpson, the public interest group also criticized Tesla’s delay in revealing the fatal Florida accident.

“You made a public acknowledgement on June 30, only when the National Highway Traffic Safety Administration announced it was investigating,” the letter said. “Such a delay when your customers remained on public highways relying on autopilot and believing it to be safe is inexcusable.”

Read the letter here: http://www.consumerwatchdog.org/resources/ltrmusk070716.pdf

Consumer Watchdog said Tesla is trying to have it both ways when referring to the autopilot feature:

“On the one hand you extoll the supposed virtues of autopilot, creating the impression that, once engaged, it is self-sufficient.  Your customers are lulled into believing their car is safe to be left to do the driving itself. On the other hand you walk back any promise of safety, saying autopilot is still in Beta mode and drivers must pay attention all the time. The result of this disconnect between marketing hype and reality was the fatal crash in Florida, as well as other non-fatal crashes that have now come to light.”

“An autopilot whose sensors cannot distinguish between the side of a white truck and a bright sky simply is not ready to be deployed on public roads,” the letter said. “Tesla should immediately disable the autopilot feature on all your vehicles until it can be proven to be safe. At a minimum, autopilot must be disabled until the complete results of NHTSA’s investigation are released.”

One of the most troubling aspects of Tesla’s deployment of autopilot, Consumer Watchdog said, is the decision to make the feature available in so-called Beta mode, an admission that it is not ready for prime time.

“Beta installations can make sense with software such as an email service; they are unconscionable with systems that can prove fatal when something goes wrong,” the letter said.  “You are in effect using your customers as human guinea pigs.”

Consumer Watchdog said that if autopilot is proven safe to deploy, Tesla must assume liability for any crashes that occur when the feature is engaged. “Are you willing to make that pledge?” the letter asked. “In response to the tragic fatal crash Tesla said in a blog, ‘We would like to extend our deepest sympathies to his family and friends.’ Do you accept responsibility for the crash?”

Both Volvo and Mercedes have said they will accept liability when their self-driving technology is responsible for a crash.

A July 6 Tesla blog post discussing the fatal Florida crash underscored the company’s pattern of blaming others. The post said:

“To be clear, this accident was the result of a semi-tractor trailer crossing both lanes of a divided highway in front of an oncoming car. Whether driven under manual or assisted mode, this presented a challenging and unexpected emergency braking scenario for the driver to respond to. In the moments leading up to the collision, there is no evidence to suggest that Autopilot was not operating as designed and as described to users: specifically, as a driver assistance system that maintains a vehicle's position in lane and adjusts the vehicle's speed to match surrounding traffic.”

A July 1 crash in Pennsylvania, in which a Model X rolled over and the occupants fortunately survived, is another example of blaming others, Consumer Watchdog said. Tesla is not willing to assume responsibility when autopilot fails. Tesla’s response to the Pennsylvania crash: “Based on the information we have now, we have no reason to believe that Autopilot had anything to do with this accident.”

And, according to the Wall Street Journal, when a Tesla operating on autopilot crashed into a parked truck in Virginia, the company said the crash “was the result of driver error…. To an attentive driver, it would have been clear that the driver should immediately slow the vehicle to avoid the accident.”

Consumer Watchdog’s letter concluded:

“Tesla is rushing self-driving technologies to the highways prematurely, however, as the crashes demonstrate, autopilot isn’t safe and you should disable it immediately. If autopilot can ultimately be shown to meet safety standards and is then redeployed, you must pledge to be liable if anything goes wrong when the self-driving system is engaged.”

-30-


John M. Simpson
John M. Simpson is an American consumer rights advocate and former journalist. Since 2005, he has worked for Consumer Watchdog, a nonpartisan nonprofit public interest group, as the lead researcher on Inside Google, the group's effort to educate the public about Google's dominance over the internet and the need for greater online privacy.
