The groups say the name ‘Autopilot’ creates confusion among consumers
By Sarah D. Young, CONSUMER AFFAIRS
July 29, 2019
The Center for Auto Safety and Consumer Watchdog have once again come together to lobby for a federal investigation into how Tesla markets its Autopilot driver-assistance system.
Last year, the consumer groups called on the Department of Motor Vehicles (DMV) and Federal Trade Commission (FTC) to investigate the matter, calling Tesla’s marketing of its semi-autonomous driving software “deceptive and misleading.”
Since then, “another person has died, others have been injured, and many more have acted recklessly as a result of Tesla giving owners the perception that a Tesla with ‘Autopilot’ is an autonomous vehicle capable of self-driving,” the groups said in a statement. “To be clear, it is not.”
The groups argue that Tesla’s representations of its Autopilot feature “continue to violate Section 5 of the FTC Act, as well as similar state statutes.”
“It is time for regulators to step in and put a stop to Tesla’s ongoing autopilot deception,” said Adam Scow, Senior Advocate for Consumer Watchdog. “Tesla has irresponsibly marketed its technology as safety enhancing.”
Misleading to consumers
For its part, Tesla has been clear that drivers should never take their hands off the wheel while Autopilot is engaged. However, a recent study found that many consumers still believe the feature equips the car with self-driving or autonomous capabilities.
“The name ‘Autopilot’ was associated with the highest likelihood that drivers believed a behavior was safe while in operation, for every behavior measured, compared with other system names,” the Insurance Institute for Highway Safety (IIHS) wrote.
“Autopilot also had substantially greater proportions of people who thought it would be safe to look at scenery, read a book, talk on a cell phone or text,” IIHS noted. “Six percent thought it would be OK to take a nap while using Autopilot, compared with 3 percent for the other systems.”
A factor in crashes
In March, a Tesla owner crashed his Model 3 into a semi truck. Investigators later found that the driver had turned on Autopilot about 10 seconds before the crash. The driver’s hands weren’t on the wheel and no “evasive maneuvers” had been executed prior to the collision.
The incident was “almost identical” to a fatal crash that occurred in 2016, the groups noted. “In both instances, an overreliance on features that were deceptively described as an ‘Autopilot’ directly contributed to their deaths.”
Autopilot has also been a factor in at least two other deadly crashes. In 2018, Apple engineer Wei “Walter” Huang died when his Model X crashed with Autopilot activated; his hands weren’t detected on the wheel for six seconds prior to the collision. In January 2016, Gao Yaning was killed in Handan, China, in a collision involving a Tesla with Autopilot engaged.
The consumer groups pushing for an investigation have accused Tesla of “making their owners believe that a Tesla with ‘Autopilot’ is an autonomous vehicle capable of self-driving.”
Sarah D. Young has been a columnist for a blog aimed at Millennials, has worked in early childhood education, and has been a reading tutor to at-risk youth.