By Keith Barry, CONSUMER REPORTS
August 18, 2021
Two U.S. senators have called for the Federal Trade Commission to open an investigation into whether Tesla has engaged in deceptive marketing practices regarding the capabilities of its Autopilot and Full Self-Driving (FSD) driver assistance systems.
In a letter to FTC Chair Lina Khan, Sens. Richard Blumenthal, D-Conn., and Edward Markey, D-Mass., said they “fear that Tesla’s Autopilot and FSD features are not as mature and reliable as the company pitches to the public,” and that “it is clear that drivers take Tesla’s statements about their vehicles’ capabilities at face value and suffer grave consequences.” According to the letter, at least 11 people in the U.S. have died while driving Teslas with Autopilot activated since the feature became available in 2015.
The letter follows multiple requests, dating back to 2018, from safety groups and other government agencies for the FTC to examine Tesla’s marketing claims. Despite their names, neither Autopilot nor FSD makes a Tesla fully autonomous. Instead, the features are designed to take over certain steering, braking, and acceleration functions, and they require an attentive driver to remain in control of the vehicle at all times.
The FTC is made up of five commissioners, one of whom is chair. The agency is responsible for consumer protection, including preventing deceptive marketing practices, stopping fraud, and enforcing antitrust laws.
To support their request, Blumenthal and Markey provided a list of examples, drawn from Tesla marketing materials and public statements by CEO Elon Musk, in which they say Tesla overstated what its vehicles can do or will soon be able to do. These include a 2019 video posted on Tesla’s YouTube channel titled “Full Self-Driving” that shows a Tesla driving without human intervention, and multiple statements from Musk promising that Tesla vehicles will soon be fully autonomous.
An FTC representative told CR that the agency has received the senators’ letter but had no further comment. Tesla did not respond to CR’s request for comment.
Blumenthal and Markey’s request may fall on more sympathetic ears this time, says William Wallace, manager of safety policy at Consumer Reports.
“The FTC’s current leaders are especially outspoken about consumer protection,” Wallace says, noting that Khan was sworn in as chair in June and has signaled that the agency will step up its enforcement activity. “It’s well-documented that these systems can lead to overreliance and put people at risk—especially in the absence of safeguards to make sure the driver is looking at the road.”
Autopilot can automate some steering, braking, and acceleration tasks. FSD includes features that can assist the driver with parking, changing lanes on the highway, and coming to a complete stop at traffic lights and stop signs. FSD software is currently in a beta, or trial, release. CR has tested versions of both technologies and found that they are far from self-driving and still require human intervention. Although Autopilot outperformed most similar systems from other manufacturers at keeping a car in its lane and matching the speed of surrounding traffic, we found that GM’s Super Cruise did a better job of ensuring that drivers stayed engaged during use. FSD’s advertised features, on the other hand, did not work reliably in our evaluations.
Jason Levine, executive director of the Center for Auto Safety, a nonprofit consumer advocacy organization, told CR in an e-mail that he hopes the FTC takes action to “investigate dangerous ongoing consumer deception by Tesla.”
In 2018, the Center for Auto Safety, along with Consumer Watchdog, another safety advocacy group, asked the FTC to investigate Tesla’s marketing practices, arguing that the way Tesla advertised Autopilot was “likely to mislead consumers into reasonably believing that their vehicles have self-driving or autonomous capabilities.” Consumer Reports has also pressed the FTC to take enforcement action against any car company that markets a driving system as autonomous when it actually requires supervision and engagement by a human driver.
In 2019, the National Highway Traffic Safety Administration asked the FTC to investigate claims Tesla had made in blog posts about the Model 3’s safety in a crash, according to documents made public by PlainSite, a legal documentation advocacy group. In Germany, Tesla has been banned from making certain promises about Autopilot in its advertising materials since a 2020 court ruling.
Earlier this week, NHTSA opened an investigation into 11 confirmed crashes, resulting in 17 injuries and one death, in which Tesla vehicles struck parked emergency vehicles after Autopilot had been active.
In May, the California Department of Motor Vehicles put Tesla “under review” over public statements that may violate state regulations, which prohibit automakers from advertising a vehicle for sale or lease as autonomous unless it meets the statutory and regulatory definition of an autonomous vehicle and the company holds a deployment permit.
As of May 2021, NHTSA had initiated 28 special investigations into crashes involving Tesla vehicles with advanced driver assistance systems, including Autopilot and Full Self-Driving Capability.