Tesla Faces Growing Legal Challenges Over Autopilot Amid Massive Recall

By William Gavin, THE MESSENGER

https://themessenger.com/business/tesla-faces-growing-legal-challenges-over-autopilot-amid-massive-recall

Before Tesla issued a recall covering more than 2 million electric vehicles over software issues, at least a dozen lawsuits had been filed against the automaker in the U.S. over its Autopilot system, according to The Wall Street Journal.


Several of those lawsuits were filed by individuals involved in crashes allegedly linked to Autopilot, while others were brought by the families of people who died in crashes they say were related to the driver-assistance system. Consumers and investors have also sued Tesla over its marketing of Autopilot and Full Self-Driving, alleging the company has engaged in deceptive and misleading practices.

Autopilot is Tesla’s driver-assistance system and comes standard on the company’s new EVs; the technology is designed to help drivers steer and maintain a safe distance from other vehicles on the highway. Full Self-Driving is essentially an upgrade, designed to identify stop signs and traffic lights and to operate on city streets.

Multiple cases, including one over a 2019 fatal crash near Miami involving a Tesla Model 3 equipped with Autopilot, are set to go to trial next year.


So far, Tesla has largely prevailed in legal battles over Autopilot. In April, Tesla won a trial in Los Angeles by emphasizing that, despite its name, Autopilot requires an attentive human driver. In October, the carmaker won its first U.S. trial over allegations that Autopilot contributed to a fatal crash, convincing a jury that the crash was caused by human error, not a manufacturing defect.

Tesla’s marketing and promotion of its driver-assistance software has been criticized for years by federal officials and consumer groups alike.


In promotional materials and court filings, Tesla tells drivers that Autopilot doesn’t make its cars autonomous, meaning a driver must remain in control of the vehicle at all times. However, Tesla CEO Elon Musk has repeatedly said that his company’s vehicles are close to operating independently, and the company has posted videos online showing its cars operating without human intervention.


“Two Americans are dead and one is injured as a result of Tesla deceiving and misleading consumers into believing that the Autopilot feature of its vehicles is safer and more capable than it actually is,” the Center for Auto Safety and Consumer Watchdog wrote to federal regulators in May 2018, following two separate crashes involving Tesla drivers. Both drivers had Autopilot enabled at the time.


The California Department of Motor Vehicles has been investigating Tesla’s Autopilot since May 2021 and recently filed a motion accusing Tesla of misleading customers, according to the Los Angeles Times. Among other examples, the DMV points to the claim that Autopilot “is designed to be able to conduct short and long distance trips with no action required by the person in the driver’s seat.”


Neither Autopilot nor Full Self-Driving is capable of conducting a full trip without driver interaction.


Since 2016, the National Highway Traffic Safety Administration has opened more than three dozen special crash investigations into Tesla vehicles over safety issues, Reuters reported in March. In July, the regulator asked the automaker for information about a then-recent Autopilot update and for fresh data on the software as part of its investigation.


Tesla has also been the subject of a recent investigation by California’s attorney general into false advertising and safety issues, and the Department of Justice requested data from the automaker related to its Autopilot software last year.


In November, Palm Beach County Circuit Court Judge Reid Scott, presiding over the case involving the 2019 fatal crash north of Miami, found “reasonable evidence” that Tesla executives knew the company’s driver-assistance technology was defective, pointing to Musk’s statements about the technology, The Guardian reported. In that case, Stephen Banner, the owner of a Model 3, was killed when Autopilot allegedly failed to brake or avoid a collision with a semitrailer truck.


“It would be reasonable to conclude that the Defendant Tesla through its CEO and engineers was acutely aware of the problem with the ‘Autopilot’ failing to detect cross traffic,” the judge wrote.
