Washington — The U.S. government will not order any recalls or fines as it concludes its investigation of a fatal crash involving a 2015 Tesla Model S that was operating with its automated driving system activated.
The National Highway Traffic Safety Administration said Thursday that it did not find any safety defects in Tesla’s semi-autonomous “Autopilot” system in the car that was involved in the May 7, 2016, crash, which was believed to be the first U.S. death in a vehicle operating with a semi-autonomous driving feature engaged. But the agency warned that with the current level of technology, drivers must be ready to take over in an emergency – and that automakers must make drivers aware of the limitations.
NHTSA said its examination did not identify any defects in the design or performance of the automatic emergency braking or Autopilot systems. It said automatic emergency braking systems used in the automotive industry through model year 2016 are “rear-end collision avoidance technologies that are not designed to reliably perform in all crash modes, including crossing-path collisions.” The agency said in documents that were posted on its website Thursday, “The Autopilot system is an advanced driver-assistance system that requires the continual and full attention of the driver to monitor the traffic environment and be prepared to take action to avoid crashes.”
Tesla CEO Elon Musk rushed to highlight NHTSA’s conclusion that there was no safety defect. “Report highlight: ‘The data show that the Tesla vehicles crash rate dropped by almost 40 percent after Autosteer installation,’” Musk tweeted shortly after the NHTSA announcement.
NHTSA’s decision to spare Tesla the harsh penalties that have been doled out to other automakers for safety violations in recent months brings to a close a months-long investigation that roiled the debate about the future of self-driving cars.
The company faced questions about its Autopilot system after 40-year-old Joshua Brown was killed in the crash in Williston, Florida. His 2015 Model S was operating with the driver-assist system engaged.
Brown’s Tesla collided with a semi-trailer that turned left in front of the car, undetected by the vehicle’s Autopilot feature. Radars on the Tesla car could not distinguish the side of the white truck from the sky. Florida police said the roof of the car struck the underside of the trailer and the car passed beneath. Brown was declared dead at the scene.
“Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied,” Tesla said in a blog posting on June 30 after the NHTSA investigation was launched.
Karl Brauer, executive analyst for Autotrader and Kelley Blue Book, said the road to the autonomous car is going to be messy, with computers gradually increasing their ability to replace humans over the next 10 years: “The Tesla fatality is an example of a human driver assigning too much capability to the car’s sensors and computing power. It almost certainly won’t be the last incident on this journey, but people need to remember one thing – there are currently no fully autonomous cars available for public purchase or use. The advanced driver-assist systems found on many of today’s vehicles still require the full attention of the driver at all times. A technology-driven injury or fatality is possible, and even likely, before we achieve widespread use of self-driving cars, but NHTSA’s ruling suggests that wasn’t the case in this instance.”
Consumer groups seized upon the deadly crash to argue that Tesla is rushing self-driving cars to market, and they urged federal regulators to put the brakes on the company.
John Simpson, privacy project director at the Santa Monica, California-based Consumer Watchdog group, said regulators are “blaming the human” in their finding of no safety defect in Tesla’s Autopilot system.
“NHTSA has wrongly accepted Tesla’s line and blamed the human, rather than the technology and Tesla’s aggressive marketing,” Simpson said in an email. “The very name ‘Autopilot’ creates the impression that a Tesla can drive itself. It can’t. Some people who apparently believed Tesla’s hype got killed. Tesla CEO Elon Musk should be held accountable.”
Tesla, meanwhile, has defended its Autopilot system, and the company has maintained that demand for its vehicles is still strong.
NHTSA spokesman Bryan Thomas said Thursday that the federal probe of the Tesla crash was “thorough.”
The agency’s Office of Defects Investigation said the fatal Florida crash appears to involve “a period of extended distraction” that lasted at least 7 seconds.
“An attentive driver has superior situational awareness in most of these types of events, particularly when coupled with the ability of an experienced driver to anticipate the actions of other drivers,” the report continued. “Tesla has changed its driver monitoring strategy to promote driver attention to the driving environment.”
Thomas noted Thursday that Tesla updated its Autopilot software in September 2016 to give drivers more frequent warnings and institute a “three-strikes” rule that disables the Autopilot feature after a driver fails three times to be responsive to distraction warnings.
Thomas said Tesla’s upgrades were not related to the federal probe, although he said regulators were happy with the outcome. “There can be continuing safety improvements, and we always encourage that,” he said. “But that doesn’t mean we found a defect on May 7.”
Thomas added that federal regulators hope auto companies will continue to give drivers adequate warnings about the limits of semi-self-driving features like Tesla’s Autopilot.
“Generally speaking, our view is that the manual should be clear about the limitations of the vehicle,” he said. “But we strongly believe it’s not enough to put it in the manual and hope drivers are going to read it.”