U.S. auto safety regulators said Friday that their investigation into Tesla’s Autopilot had identified at least 13 fatal crashes in which the feature had been involved. The investigation also found that the electric car maker’s claims did not match reality.
The National Highway Traffic Safety Administration (NHTSA) revealed on Friday that during its three-year Autopilot safety investigation, which began in August 2021, it identified at least 13 Tesla crashes involving one or more deaths, and many more with serious injuries, in which “the driver’s foreseeable misuse of the system played an apparent role.”
It also found evidence that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operational capabilities,” resulting in a “critical safety gap.”
NHTSA also expressed concern that Tesla’s Autopilot name “may lead drivers to believe that automation has greater capabilities than it does and invite drivers to over-rely on automation.”
Tesla said in December that the largest recall in its history, covering 2.03 million U.S. vehicles (nearly all of its vehicles on U.S. roads), was aimed at ensuring drivers pay attention when using its advanced driver assistance system.
After closing the first investigation, regulators opened a second one, this time to determine whether the recall that installed new Autopilot safeguards was adequate.
NHTSA said it opened the second investigation after identifying concerns stemming from crashes that occurred after vehicles had received the recall software update, "and the results of NHTSA's preliminary testing of the repaired vehicles."
That recall investigation covers Model Y, X, S, 3 and Cybertruck vehicles in the U.S. equipped with Autopilot.
The agency said Tesla has issued software updates to address issues that appear related to its concerns, but has not made them "a part of the recall nor otherwise determined to remedy a defect that poses an unreasonable safety risk." NHTSA also cited Tesla's statement "that a portion of the remedy both requires the owner to opt in and allows a driver to readily reverse it."
Tesla said in December that Autopilot software system controls “may not be sufficient to prevent driver misuse” and could increase the risk of a crash.
Tesla did not immediately respond to a request for comment.
In February, Consumer Reports, a nonprofit that evaluates products and services, said its testing of Tesla’s Autopilot recall update found that the changes did not adequately address many safety concerns raised by NHTSA and urged the agency to require the automaker to take “stronger action,” stating that Tesla’s recall “addresses minor inconveniences rather than fixing real problems.”
Tesla’s Autopilot is intended to enable cars to steer, accelerate and brake automatically within their lane, while Enhanced Autopilot can assist with lane changes on highways; neither makes vehicles autonomous.
One component of Autopilot is Autosteer, which maintains a set speed or following distance and works to keep a vehicle in its driving lane.
Tesla said in December that it disagreed with NHTSA’s analysis but would deploy an over-the-air software update to “incorporate additional controls and alerts to those already in place in affected vehicles to further encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged.”
NHTSA’s top official at the time, Ann Carlson, said in December that the investigation determined more needed to be done to ensure drivers stay engaged when Autopilot is in use. “One of the things we determined is that drivers don’t always pay attention when that system is activated,” Carlson said.
NHTSA opened its investigation into Autopilot in August 2021 after identifying more than a dozen crashes in which Tesla vehicles collided with stationary emergency vehicles.
In addition, since 2016 NHTSA has opened more than 40 special crash investigations involving Tesla vehicles in cases where driver-assistance systems such as Autopilot were suspected of being in use; 23 crash deaths have been reported to date.
Tesla’s recall includes increased emphasis on visual alerts, the disabling of Autosteer if drivers do not respond to inattentiveness warnings, and additional checks when engaging Autosteer. Tesla said it would restrict Autopilot use for one week if significant misuse is detected.
Tesla revealed in October that the U.S. Department of Justice had issued subpoenas related to its Full Self-Driving (FSD) and Autopilot features. Reuters reported in October 2022 that Tesla was under criminal investigation.
In February 2023, Tesla recalled 362,000 U.S. vehicles to update its FSD software after NHTSA said the vehicles did not adequately comply with traffic safety laws and could cause crashes.