
Tesla Autopilot Was Exceptionally Risky and May Remain So

A federal report released today found that Tesla’s Autopilot system was involved in at least 13 fatal accidents in which drivers misused the system in ways that the automaker should have foreseen and done more to prevent. Not only that, but the report called Tesla an “industry outlier” because its driver-assist features lacked some of the basic precautions taken by its competitors. Now regulators are wondering whether an update to Tesla’s Autopilot designed to fix these basic design problems and prevent fatal incidents has gone far enough.

These fatal accidents killed 14 people and injured 49, according to data collected and published by the National Highway Traffic Safety Administration, the federal traffic safety regulator in the US.

At least half of the 109 “head-on” crashes closely examined by government engineers (those in which a Tesla collided with a vehicle or obstacle directly in its path) involved hazards visible five seconds or more before impact. That is enough time, the engineers concluded, for an attentive driver to have prevented the crash or at least mitigated its severity.

In one of those accidents, a March 2023 incident in North Carolina, a Model Y traveling at highway speed struck a teenager who was exiting a school bus. The teenager was airlifted to a hospital with serious injuries. NHTSA concluded that “both the bus and the pedestrian would have been visible to an attentive driver and would have allowed him to avoid or minimize the severity of this accident.”

Government engineers wrote that throughout their investigation, they “observed a trend of avoidable crashes involving hazards that would have been visible to an attentive driver.”

Tesla, which dissolved its public affairs department in 2021, did not respond to a request for comment.

The report also called Tesla “an industry outlier” in its approach to automated driving systems. Unlike other car companies, the report says, Tesla allowed Autopilot to operate in situations it was not designed for and failed to pair it with a driver engagement system that required users to pay attention to the road.

Regulators concluded that even the Autopilot product name was a problem, encouraging drivers to trust the system rather than collaborate with it. Automotive competitors often use “assist,” “sensing” or “equipment” language, according to the report, specifically because these systems are not designed to drive completely on their own.

Last year, California state regulators accused Tesla of falsely advertising its Autopilot and Full Self-Driving systems, alleging that the company misled consumers into believing its cars could drive themselves. In a presentation, Tesla said the state’s failure to oppose the Autopilot brand for years constituted implicit approval of the automaker’s advertising strategy.

The NHTSA investigation also found that, compared with competing products, Autopilot resisted when drivers attempted to steer their vehicles themselves, a design that, the agency wrote in its summary of its nearly two-year investigation into Autopilot, discourages drivers from engaging in the driving task.

A new Autopilot probe

These accidents occurred before Tesla recalled and updated its Autopilot software via an over-the-air update earlier this year. But in addition to closing that investigation, regulators have also opened a new one into whether Tesla’s updates, pushed in February, did enough to prevent drivers from misusing Autopilot, misunderstanding when the feature was actually in use, or using it in places where it is not designed to work.

The review comes after a driver in Washington state said last week that his Tesla Model S was on Autopilot, and that he was using his phone, when the vehicle struck and killed a motorcyclist.
