Tesla sued by Texas police after Model X on Autopilot hit five officers

A group of Texas law enforcement officers is suing Tesla after a Model X with Autopilot engaged collided with five police officers. The suit was first reported by KPRC 2 in Houston.

It’s the latest legal headache for the automaker as it looks to roll out its controversial driver assistance software to more customers. And it comes as Tesla faces renewed scrutiny over several crashes involving Autopilot and emergency vehicles.

The crash happened on February 27, 2021, in Splendora, a small town in Montgomery County in the eastern part of the state. According to the lawsuit, the Model X SUV collided with several police officers while they were conducting a traffic stop on the Eastex Freeway. “All were seriously injured,” the lawsuit says.

The plaintiffs allege that “design and manufacturing defects known to Tesla” are responsible for the crash, as well as “Tesla’s reluctance to admit or correct such defects.” Autopilot, they argue, “failed to detect the officers’ cars or function in any way to avoid or warn of the danger and the ensuing crash.”

The plaintiffs also note that “this was not an isolated case,” citing “at least” 12 other crashes involving Tesla vehicles using Autopilot. Indeed, the National Highway Traffic Safety Administration is investigating 12 accidents in which Tesla owners using the company’s Autopilot features crashed into stationary emergency vehicles, injuring 17 people and killing one.

The lawsuit cites several tweets from Tesla CEO Elon Musk commenting on Autopilot crashes or incidents of Tesla owners abusing the system as evidence that the company is aware of these defects and has not recalled or corrected them.

Tesla’s blatant refusal to take additional safety measures or fix the issues with its Autopilot system demonstrates a lack of oversight and accountability. Tesla has made a conscious decision not to fix these issues and must be held accountable, especially when it has detailed knowledge of the risks and dangers associated with its Autopilot system.

The officers are also suing the owner of a local restaurant, alleging that the Model X driver had consumed too much alcohol prior to the incident. They are seeking compensation for their injuries and permanent disabilities. The lawsuit lists damages in excess of $1,000,000, up to a maximum of $20,000,000.

Tesla has been hit with lawsuits over Autopilot crashes in the past. In 2019, Tesla was sued by the family of Jeremy Banner, a 50-year-old man who died in a crash while using Autopilot. Earlier that year, the company was sued by the family of 38-year-old Wei Huang, who died in 2018 after his Autopilot-enabled Model X crashed into a highway barrier.

Last week, Tesla gave more customers access to the beta of its “Full Self-Driving” (FSD) software via a “request” button on the cars’ dashboard screens. FSD is marketed as a more advanced version of Autopilot that allows drivers to use its features, such as automated steering and adaptive cruise control, on local roads.

Safety officials have criticized the rollout. Jennifer Homendy, chair of the National Transportation Safety Board, said last week that Tesla must address “basic safety issues” before expanding FSD, calling the company’s use of the term “Full Self-Driving” “misleading and irresponsible.”