Tesla privately admits Elon Musk is exaggerating about ‘full self-driving’


Tesla CEO Elon Musk has exaggerated the capabilities of the company’s advanced driver assistance system, the company’s director of Autopilot software told the California Department of Motor Vehicles. The comments come from a memo released by the legal transparency group PlainSite, which obtained the documents through a public records request.

It is the latest revelation about the ever-widening gap between what Musk says publicly about Autopilot and what Autopilot can actually do. And it comes as Tesla faces increased scrutiny after one of its vehicles crashed in Texas with no one in the driver’s seat, killing two men.

“Elon’s tweet doesn’t match the technical reality per CJ. Tesla is currently at Level 2,” the California DMV said in a memo about its March 9 conference call with Tesla representatives, including Autopilot software director CJ Moore. Level 2 refers to a partially automated driving system that requires supervision by a human driver.

In an earnings call in January, Musk told investors he was “very confident that this year the car will be able to drive itself with reliability beyond man’s.” (It appears the DMV was referring to these January comments, which Moore may have misconstrued as a tweet from Musk.)

Last October, Tesla introduced a new product called “Full Self-Driving” (FSD) beta for vehicle owners in its Early Access Program. The update gave drivers access to Autopilot’s partially automated driver assistance system on city and local roads. Tesla uses the early access program as a testing platform to help work out software bugs. In the DMV memo, Tesla said there were 824 vehicles in the pilot program as of March 9: 753 belonging to employees and 71 to non-employees.

Musk has said the company was “very careful” about the software update. Drivers are still expected to keep their hands on the wheel and be willing to take control of their Tesla at any time. But he has also made lofty predictions about Tesla’s ability to achieve full autonomy, which run counter to what his own engineers are telling regulators.

Tesla representatives told the DMV that the company is unlikely to reach Level 5 (L5) autonomy, in which its cars can drive themselves anywhere, under any conditions, without any human supervision, by the end of 2021.

According to the memo, the rate of driver interaction would need to be on the order of one or two million miles per interaction to move to higher levels of automation. Tesla indicated that Musk is extrapolating from the rate of improvement when he talks about L5 capabilities, and the company could not say whether that rate of improvement would reach L5 by the end of the calendar year.

This isn’t the first time Tesla’s private communications with the DMV have contradicted Musk’s public statements about his company’s autonomous capabilities. In March, PlainSite published correspondence from last December between Tesla’s associate general counsel Eric Williams and the chief of the California DMV’s autonomous vehicles branch, Miguel Acosta. In it, Williams notes that “neither Autopilot nor FSD Capability is an autonomous system, and currently no function, individual or collective, is autonomous or makes our vehicles autonomous.” In other words, Tesla’s FSD beta is self-driving in name only.

(Al Prescott, associate general counsel at Tesla, was also involved in the December meeting with the DMV. Prescott has since left Tesla for LIDAR maker Luminar.)

Tesla and Musk have long been criticized for exaggerating the capabilities of the company’s Autopilot system, which in its most basic form can keep a Tesla centered in its lane, including around curves, and adjust the car’s speed based on the vehicle ahead. The use of brand names like Autopilot and FSD has also contributed to an environment in which Tesla customers are misled into believing their vehicles can actually drive themselves.

There have been a number of fatal crashes involving Tesla vehicles with Autopilot enabled. The latest took place in Spring, Texas, where two men were killed after their Tesla hit a tree. Local law enforcement officials said there was no one in the driver’s seat at the time of the crash, sparking speculation that the men were misusing Autopilot. Tesla later claimed that Autopilot was not in use at the time of the crash and that someone may have been in the driver’s seat.

The U.S. National Highway Traffic Safety Administration and the National Transportation Safety Board are both investigating the crash, along with dozens of other incidents involving Tesla’s Autopilot. Tesla did not respond to a request for comment; the company has dissolved its press office and typically no longer responds to media inquiries.