Tesla asks owners to share fewer clips of beta flaws in ‘Full Self-Driving’

Tesla makes owners who opt for the controversial beta version of its “Full Self-Driving” software sign nondisclosure agreements and also discourages them from sharing video clips showing the driver assistance system making mistakes.

According to a copy of the NDA obtained by Vice, Tesla tells those who sign the document that “there are a lot of people that want Tesla to fail; Don’t let them mischaracterize your feedback and media posts.” The company also tells owners in the beta to “share on social media responsibly and selectively” and to “consider sharing fewer videos, and only the ones that you think are interesting or worthy of being shared.”

The report comes as Tesla works to expand access to its “Full Self-Driving” software, while the National Highway Traffic Safety Administration investigates Autopilot, the company’s less sophisticated driver assistance system currently available on its cars.

Tesla has been letting a small group of die-hard owners test the beta version of the “Full Self-Driving” software for about a year now. Some of them take their role as “beta testers” very seriously and try to find flaws in the system in an effort to help Tesla improve the software. Many also film themselves driving around with the software engaged. Some compress their longer drives into supercuts, speeding up the footage to emphasize how far the software can take them without human intervention. Others post the raw footage, warts and all.

(As always, just to be clear, this software does not make Tesla’s cars fully autonomous. Tesla CEO Elon Musk has himself said that he believes the “feature complete” version of the software his company calls “Full Self-Driving” will, at best, only be plausibly able to drive someone from home to work without human intervention, and will still require supervision. That does not describe a fully autonomous car.)

This whole approach — the years of unfulfilled promises of fully autonomous cars, the practice of testing still-in-development driver-assistance software on public streets with untrained owners behind the wheel — has drawn a lot of attention to Musk and Tesla. Recently, though, a clip from a video originally shot by Tesla owner and investor Galileo Russell went viral and fueled the conversation even more.

In it, Russell’s car is supposed to turn left, but instead it suddenly darts to the right, ultimately heading toward pedestrians in a crosswalk. A hedge fund manager shared the clip on Twitter, where many people were (rightly) baffled at how close the car got to the pedestrians.

In a follow-up video, Russell mentioned offhand that Tesla “doesn’t want” people in the beta to share clips that make the software look bad, while making a bigger point about why he posted the video in the first place. But it wasn’t until Vice reported on the NDA this week that it became clear what he meant.

Tesla is using this language to try to control public perception of its “Full Self-Driving” software just as the company begins opening access to a much larger group — despite the software still being in development. Over the weekend, Tesla added a button to the user interface of its cars that lets owners request to join the beta. It also launched a “safety score” system, which monitors drivers who apply and rates them on a number of metrics, such as hard braking and aggressive acceleration.

For now, Musk says, only drivers with a perfect safety score of 100 will be admitted into the beta, though he has tweeted that Tesla will lower that bar. He also said Tesla will soon be adding as many as 1,000 new owners per day to the beta, a dramatic expansion of who can test the driver assistance software on public roads.

Expanding access is sure to bring even more scrutiny to Tesla and Musk’s freewheeling approach to deploying the “Full Self-Driving” software. In fact, it already has. Last week, National Transportation Safety Board chair Jennifer Homendy told the Wall Street Journal that she wished the company would address “basic safety issues” before admitting new owners into the program. Homendy was one of the more outspoken board members at a 2020 hearing that found Autopilot partly to blame for the 2018 death of a driver in Mountain View, California.

Tesla, however, does not appear to be changing course. Over the weekend, in response to a blog post about her comments, Musk tweeted a link to Homendy’s Wikipedia page — which ultimately had to be locked after a sudden rush of edits.