
Israeli scientists fool Tesla Autopilot by projecting fake images on the road

Israeli scientists use a $300 living-room projector to fool Tesla's Autopilot feature with fake signs and lane markers projected on the road

  • Researchers at Ben-Gurion University in Israel tested Tesla's Autopilot
  • They used commonly available projectors to display fake images on the road
  • They tricked the car into reacting to fake lane markers, speed limits and more
  • The Tesla's responses were relatively mild, but could still be exploited

A research team at Ben-Gurion University has created a simple projection system capable of tricking Tesla's Autopilot into seeing things that are not really there.

Using commercially available drones and a cheap projector, the kind a person might use to watch television in a small apartment, the team projected a series of misleading images onto the road.

The images included fake lane markings, a fake speed limit sign and an image of Elon Musk himself, projected onto the road as if he were a pedestrian about to be run over.


Researchers at Ben-Gurion University in Israel successfully deceived Tesla's Autopilot function using a drone and a cheap home projector, displaying fake traffic signs and images of pedestrians on the road that were not really there.

The researchers collectively labeled these different visual phenomena "phantoms," according to a report published by Ars Technica.

While the Tesla they tested reacted to each phantom in some way, most of its responses were quite mild.

When faced with the possibility of running over a phantom Elon Musk, the Autopilot system slowed down slightly, dropping from 18 mph to 14 mph.

In another test, they found that projecting a fake speed limit sign for only 125 milliseconds was enough for the car's camera sensors to register the information, although the car's behavior showed no immediate change.

When fake white lane markings indicating a left turn were projected onto a straight stretch of road, the Tesla did not make a full left turn, but it did veer slightly to the left.

While that was a mild response, at a busier time of day it could have caused the car to drift into oncoming traffic.

Tesla's current Autopilot guidelines ask the human driver to remain fully alert and keep their hands on the wheel, ready to take control of the vehicle at all times.

Many of the phantom projections had such low resolution that they blended easily into the background, where many drivers likely would have missed them entirely.

Using a low-resolution projector that displays images at 854×480 with only 100 lumens of brightness, the researchers were able to get a Tesla in Autopilot mode to slow down in the belief that it was about to collide with a pedestrian

One of the projectors used in the experiments had a resolution of only 854×480 and a brightness of only 100 lumens.

They point to the car's susceptibility to being fooled by such subtle environmental signals as a potential flaw that could be exploited by nefarious actors.

Interestingly, they also suggest that their phantoms could serve as the basis for a new type of Turing test, allowing researchers to distinguish computer drivers from human drivers based on how they respond to complex environmental stimuli.

Timeline of fatal crashes linked to Tesla's Autopilot

January 20, 2016, in China: Gao Yaning, 23, was killed when the Tesla Model S he was driving crashed into a road sweeper on a highway near Handan, a city about 300 miles south of Beijing. Chinese media reported that Autopilot was engaged at the time.

May 7, 2016, in Williston, Florida: Joshua D. Brown, 40, of Canton, Ohio, died when the cameras of his Tesla Model S failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky.

The National Transportation Safety Board found that the probable cause of the crash was the truck driver's failure to yield the right of way, combined with the car driver's inattention due to overreliance on vehicle automation.

The NTSB also noted that Tesla's Autopilot permitted the car's driver to dangerously disengage from driving. A DVD player and Harry Potter movies were found in the car.

March 23, 2018, in Mountain View, California: Apple software engineer Walter Huang, 38, was killed in a crash on US Highway 101 with the Autopilot on his Tesla engaged.

Federal investigators found that the vehicle accelerated to 71 mph seconds before crashing into a highway barrier.

The NTSB, in a preliminary report on the crash, also said the data show that the Model X SUV did not brake or try to steer around the barrier in the three seconds before the Silicon Valley collision.

March 1, 2019, in Delray Beach, Florida: Jeremy Banner, 50, died when his 2018 Tesla Model 3 crashed into a truck.

NTSB investigators said Banner engaged the Autopilot feature about 10 seconds before the crash, and Autopilot did not execute any evasive maneuver to avoid the collision.
