Fooling self-driving cars by displaying virtual objects

Pierluigi Paganini October 19, 2020

Researchers from the Ben-Gurion University of the Negev demonstrated how to fool self-driving cars by displaying virtual objects.

A group of researchers from the Ben-Gurion University of the Negev demonstrated that it is possible to fool self-driving cars by displaying virtual objects (phantoms).

The experts define a phantom as a depthless visual object used to deceive ADASs and cause these systems to perceive it as real. Attackers can create a phantom with a projector or present it on a digital screen (e.g., a billboard).

Boffins tested two commercial advanced driver-assistance systems (ADASs), the Tesla Model X (versions HW2.5 and HW3.0) and the Mobileye 630, and were able to trick both systems by displaying “phantom” virtual objects in front of the two vehicles.

The researchers simulated the presence of virtual objects, such as road signs and the image of a pedestrian, displayed with a projector or on a digital billboard in front of the self-driving cars, which interpreted them as real. In the tests conducted by the researchers, the depthless object was made from a picture of a 3D object (e.g., a pedestrian, car, truck, motorcycle, or traffic sign).

“We demonstrate how attackers can apply split-second phantom attacks remotely by embedding phantom road signs into an advertisement presented on a digital billboard which causes Tesla’s autopilot to suddenly stop the car in the middle of a road and Mobileye 630 to issue false notifications.” reads the post published by the researchers. “We also demonstrate how attackers can use a projector in order to cause Tesla’s autopilot to apply the brakes in response to a phantom of a pedestrian that was projected on the road and Mobileye 630 to issue false notifications in response to a projected road sign.”

Experts also tested split-second phantom attacks, which use a phantom that appears for only a few milliseconds and is still treated as a real object or obstacle by the ADAS.

Below is the minimal duration for which a phantom needs to appear in order to fool each ADAS.

[Figure: minimal phantom display durations required to fool the Tesla and Mobileye systems]

Self-driving cars can be fooled by displaying virtual objects; in a real-world scenario, this attack could result in accidents and traffic jams.

The virtual objects triggered a response from the ADAS systems: Tesla’s autopilot reacted to a phantom that appeared for just 0.42 seconds, while the Mobileye 630 reacted to one displayed for only 0.125 seconds.
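To get a feel for why such short flashes are enough, the following back-of-the-envelope sketch (not from the paper) estimates how many camera frames a split-second phantom would span; the frame rates used are illustrative assumptions, not figures from the research.

```python
# Rough sketch: number of camera frames a briefly displayed phantom would span.
# Frame rates below are assumptions for illustration; real ADAS hardware may differ.

def frames_captured(display_seconds: float, camera_fps: float) -> float:
    """Approximate number of frames in which a phantom would be visible."""
    return display_seconds * camera_fps

# Durations reported by the researchers as sufficient to trigger each system.
durations = {"Tesla autopilot": 0.42, "Mobileye 630": 0.125}

for system, duration in durations.items():
    for fps in (30, 36, 60):
        print(f"{system}: {duration * 1000:.0f} ms at {fps} fps ~= "
              f"{frames_captured(duration, fps):.1f} frames")
```

Even at modest capture rates, a 125 ms flash spans several frames, which is why a phantom embedded in an advertisement can be picked up by the vision pipeline while remaining barely noticeable to a human observer.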

The researchers also proposed a countermeasure, dubbed GhostBusters, that can prevent this attack using only the camera sensor. GhostBusters implements a “committee of experts” approach, combining the results of four lightweight deep convolutional neural networks that analyze each detected object based on its light, context, surface, and depth.
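For illustration, here is a minimal sketch of what such a “committee of experts” could look like, assuming four small CNNs whose per-cue scores are fused by a final layer. The architecture, channel counts, and class names below are illustrative assumptions, not the authors’ actual models.

```python
import torch
import torch.nn as nn

class SmallExpert(nn.Module):
    """Illustrative lightweight CNN scoring one cue (light, context, surface, or depth)."""
    def __init__(self, in_channels: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

class PhantomCommittee(nn.Module):
    """Fuses the four experts' scores into a single real-vs-phantom verdict."""
    def __init__(self):
        super().__init__()
        # One expert per cue; RGB crops for the first three, a single-channel map for depth
        # (channel counts are assumptions made for this sketch).
        self.light = SmallExpert(3)
        self.context = SmallExpert(3)
        self.surface = SmallExpert(3)
        self.depth = SmallExpert(1)
        self.combiner = nn.Linear(4, 1)

    def forward(self, light, context, surface, depth):
        scores = torch.cat([
            self.light(light), self.context(context),
            self.surface(surface), self.depth(depth),
        ], dim=1)
        return torch.sigmoid(self.combiner(scores))  # probability the object is real

# Example forward pass with dummy 64x64 crops of a detected object.
model = PhantomCommittee()
rgb_crop = torch.randn(1, 3, 64, 64)
depth_map = torch.randn(1, 1, 64, 64)
prob_real = model(rgb_crop, rgb_crop, rgb_crop, depth_map)
print(prob_real.item())
```

The appeal of this design is that each expert can fail independently: a projected phantom may look plausible in context but betray itself through its lighting, surface texture, or lack of depth, and the combiner learns to weigh those disagreements.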

“We demonstrate our countermeasure’s effectiveness (it obtains a TPR of 0.994 with an FPR of zero) and test its robustness to adversarial machine learning attacks.” continues the post.
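To make the reported numbers concrete, the snippet below shows how TPR and FPR are computed for a binary detector, treating “phantom” as the positive class (an assumption for this sketch); the toy labels are invented to reproduce the reported rates and are not the researchers’ data.

```python
import numpy as np

def tpr_fpr(y_true: np.ndarray, y_pred: np.ndarray):
    """True-positive and false-positive rates (1 = phantom, 0 = real object)."""
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    return tp / (tp + fn), fp / (fp + tn)

# Toy data only: 1000 phantoms (994 caught) and 1000 real objects (none misflagged).
y_true = np.array([1] * 1000 + [0] * 1000)
y_pred = np.concatenate([np.ones(994), np.zeros(6), np.zeros(1000)]).astype(int)
print(tpr_fpr(y_true, y_pred))  # expected: TPR 0.994, FPR 0.0
```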

Unlike attacks against self-driving cars devised by other research teams, this one requires less expertise and fewer resources.

The full research paper, which includes technical details about the study, is available here.


Pierluigi Paganini

(SecurityAffairs – hacking, self-driving cars)



