Researchers from Israel’s Ben Gurion University have demonstrated an attack in which an image of a stop sign is displayed on an electronic billboard for just a few milliseconds. Driver-assistance systems in passing cars respond to it by braking. The researchers call this a ‘split-second phantom attack’.
The researchers tested two advanced driver-assistance systems: the Autopilot of a Tesla Model X and a Mobileye 630. Besides briefly flashing a stop sign on a billboard, they also used a projector to cast an image of a pedestrian onto the road. In both cases the car braked immediately.
Because the image is visible for such a short time, occupants do not notice why the car suddenly detects danger, and a manipulated billboard is very difficult to identify and shut down preventively.
According to the researchers, the self-driving systems can be trained so that they are no longer susceptible to these manipulations. They have developed a countermeasure they call “GhostBusters”, which uses, among other things, a neural network to determine whether a detected object is real, i.e. an actual pedestrian or a physical stop sign, rather than a projection. The system judges this on the basis of the object’s light, context, surface, and depth.
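The idea of combining several independent cues into one real-or-phantom verdict can be sketched roughly as follows. This is a minimal illustrative sketch, not the researchers’ actual models: the `Detection` fields, the per-aspect scores, and the equal-weight combiner are all hypothetical stand-ins for what would, in practice, be small trained neural networks and a learned combiner.

```python
# Hypothetical sketch of a committee-style "is this object real?" check,
# loosely modeled on the light/context/surface/depth idea described above.
from dataclasses import dataclass

@dataclass
class Detection:
    brightness: float      # 0..1; projected phantoms tend to be over-bright
    context_score: float   # 0..1; plausibility of surroundings (e.g. sign on a pole)
    surface_score: float   # 0..1; texture consistency of the object's surface
    depth_score: float     # 0..1; parallax/stereo evidence of a 3D object

def is_real(d: Detection, threshold: float = 0.5) -> bool:
    """Equal-weight average of the four aspect scores; accept above threshold.

    In a real system each score would come from a trained sub-network and the
    weights would be learned; the fixed 0.25 weights here are an assumption.
    """
    combined = (
        (1.0 - d.brightness) * 0.25   # penalize over-bright (projected) objects
        + d.context_score * 0.25
        + d.surface_score * 0.25
        + d.depth_score * 0.25
    )
    return combined >= threshold

# Hypothetical inputs: a physical stop sign vs. a billboard phantom.
physical_sign = Detection(brightness=0.4, context_score=0.9,
                          surface_score=0.8, depth_score=0.9)
phantom = Detection(brightness=0.95, context_score=0.2,
                    surface_score=0.3, depth_score=0.05)
```

A committee structure like this has the property the researchers highlight: a projection might fool one cue (say, context), but it is unlikely to simultaneously fake plausible depth, surface texture, and lighting.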
Tesla will soon release a software update for its Autopilot driver-assistance feature. According to Tesla CEO Elon Musk, the update is ‘the fully self-driving version’ of the system. The fully self-driving function does not work on older Tesla cars, which first need a hardware upgrade.