Can self-driving cars really be fooled by pen and paper?

No need for hackers – just tampering with road signs seems to do the trick

Picture the scene: you’re driving along in your shiny new box on wheels, and letting the car’s electronics take most, if not all, of the strain. The cruise control is cruising, the radar is scanning, and the camera mounted on the windscreen is keeping an eye out for speed limit signs.

Suddenly, your car brakes hard for no apparent reason, causing a near-collision with the car behind you and a barrage of honked horns and angrily flashing headlights. What happened? While you were cruising at a set 120km/h on the motorway, your car’s camera system spotted a 50km/h limit sign on an adjacent slip road, panicked and sent a message to the automated cruise control, which threw out an anchor to slow you down to what it thought was the legal limit.

This is not science fiction or future-gazing: it actually happened to this correspondent several months ago, while driving a Ford S-Max fitted with the new active speed limiter system. It was a minor incident in a busy week, but one that points to potential dangers for the future of autonomous cars. Their software and electronic security don’t need to be compromised for someone to make them malfunction. All it takes is a packet of Sharpies and some sticky tape.

That’s according to researchers from the University of Washington, who have published a paper entitled Robust Physical-World Attacks on Machine Learning Models. In its simplest form, the attack doesn’t go after a robot car’s lines of code, but instead messes with road signage, which is crucial to how a self-driving car sees and understands the world around it.

The hit rate

The team altered road signs either by subtly vandalising them, adding small extra areas of white to the sign using sticky tape, or by printing out a completely new sign and sticking it over the top of an existing one, such as turning a stop sign into a merge-right sign. Depending on the weather and light conditions, the vandalised or slightly altered signs caused the car’s vision system to make incorrect decisions 67 to 100 per cent of the time, while the over-printed sign fooled it 100 per cent of the time.
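To get a rough sense of how such a trick works, consider the toy sketch below. It is purely illustrative and is not the Washington team’s actual method: their attack targets real deep-learning classifiers with physical stickers and overlay prints, whereas this snippet uses a made-up linear “classifier” and invented numbers simply to show how a small, deliberate change to an input image can flip a model’s decision.

```python
# Illustrative sketch only: a toy adversarial perturbation against a
# hypothetical linear "sign classifier". Everything here (the model, the
# classes, the numbers) is invented for demonstration purposes.
import numpy as np

rng = np.random.default_rng(0)

N_PIXELS = 64  # pretend this is an 8x8 greyscale crop of a sign, flattened

# Stand-in for a trained model: class 1 ("speed limit") if the linear score
# is positive, class 0 ("stop") otherwise.
weights = rng.normal(size=N_PIXELS)

# A clean image, with the bias chosen so it scores -1.0, i.e. it is
# confidently classified as a stop sign.
clean = np.full(N_PIXELS, 0.5)
bias = -(clean @ weights) - 1.0

def classify(image):
    return int(image @ weights + bias > 0)

# Attack: nudge every pixel slightly in the direction that raises the
# "speed limit" score. The per-pixel budget epsilon is set just large enough
# to cross the decision boundary, so the overall change stays small --
# loosely analogous to a few unobtrusive patches of tape on a physical sign.
margin = -(clean @ weights + bias)               # distance to the boundary
epsilon = 1.1 * margin / np.abs(weights).sum()   # tiny per-pixel change
adversarial = np.clip(clean + epsilon * np.sign(weights), 0.0, 1.0)

print("clean image classified as:     ", classify(clean))        # 0 = stop
print("perturbed image classified as: ", classify(adversarial))  # 1 = speed limit
print("largest per-pixel change:      ", float(np.abs(adversarial - clean).max()))
```

Real camera-based sign recognition uses far more sophisticated models, but the underlying weakness the sketch points at is the same: a change too small for a human to worry about can still push the input across the model’s decision boundary.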

“In this work, we explore the following key question: ‘Is it possible to create robust and subtle adversarial perturbations on real-world objects?’ We focus on road sign classification due to their critical function in road safety and security,” said the paper. “If an attacker can physically and robustly manipulate road signs in a way that, for example, causes a stop sign to be interpreted as a speed limit sign by a camera-based vision system, then that can lead to severe consequences.”

Vulnerable side

The study reveals a previously unconsidered vulnerability in autonomous vehicles. Researchers and engineers have concentrated on preventing electronic hacking of a car’s onboard systems, assuming that malicious types would want to take control of a car and then either cause carnage or hold it to ransom. This new research reminds us that hacking is often not a major criminal endeavour, but something simpler: a prank or trick done just for the heck of it. Messing with road signs in this manner could well cause an autonomous car’s systems either to make an incorrect decision, or simply to overload and shut down if they see too many conflicting instructions.

One potential way around this is to follow Nissan’s lead by putting humans back into the autonomous-car loop. The Japanese car company has teamed up with Nasa to develop a new system for self-driving cars that uses a mission-control-style setup, where human experts can intervene via an internet connection and help a robotic car find its way around unfamiliar terrain, or, in this case, vandalised road signs.

Computers are good at a lot of things, but when it comes to telling a doctored sign from a genuine one, the human eye is still the best equipment around.

Neil Briscoe

Neil Briscoe, a contributor to The Irish Times, specialises in motoring