Calling cars ‘self-driving’ is dangerous, say researchers

Volvo says you can sleep behind the wheel by 2021, but ‘autonomous’ systems are not flawless

“Fully automated vehicles that can own the driving task from A to B, with no need for driver involvement whatsoever, won’t be available for many years to come,” said Matthew Avery of Thatcham Research. “Until then, drivers remain criminally liable for the safe use of their cars.” Photograph: Concept 26

We need to stop calling current electronic driver aids “autonomous” or “self-driving”, because doing so raises drivers’ expectations of such systems to unsafe levels. That’s according to Thatcham Research, a UK-based organisation, funded by insurance companies, that tests and rates vehicle safety systems.

A case in point is Volvo’s promise this week that, by 2021, it will have on sale a version of the XC90 SUV fitted with what it calls its “Highway Pilot” system. Guided by radar, laser and camera sensors, the system is sophisticated enough, says Volvo, to achieve Level 4 autonomy, meaning it can be used while the driver and passenger are reading, tweeting or even sleeping.

Such Level 4 autonomy is expected, at first, to be “geo-fenced” or restricted to specific roads, streets, and areas.

It’s in those finite definitions that the danger lies, says Thatcham. Matthew Avery, Head of Research at Thatcham Research, said: “We are starting to see real-life examples of the hazardous situations that occur when motorists expect the car to drive and function on its own. Specifically, where the technology is taking ownership of more and more of the driving task, but the motorist may not be sufficiently aware that they are still required to take back control in problematic circumstances.

“Fully automated vehicles that can own the driving task from A to B, with no need for driver involvement whatsoever, won’t be available for many years to come. Until then, drivers remain criminally liable for the safe use of their cars and, as such, the capability of current road vehicle technologies must not be oversold.”

Overblown names

Thatcham’s latest research paper warns that using overblown names for driver assistance systems – Tesla’s Autopilot and Nissan’s ProPilot were singled out for criticism – leads drivers to believe these are the flawless robotic driving systems discussed in the headlines. They are not. They are, for the most part, glorified cruise control and lane-keeping systems that still require a high degree of driver supervision and attention.

It remains crucial that all drivers are alert and ready to take back full control at a moment’s notice

“It begins with how systems are named and described across carmaker marketing materials and the driver’s handbook,” said Avery. “Names like Autopilot or ProPilot are deeply unhelpful, as they infer the car can do a lot more than it can. Absolute clarity is needed, to help drivers understand when and how these technologies are designed to work and that they should always remain engaged in the driving task.”

James Dalton, director of general insurance policy at the Association of British Insurers, said: “Insurers are major supporters of efforts to get assisted and autonomous vehicles onto the roads. Given the part human error plays in the overwhelming majority of accidents, these technologies have the potential to dramatically improve road safety. However, we are a long way from fully autonomous cars which will be able to look after all parts of a journey and in the meantime, it remains crucial that all drivers are alert and ready to take back full control at a moment’s notice. Manufacturers must be responsible in how they describe and name what their vehicles can do, and the insurance industry is ready to hold them to account on this.”

Chilling test

Thatcham used a somewhat chilling test to illustrate the limitations of current driver assistance systems. It showed a Tesla Model S, running on its Autopilot system, following another car. When the leading car pulled over to reveal a stationary (dummy) car in the road ahead, the Tesla’s systems could not recognise it in time and the Model S ploughed through the (thankfully fake) car, in what would have been an appalling accident in real-life conditions.

“The next three years mark a critical period, as carmakers introduce new systems which appear to manage more and more of the driving task. These are not autonomous systems. Our concern is that many are still in their infancy and are not as robust or as capable as they are declared to be. We’ll be testing and evaluating these systems, to give consumers guidance on the limits of their performance. The ambition is to keep people safe and ensure that drivers do not cede more control over their vehicles than the manufacturer intended,” said Avery.

“How carmakers name assisted systems will be a key focus – with any premature inference around automated capabilities being marked down. Automated functions that allow the driver to do other things and let the car do the driving will come, just not yet.”