Electronic safety aids in cars are no panacea. Just ask a pilot

Aviation shows that over-reliance on electronics can degrade the skills of drivers and pilots alike

The UK government recently announced that it wants to introduce automated lane-keeping systems on to UK motorways

On June 1st, 2009, an Air France Airbus A330 crashed into the Atlantic Ocean. Flying a route from Rio de Janeiro to Paris, the aircraft had, at first mysteriously, plummeted out of the sky and into the water, tragically taking the lives of all 228 passengers and crew on board.

On May 5th this year, Steven Hendrickson was killed when his Tesla Model 3 hit an overturned truck that was blocking part of a California freeway. That both incidents are tragedies is certainly true, but it's also possible that there is some other connective tissue between them.

“Over-reliance on technology is one of the key issues that we’re really concerned about within aviation these days. On the one hand it was designed to make life easier and safer. But of course it has added layers of complexity. And that has been accompanied by a reduction in the depth and quality of training that takes place these days. So that has led to, if you like, less understanding of how the technology works.”

That’s according to Robert Scott, speaking to The Irish Times in his capacity as a fellow of the Royal Aeronautical Society. But his previous experience underlines the gravity of his words.

A retired Royal Navy captain, and a Fleet Air Arm pilot, Scott’s original day job was flying Buccaneer strike aircraft from the decks of the British aircraft carriers HMS Eagle and HMS Ark Royal. It’s widely acknowledged that carrier flight operations are among the riskiest and most skilful of all human endeavours, so clearly Scott is a man worth listening to when it comes to the interactions between technology and human skills.

What’s worrying is that he’s worried, worried about the dimming of those human skills as we all become more reliant on electronic controls. The skills of pilots are being eroded, he says (actually he uses the technical term ‘peripheralisation’ but the effect is the same) and much the same could be happening to drivers, given the seemingly relentless rise of electronic driver aids and so-called autonomous driving technology.

“The Boeings and Airbuses of the world will gloss over that and tell you that the aeroplanes are so reliable these days, pilots don’t need as much training as they used to,” Scott says. “And there’s a certain amount of truth in that. The industry as a whole has a very low accident rate and one that it can be proud of. But, simmering below the surface, we’re seeing more and more incidents, not necessarily accidents, but incidents that are indicating a worrying trend.

“There’s absolutely no doubt that the technology has contributed enormously to an increase in safety. So much so that there’s talk about autonomous passenger flight, just as there is with autonomous cars. That’s really sounding some alarm bells.”

Degraded skills

The problem, according to Scott, is that, operationally, airlines now rely on these systems to hit the kind of flight and timetable reliability that they advertise, but the constant use of electronic flight-control systems is starting to degrade the skill set of the pilots sitting behind the stick.

Once, a pilot would have known and understood, at a deep level, every single mechanical part of their aircraft – such knowledge, as stated by no less an authority than Chuck Yeager, the first person to officially fly faster than the speed of sound, was critical to remaining alive. Now though, the crews are essentially being trained as system operators, not stick-and-rudder pilots, and when the electronics fail, they lack the ability to snatch back control and bring the aircraft back safely.

That is what caught out Air France 447. The flight-deck crew simply didn’t believe what the computers were telling them, and the computers themselves were confused by an errant speed sensor. In a simpler, light aircraft, the problem would have been solved in an instant by any reasonably trained pilot, but on board the fearsomely complex A330, decisions were made on the basis of erroneous information and tragedy unfolded. Too much trust in the system led to disaster. Although we didn’t specifically discuss the recent accidents involving Boeing’s 737 Max aircraft, it’s clear that similar issues were at play.

That, then, is the connection between Air France and Tesla, or any carmaker espousing the efficacy of its ‘self-driving’ electronics. Such systems in cars are a long, long way behind the abilities of a commercial airliner’s autopilot, yet – thanks in large part to marketing – people buy into them and believe the hype.

Steven Hendrickson had previously posted videos of himself in his Tesla, allowing the car to drive itself without holding the wheel or even keeping his feet near the pedals. Whether or not his Tesla’s ‘autopilot’ system was switched on at the time of the accident (the California Highway Patrol has not confirmed one way or the other), it’s not implausible that blind trust in the car’s systems was a contributing factor. Tesla’s safety aids are not especially better nor worse than those of any other carmaker – it’s the blind trust in what are aids to a driver, not replacements for one, that is the issue.

“The UK government recently announced that it wants to introduce automated lane-keeping systems [ALKS] on to UK motorways. Here, yet more responsibility is ceded to the system, with the driver being able to conduct secondary tasks such as checking emails or watching content on the car’s infotainment screen,” says Matthew Avery, from the vehicle safety experts at Thatcham. “ALKS, however, are not yet capable of coping with all scenarios just as a good human driver would and therefore motorists will sometimes be required to take back control at short notice. This process of ‘coming back into the loop’ can take a dangerously long time, especially if the driver has become completely disengaged through system over-reliance.”

Avery did say, though, that we have a while to go before we reach the point of concern surrounding airline pilot skills. “Right now, use rates – where available – are low, less than 10 per cent for systems like adaptive cruise control. So we’re talking primarily about motorway travel. These systems will typically be used by older drivers in higher-specification cars – high milers with lots of experience.

“In effect, driver experience will not diminish to the point where we forget how to drive. The other issue from a human factors perspective is if you rarely do something you’re likely to be more attentive – look at young, nervous drivers who’ve yet to get the veneer of complacency. We could be looking at a generation where some never drive, and vehicles become so autonomous that we rarely need the skills to intervene. But that’s a long way off yet,” said Avery.

However, in the manner of a stitch in time saving nine, perhaps this is something to which we ought to pay more attention now. Perhaps driver training programmes need to be changed or upgraded to take account of the encroachment of electronic aids, and to train drivers to be able to tell when such systems are not functioning as they should, and to snatch back control more quickly and more easily.

Attentive driver

Electronic aids are a useful force-multiplier in terms of safety, but they are a long, long way from being a replacement for an attentive human driver.

On top of which, they are fallible both in terms of reliability and when it comes to taking appropriate action – in the past month alone, this correspondent has experienced automated braking systems switching off because dirt got on to a camera lens or a radar unit, and an automated speed-limit system slamming on the brakes on a motorway because its camera detected a 60km/h speed sign on an adjacent road. A mixture of electronic control and human complacency could be a dangerous one.

“It is a well-known fact that human performance deteriorates in low-arousal situations; consequently, pilots are not likely to be at their best when operating in such an environment,” Scott says.

“This can be typical of much of the flying done by today’s pilots, where the same routes are flown day after day, with the autopilot engaged for most of the flight, the same flight modes displayed in the flight deck and no real challenges to the pilots’ knowledge, skills or attention.

“While this generally results in safe and efficient operation of the aircraft it does nothing to prepare the pilot for times when the untoward happens and he or she has an unfamiliar situation to deal with. The result is sometimes a time lag while the situation is assessed, and a course of action decided upon, which can result in an incident or accident.”

At 35,000 feet, there is at least some room for that time lag to be accommodated. At 120km/h, there is much less.
