Mercedes set to take legal responsibility for ‘autonomous’ car crashes

Carmaker indemnifies drivers using ‘Level 3’ software in Germany against accidents

A Mercedes test vehicle on the M50: the carmaker has announced that it will bear the legal responsibility if one of its cars running so-called Level 3 autonomous driving software crashes


Mercedes has announced that it will bear the legal responsibility if one of its cars running so-called Level 3 autonomous driving software crashes. The German carmaker is taking the legal risk as it believes that doing so will remove a major stumbling block in the development of self-driving cars.

Currently, many cars run Level 2 driver assistance systems – these are gadgets such as radar-guided cruise control, and camera systems that can keep your car straight in a lane on a main road. However, these are not autonomous systems and require the driver to keep their hands on the wheel and their eyes on the road at all times. They are driver adjuncts, not replacements.

Level 3 is a bit different. In theory, a Level 3 car can actually drive itself for long stretches of time, on the right roads and in the right conditions. Those caveats are the kicker, though, because while a driver may well be able to take their eyes and hands off the situation while a Level 3 car is in charge of itself, there will all but inevitably come a time when the car finds a situation it can’t cope with, and it passes back control to the human occupant. If you’re glued to the latest Netflix drama when that happens, you may not be able to judge the situation fast enough to avoid an accident.

Nonetheless, Mercedes has been given German federal approval to roll out its Level 3 systems this year, and they will be available first on the S-Class, and eventually on the EQS electric car. Mercedes’ Drive Pilot will, at first, only work at speeds of up to 60km/h and on suitable motorways and major roads, so it’s really an upgraded “traffic jam assistant” of the type we’ve already seen. That said, Mercedes is sufficiently confident in the technology to take on the legal responsibility.

“With Drive Pilot, our Level 3 conditionally automated driving system, our customers gain the most valuable asset – time. In a first step, we are offering this world-leading technology to our customers in Germany, but will be rolling it out in the US as well by the end of this year if the legal and regulatory framework allows,” Mercedes’ chief technology officer Markus Schäfer said.

Mercedes says that the technology will give drivers at least ten seconds’ warning before it passes back control in any situation. While that means you can’t have a snooze while the car drives, it does – legally – mean that in Germany you’ll be able to scroll your timeline, read a book, or watch TV behind the wheel.

‘Unprecedented opportunity’

The move by Mercedes to accept legal responsibility for its cars’ autonomous actions is part of a fast-changing legal landscape for vehicle technology. A recent conclusion of the Law Commissions for England and Wales, and Scotland, was that drivers and users of autonomous cars should not be held responsible if the software and hardware of those cars makes a mistake, crashes, or triggers some other kind of incident.

“We have an unprecedented opportunity to promote public acceptance of automated vehicles with our recommendations on safety assurance and clarify legal liability. We can also make sure accessibility, especially for older and disabled people, is prioritised from the outset,” said Nicholas Paines QC, Public Law Commissioner and one of the authors of the report.

Until now, it had been assumed that, following the precepts of the Vienna Convention, a driver or user of an autonomous car would be responsible for any incidents involving that vehicle, simply because they were the ones who sat into it, and pressed the button that made it go. The Vienna Convention, recognised by most nations, holds a driver responsible for what their vehicle does, rather than the company that made the car, except in the case of accidents caused by manufacturing or safety defects.

What the UK’s Law Commissioners are recommending – and it is merely a recommendation, albeit a highly influential one, globally as well as locally – is that this legal framework needs to be changed, and changed utterly, for the future of autonomous vehicles. The report states: “The person in the driving seat would no longer be a driver but a ‘user-in-charge’. A user-in-charge cannot be prosecuted for offences which arise directly from the driving task. They would have immunity from a wide range of offences – from dangerous driving to exceeding the speed limit or running a red light.” However, the user would not be totally off the hook, with the report’s authors saying that users would still be liable for some responsibilities, including “carrying insurance, checking loads or ensuring that children wear seat belts”.

The report also recommends that a specialist agency should be established, notionally called the Authorised Self-Driving Entity (or ASDE), which should carefully regulate the use and sale of autonomous vehicle technology, including having the ultimate say on whether a vehicle qualifies as autonomous or not.

On that point, the Law Commissioners are strict – a car is either fully autonomous and capable of driving itself in all circumstances, or it isn’t. The report points out the dangers of the “grey area” in which many carmakers are currently dabbling, with systems that are actually only assistants to human drivers being sold and labelled as self-driving.

“The Law Commissions recommend new safeguards to stop driver assistance features from being marketed as self-driving. This would help to minimise the risk of collisions caused by members of the public thinking that they do not need to pay attention to the road while a driver assistance feature is in operation” says the report, and its authors emphasise the need for “a bright line distinguishing fully-autonomous technology from driver support features, a transparent process for setting a safety standard, and new offences to prevent misleading marketing.”

US test case

The report comes at a time when autonomous technology, and the current driver assistance systems which often purport to be autonomous, are coming under ever-greater legal scrutiny. Indeed, the whole notion of who exactly is to blame if an “autonomous” vehicle crashes and kills someone is about to be tested in US courts.

Prosecutors representing the state of California have brought charges of vehicular manslaughter against Kevin George Aziz Riad.

The prosecutors say that while his Tesla Model S saloon was being driven using the car’s so-called AutoPilot system, it ran through a red light and struck another vehicle. In this case, that vehicle was a Honda Civic, and the two occupants – Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez – both died at the scene of the crash. Mr Riad and his passenger sustained light injuries.