Computers second-guessing humans are causing car crashes

We’re halfway to self-driving cars, but getting to the point where the driver is a passenger could be tricky


The recent scrape between a Google self-driving car and a bus in California caused the sort of media response normally reserved for a multi-fatality motorway pile-up. This despite the fact the car was barely doing 5km/h at the time.

Cynics fed off the event as an ominous preview of the perils that await as we mix humans and computers on our roads. Wherever the blame for this particular prang lies, the fact is that rule-abiding robots will struggle to anticipate the often illogical antics of humans behind the wheel.

Most of the world’s large car companies predict that by 2020 they will be able to produce and sell vehicles capable of driving themselves through city streets and along multi-lane highways.

There are still major technical challenges along the way. At the heart of most systems being tested today is a series of video cameras, radar and lidar sensors, combined with GPS linked to detailed maps, all feeding into a central computer that controls steering, acceleration and braking. The format has been under development for well over a decade now, with test cars from the likes of General Motors, Ford and VW competing in autonomous driving challenges as early as 2004. Many of the major advances were achieved thanks to research carried out under the auspices of the US government's defence procurement agency, Darpa.
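In rough terms, the "central computer" described above has to merge several noisy sensor readings into a single estimate before it acts. A minimal sketch of that fusion step, with all readings and weights invented for illustration rather than taken from any manufacturer's actual system:

```python
# Illustrative sketch of sensor fusion for an obstacle's position.
# Each sensor reports an estimate with a confidence weight; the
# central computer combines them into one belief before acting.

def fuse_estimates(estimates):
    """Weighted average of (position_m, weight) pairs.
    Purely illustrative, not a real automotive algorithm."""
    total_weight = sum(w for _, w in estimates)
    return sum(p * w for p, w in estimates) / total_weight

# Hypothetical readings: distance to an obstacle in metres.
readings = [
    (24.8, 0.2),  # camera: accurate in good light, down-weighted in rain
    (25.1, 0.5),  # radar: robust to weather
    (25.0, 0.3),  # lidar: precise, but degraded by heavy rain
]

fused = fuse_estimates(readings)
print(round(fused, 2))  # a single distance the planner can act on
```

The weights are the interesting part: as the article notes, heavy rain degrades the cameras, which in a real system would mean shifting trust towards radar.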


It’s not just illogical actions by humans that have tested the limits of the new systems. Heavy rain has affected the cameras’ visibility. Programmers are also still struggling to get these systems to differentiate between the various things we encounter on the road, from a misplaced traffic cone to animals or even small children. Getting this right is very important when considering what level of mitigation should be deployed. For example, you would want to veer away from a child, even if it meant a collision with a parked car, but would you want the same response for a fox?
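The child-versus-fox dilemma above amounts to a policy that weighs the cost of swerving against the cost of hitting whatever the classifier believes is ahead. A toy sketch, with the categories and cost numbers entirely invented for illustration:

```python
# Toy mitigation policy: decide whether to swerve based on what the
# classifier thinks is in the road. All costs are invented.

SWERVE_COST = {          # harm of veering into, say, a parked car
    "child": 0,          # treat any avoidance damage as acceptable
    "adult": 0,
    "animal": 5,
    "traffic_cone": 10,  # brake gently or drive around instead
}
HIT_COST = {             # harm of striking the object itself
    "child": 100,
    "adult": 100,
    "animal": 3,
    "traffic_cone": 1,
}

def should_swerve(detected_object):
    """Swerve only when hitting the object is worse than the swerve."""
    return HIT_COST[detected_object] > SWERVE_COST[detected_object]

print(should_swerve("child"))         # swerve, even into a parked car
print(should_swerve("animal"))        # don't: braking alone is safer
print(should_swerve("traffic_cone"))  # don't
```

The hard part, of course, is not this arithmetic but getting the classification right in the first place, which is exactly what the article says programmers are still struggling with.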

The motor industry has laid out a four-stage roadmap for the advent of self-driving cars. Level one is already upon us with function-specific automation, which intervenes with the steering or braking in emergency situations. Many new cars offer “city brake” systems that will prevent collisions below 80km/h, even if the driver fails to react in time.

The second level is also a common feature on modern cars, commonly known as “active cruise control”, where the system takes over maintaining the speed of the vehicle, braking and accelerating while keeping a safe distance from the vehicles in front. Since the advent of lane-keeping technology on new cars five years ago, it can also play a minor role in steering, sounding an alert if the car drifts out of its lane or even manoeuvring it back into the centre. So far so good: these are driver aids, but ultimate responsibility and control still rest with the driver, who is supposedly focused on the road ahead.
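Lane-keeping of the kind just described boils down to monitoring the car's lateral offset from the lane centre and escalating from nothing, to a warning, to active steering. A minimal sketch, with the thresholds invented for illustration:

```python
# Illustrative lane-keeping assist logic. Thresholds and the lane
# half-width default are made up, not from any real system.

def lane_keep_action(lateral_offset_m, lane_half_width_m=1.8):
    """Return the assist action for a given offset from lane centre."""
    drift = abs(lateral_offset_m)
    if drift < 0.5 * lane_half_width_m:
        return "none"               # comfortably within the lane
    if drift < 0.9 * lane_half_width_m:
        return "alert"              # warn the driver
    return "steer_to_centre"        # actively nudge the car back

print(lane_keep_action(0.3))  # none
print(lane_keep_action(1.2))  # alert
print(lane_keep_action(1.7))  # steer_to_centre
```

Even here the driver retains control: the system only intervenes at the margins, which is what distinguishes level two from the levels that follow.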

Next up, at level three, is where the tech gets interesting but the situation becomes a little more risky. Here the car takes limited control of all driving functions, including steering, but the driver must be ready to take over at any time. Early versions of these systems are already on the market and work perfectly well in motorway conditions. The car will maintain speed, road position and distance from other cars and, married with high-definition sat-nav data, can even make decisions such as changing lanes on motorways to speed up the journey. Estimates are that at the top end of this level drivers will only have to take control about 35 per cent of the time.

Finally we come to the ultimate goal, level four, where the driver becomes the passenger. Full autonomy requires high-definition, three-dimensional maps in order for vehicles to know precisely where they are. They will also need to communicate with others, and with infrastructure such as traffic lights, as they drive.

According to Ford's executive vice-president of product development and chief technical officer, Raj Nair, vehicles at this stage will have no steering wheel or pedals, so even if drivers wanted to intervene there would be no controls to grab.

To get to this level he says we will need high-definition mapping for all areas.

Speaking to The Irish Times, Nair seems to have reservations about the way level three systems will depend on drivers to be on hand to take over. He points to growing numbers of YouTube clips showing motorists getting up to all sorts of "scary antics" behind the wheel "without recognising the limited capabilities of the vehicles".

“The point is that the system may ask you to take back control, and people who are doing other things will take time to re-engage. I think we’ve all had that moment using normal cruise control when you are in cruise mode and the next thing you know you are closing really quickly on the car in front. It takes a few seconds before you react and apply the brakes. Now with adaptive cruise control [which applies the brakes automatically to maintain a set distance from the car in front] that’s one less thing you have to worry about. But then what else are you not paying attention to? And that’s at the level of automation we have right now,” said Nair.

“I think we need to be cautious about high-end level three systems and our position would be to hold off and go to level four.”

The risk is the mix of humans and computers. The Google crash was blamed on the fact the car calculated that the bus would stop, while the bus driver thought the car would. High-end computers can carry out fiendishly complicated analysis and complex scenario planning. Working out the thinking of motorists and bus drivers still needs work.

And in response to the crash, engineers pointed out that if a computer had been at the wheel of the bus as well, instead of a human, it’s likely there would have been no crash. It’s the fact that computers have to second-guess humans that is causing the problems. The halfway house between partial and full autonomy may be a tricky one to navigate in the coming years, with the likelihood of a few more prangs to come.