The self-driving car is a fantastic idea that, it is hoped, will rid the world of accidents. Unfortunately, the automated automobiles seem to crash an awful lot.
The cars have been involved in a number of crashes. The problem is that, unlike humans, the robotic cars always obey the law, without exception. That sounds great, but good luck trying to merge onto a motorway when traffic is flying along well above the speed limit. The robot cars' law-abiding nature means they are causing accidents.
The accidents, all thankfully minor scrapes, have started to pile up, leaving developers wondering whether they should teach cars to break the law in order to avoid collisions. However, the ethical implications of building a car that can choose to break the law are a murky area for programmers.
The decision is one of a number of thorny ethical issues that the makers of automated cars are wrestling with. For example, should an autonomous vehicle sacrifice its driver by swerving off the road to avoid killing a child? These are questions that need answering before the cars come to market.
Raj Rajkumar, co-director of the General Motors-Carnegie Mellon Autonomous Driving Collaborative Research Lab in Pittsburgh, has said: “It’s a constant debate inside our group… and we have basically decided to stick to the speed limit. But when you go out and drive the speed limit on the highway, pretty much everybody on the road is just zipping past you. And I would be one of those people.”
Last year, Rajkumar test drove his lab’s self-driving Cadillac SRX sport utility vehicle. The Caddy performed perfectly, except when it had to merge onto a highway and swing across three lanes of traffic. The car’s cameras and laser sensors detected traffic in a 360-degree view but didn’t know how to trust that drivers would make room in the ceaseless flow, so the human driver had to take the wheel to complete the merge.
Rajkumar added: “We don’t want to get into an accident because that would be front-page news. People expect more of autonomous cars.”
However, a study by the University of Michigan’s Transportation Research Institute has revealed that accident rates for self-driving cars are twice as high as for regular cars. The driverless vehicles have never been at fault in an accident, though; they tend to get hit from behind in slow-speed crashes by aggressive human drivers not used to mechanical motorists that always obey the law.
More of a concept than a journalist, Tom Percival was forged in the bowels of Salford University from which he emerged grasping a Masters in journalism.