This week Google confirmed that its self-driving cars had been involved in 11 minor traffic incidents since it started autonomous vehicle experiments six years ago.
The tech giant made the confirmation after the Associated Press reported that California had received four collision reports involving self-driving cars, three of them belonging to Google, since last September. (The state's department of motor vehicles has licensed 48 Google test cars, each requiring $5 million worth of insurance.)
The news prompted calls for greater transparency in the development of these vehicles and seemed to confirm the concerns of critics that the entire concept of self-driving cars is fraught with danger.
The director of Google's self-driving car project, Chris Urmson, tried to clarify both the incidents and the general safety of the self-driving sector.
“If you spend enough time on the road, accidents will happen whether you’re in a car or a self-driving car,” he wrote in a blog post (well worth a read if you’re interested in road safety, never mind self-driving cars).
“Over the six years since we started the project, we’ve been involved in 11 minor accidents (light damage, no injuries) during those 1.7 million miles of autonomous and manual driving with our safety drivers behind the wheel, and not once was the self-driving car the cause of the accident.”
Calls for greater transparency in the testing process are sensible, but the alarmist response of some illustrates not just widespread Ludditism – the notion that a well-designed and tested computer system could be more reliable and far safer at transporting people than human drivers is a conceptual stretch too far for some people – but something deeper.
The scepticism extends to the sense of control that being behind the wheel offers – there is a sense of agency in operating a car that you obviously don’t have when travelling in an aircraft or even on a train.
In a landmark 1987 paper and later writing, US psychologist Paul Slovic established our sense of control as one of the factors that lead to our inconsistent and unreliable evaluation of risk, a finding borne out in much subsequent research.
But that sense of safety through control is entirely illusory – after this week's tragic Amtrak derailment in Philadelphia, which at this stage looks to have been caused by human error, it was widely pointed out that in the US, you are about 17 times more likely to die in a car than on a train for each mile travelled.
Self-driving cars, whenever they become ubiquitous, will dramatically change that imbalance for the better. These are still very early days in their development, and it would probably be wise to set expectations on a scale of decades rather than years.
Self-driving technology is undoubtedly the future of transport and it is going to bring sweeping changes to our cities and economies that we had better prepare for.