The California DMV has published the accident reports involving self-driving cars from the past year, beginning October 2014, and if there's one thing the reports reveal it's that humans are to blame for every crash, minor or major.
This is in line with reports from May, when anonymous sources revealed that every Google self-driving car accident to date had been caused by human error, though details of the accidents weren't really explained at the time.
But now that the reports have been published by the DMV, details show exactly how each accident occurred, and there really is no doubt that human error played a key part in each and every incident. You can check out the reports here.
How well do you drive?
The DMV lists nine reports in total, eight of which date from the beginning of 2015 and all of which involve Google's autonomous cars. That shouldn't be too surprising: Google has more than 20 self-driving cars on the road right now, averaging about 10,000 self-driven miles per week.
The accident reports show that most of the collisions involve a car behind one of Google's self-driving cars failing to stop in time, or speeding up too fast at an intersection, and hitting the rear bumper of one of Google's cars.
Another report explains how, while attempting to change lanes, a driver managed to hit a Google car that wasn't changing lanes but was about to stop at a traffic light. Luckily, most of the accidents happened at low speeds and no one was seriously hurt.
But there is at least one accident that stands out, as it shows how Google's autonomous cars may actually be too cautious, and that in itself could cause a collision.
The report from February 26 explained that a car that "failed to come to a stop at the stop sign" at an intersection hit the rear right door and wheel of a Google car that was making its way through that intersection at the time.
While the human driver of the other car was clearly at fault, just before the actual accident happened, Google's car detected the speed and trajectory of the oncoming car and began braking in response.
On this particular occasion, the collision might have been avoided if Google's car had sped up instead of braking. In fact, the on-board driver of the Google car disengaged self-driving mode when he realized the car had begun braking, though that may simply be the standard procedure for on-board drivers when the car detects a possible threat.
What this shows is that some accidents may occur because an autonomous car's usual response to an oncoming car is to brake, but it also shows that self-driving cars mostly make safer decisions than human drivers.
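The behavior described in the report, detecting an oncoming car's speed and trajectory and braking in response, resembles a simple time-to-collision check. Here is a minimal sketch of that idea; the function names, the 2-second threshold, and the whole decision rule are illustrative assumptions, not Google's actual (unpublished) logic:

```python
# Hypothetical time-to-collision (TTC) braking sketch.
# All names and thresholds are illustrative assumptions.

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until the gap closes at the current closing speed."""
    if closing_speed_mps <= 0:  # the other car is not closing in
        return float("inf")
    return gap_m / closing_speed_mps

def should_brake(gap_m: float, closing_speed_mps: float,
                 threshold_s: float = 2.0) -> bool:
    """Brake when the estimated time-to-collision drops below a threshold."""
    return time_to_collision(gap_m, closing_speed_mps) < threshold_s

# A car 30 m away closing at 20 m/s gives a 1.5 s TTC -> brake.
print(should_brake(30.0, 20.0))   # True
# The same car closing at only 10 m/s gives a 3.0 s TTC -> no braking yet.
print(should_brake(30.0, 10.0))   # False
```

A rule like this errs on the side of braking whenever a threat is plausible, which matches the "too cautious" behavior the February 26 report describes: safe in isolation, but occasionally surprising to human drivers around it.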
And if you're still worried about driverless cars causing more accidents on the roads, Volvo has stepped up as the first automaker to take responsibility for self-driving car accidents, and we expect other carmakers to follow.