Google’s automated cars may very well save groceries over babies
Dec 21, 2015, 7:57 AM | Updated: 9:07 am
(AP Photo/Tony Avelar, File)
There’s a problem with driverless vehicles: They obey laws at all times.
Google’s automated vehicles have been involved in 17 reported crashes over about 2 million miles, Bloomberg reports. Most of the crashes, all minor, were caused by human drivers hitting the driverless vehicles.
The cars have been programmed to never break the law, KIRO Radio’s Dave Ross further explained. They are infinitely patient. That has led humans to rear-end the cars, for example, as the cars edge forward in a less-than-human way.
Related: Seattle’s passive-aggressive pedestrians can’t be trusted around self-driving cars
Is it a fundamental flaw in driverless cars?
Not necessarily, University of Washington Professor of Law Ryan Calo explained.
“It is a challenge,” he told Ross. “Nobody should reasonably expect driverless cars to be free from accidents. The hope is that, given that so many of the traffic fatalities in America are due in large part to human error, driverless cars will cut down on accidents.”
That isn’t a bad trade-off, Ross said. Thirty thousand fender-benders would be better than 30,000 fatalities every year, he added.
But there could be reason for concern. Though an automated vehicle might be better at avoiding, say, a shopping cart or a stroller, it may not know what to do when confronted with both, Calo said. If one of the vehicles swerved to spare a shopping cart and hit a stroller instead, that would spell the end of them.
“Because the headline would read: ‘Robot car kills baby to save groceries,’” he said.
If a driverless car is involved in a crash, who is legally at fault?
Though some believe that’s a difficult question, there is no doubt in Calo’s mind.
“If you’re a manufacturer like Google and you build a product that’s supposed to do something and it doesn’t do it right, you are at fault,” he said.