Cars don’t kill people, people kill people. So let’s take the people out of the loop, KIRO Radio’s Dave Ross says.
That’s exactly what Google is trying to do with its driverless cars. The company recently received a letter from regulators stating that the software controlling its vehicles can be considered a driver.
The positive, Dave says, is that if Google is the driver, people would no longer be held accountable for crashes.
Former Attorney General Rob McKenna says that may be true, but it depends on what equipment is in a car. If a driver can take control at any point in time, it may complicate the issue. A human who takes control of a vehicle before a crash could be considered at fault.
Google would like to see cars become completely autonomous, as that would help reduce the potential for error. After all, humans are the primary cause of deadly crashes. The latest data from the National Highway Traffic Safety Administration shows that 32,675 people died in crashes in 2014, just a few hundred fewer than in 2013.
If cars did become completely autonomous, with no human interaction allowed, all liability would fall on the software maker, Dave points out. In that case, driverless vehicles would be flypaper for personal injury lawyers, because the company responsible for the software has deep pockets.