Survey: What should driverless cars be programmed to do in these situations?
Oct 5, 2016, 1:43 PM | Updated: 6:52 PM
Driverless cars might soon be commonplace, but there are a few quirks to work out first. Automakers are still figuring out just how “smart” these cars are really going to be. More importantly, can they make moral decisions when it counts?
“We’ve been engaged with both the public and private sector on how this technology is evolving and what the effects would be for the traveling public,” said Barbara LaBoe, with the Washington State Department of Transportation. “At this point, the technology is still evolving; I don’t even think the vehicle manufacturers can tell you how this will work.”
Before we see this technology on Seattle streets, developers are working on the basics of driverless cars, like programming them to drive in the rain. But what about the human decision-making factor behind the wheel? What if a pedestrian is not paying attention and walks into the roadway? Should the car swerve and harm its passenger, or sacrifice the person in the road? Who gets priority? One way that choice could be framed in code is sketched below.
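To make the dilemma concrete, here is a minimal sketch of one possible policy: minimize expected harm, and break ties against whoever entered the road illegally. This is entirely hypothetical; the Outcome type, the choose_action function, and the fault rule are illustrative assumptions, not anything a manufacturer has published.

```python
# Hypothetical sketch only; no automaker has published such a rule.
# It encodes one possible policy: minimize total expected harm, and
# break ties in favor of whoever followed the rules of the road.

from dataclasses import dataclass

@dataclass
class Outcome:
    description: str    # e.g., "swerve into a barricade"
    people_harmed: int  # expected number of people harmed
    at_fault: bool      # True if the harmed party entered the road illegally

def choose_action(outcomes: list[Outcome]) -> Outcome:
    """Pick the least-harm outcome; on ties, prefer harming the at-fault party."""
    return min(outcomes, key=lambda o: (o.people_harmed, not o.at_fault))

# The pedestrian-versus-passenger question from the article:
swerve = Outcome("swerve into a barricade, harming the passenger", 1, at_fault=False)
stay = Outcome("continue straight, harming the jaywalker", 1, at_fault=True)
print(choose_action([swerve, stay]).description)
```

Even this toy rule embeds a contested value judgment, that fault should break ties, which is exactly the kind of choice the survey below tries to probe.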
Programming driverless cars
That’s where MIT comes in with the Moral Machine. The “would-you-rather” style survey puts you behind the wheel of a self-driving car and makes you face all the tough moral decisions. For example, whom to kill on the street when there is no other choice.
The driving scenarios are specific: the brakes give out, and you have to decide the outcome. Will a homeless person die so you can survive, or will you steer the self-driving car into a barricade and die yourself?
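MIT has not published the survey’s internals, so the following is only a guess at how such randomly generated dilemmas might be represented. The Scenario type, the character list, and random_scenario are all assumptions for illustration.

```python
# Hypothetical data model for a Moral Machine-style dilemma; MIT has not
# published the survey's internal format, so this is only an illustration.

import random
from dataclasses import dataclass

CHARACTERS = ["passenger", "pedestrian", "homeless person", "cat", "doctor"]

@dataclass
class Scenario:
    stay_victims: list[str]    # who dies if the car continues straight
    swerve_victims: list[str]  # who dies if the car swerves into the barricade

def random_scenario() -> Scenario:
    """Build a random dilemma, mirroring the survey's randomly generated scenarios."""
    return Scenario(
        stay_victims=random.sample(CHARACTERS, k=random.randint(1, 3)),
        swerve_victims=random.sample(CHARACTERS, k=random.randint(1, 3)),
    )

scenario = random_scenario()
print("If the car stays its course:", scenario.stay_victims)
print("If the car swerves:", scenario.swerve_victims)
```

Pairing many such random scenarios with a respondent’s choices is what produces the judgment summary the article links to below.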
OK, it’s not a cheery thought, but it is interesting to see how your decisions compare to everyone else’s.
Click to see my answers. Spoiler alert: I killed a lot of cats. (The summary is based on your judgments of randomly generated scenarios.)
Click here to take the self-driving car survey yourself.