Why are Moral Decisions so Important for Self-Driving Cars?
Self-driving cars are in the news again, after several surveys were released showing how people would want their computer-controlled car to react in a situation where it must decide whether to save the driver or pedestrians. Most survey results suggested drivers would want the car to minimize the number of casualties, though in practice humans do not always choose that option, and sometimes end up killing more people to save themselves.
Continue Reading http://readwrite.com
Join the Discussion
Self-driving cars don't really distinguish between a human and a tree or a cardboard cutout. Computers have no morals.
If you're trying to work out what a computer was feeling, you're doing it wrong. It wasn't.
First off, if they hope to sell any of these things, they might want to keep their moral relativism to themselves. When people find out that their car might literally be programmed to kill them in certain scenarios, manual drive is gonna look pretty damn good.
That said, this is just the Trolley Problem all over again. People have been going over questions like this for thousands of years, and I'm sure they'll still be going over them thousands of years from now.