(Photo: Roman Boed)
This is absolutely fascinating!
The classical ethical dilemma goes something like this:
A train is about to crash into a bus full of people. If you do nothing, it will hit the bus and kill everyone aboard. If you switch the tracks, the train will instead hit and kill only one person. Do you switch the tracks?
Now let's update that dilemma and hand it over to a robot. Olivia Goldhill writes at Quartz:
Imagine you’re in a self-driving car, heading towards a collision with a group of pedestrians. The only other option is to drive off a cliff. What should the car do?
If you're the passenger, you have a lot at stake in the decision your robotic car makes. What should it do? I'm not sure, but psychological researchers led by Jean-François Bonnefon of the Toulouse School of Economics surveyed 900 people to find out what they thought the car should do:
They found that 75% of people thought the car should always swerve and kill the passenger, even to save just one pedestrian.
That's very noble of them. But according to Helen Frowe, a philosophy professor at Stockholm University, it can get more complicated:
For example, a self-driving car could contain four passengers, or perhaps two children in the backseat. How does the moral calculus change?
If the car’s passengers are all adults, Frowe believes that they should die to avoid hitting one pedestrian, because the adults have chosen to be in the car and so have more moral responsibility.
Although Frowe believes that children are not morally responsible, she still argues that it’s not morally permissible to kill one person in order to save the lives of two children.
-via Marilyn Bellamy