Ethical Question: Should a Self-Driving Car Kill You to Save Other People?


(Photo: Roman Boed)

This is absolutely fascinating!

The classical ethical dilemma goes something like this:

A train is about to crash into a bus full of people. If you do nothing, it will do so and kill them. If you switch the tracks, the train will instead hit and kill only one person. Do you switch the tracks?

Now let's update that dilemma and hand it over to a robot. Olivia Goldhill writes at Quartz:

Imagine you’re in a self-driving car, heading towards a collision with a group of pedestrians. The only other option is to drive off a cliff. What should the car do?

If you're the passenger, then you have a lot at stake in the decision that your robotic car makes. What should you do? I'm not sure, but psychology researchers led by Jean-François Bonnefon of the Toulouse School of Economics surveyed 900 people to ask what they thought the car should do:

They found that 75% of people thought the car should always swerve and kill the passenger, even to save just one pedestrian.

That's very noble of them. But according to Helen Frowe, a philosophy professor at Stockholm University, it can get more complicated:

For example, a self-driving car could contain four passengers, or perhaps two children in the backseat. How does the moral calculus change?

If the car’s passengers are all adults, Frowe believes that they should die to avoid hitting one pedestrian, because the adults have chosen to be in the car and so have more moral responsibility.

Although Frowe believes that children are not morally responsible, she still argues that it’s not morally permissible to kill one person in order to save the lives of two children.

-via Marilyn Bellamy

Should a car driving you alone sacrifice you to save two adult strangers?





I can't imagine wanting two other people to die just so I could live.

When I drive, I have chosen to take on the responsibility of operating a machine capable of killing people. It only seems fitting that, if possible, the negative consequences fall upon me rather than upon innocent pedestrians.

If they're jaywalking, then that's a different story. In that situation, I say run them all down.
I honestly don't see this as a moral problem but rather a technical/design problem.

For instance, in the example given, the car knows about both dangers far enough in advance to make a decision. Why couldn't it be designed to simply stop without hurting anyone? Or it could pick a fourth path: decide that hitting a wall at reduced speed is the better option, since the airbags and seat belts will protect the passengers sufficiently at the speed it knows it can slow to under the circumstances...

As long as the car can see far enough into its future, it should never have to make a moral decision; making the best decision at each moment is enough to keep everyone alive, as long as it has the technical safety design to implement whatever that decision requires. Short of another driver's active malicious interference, that is. (The sketch after this comment illustrates the idea.)

I'd be more worried about being in such a vehicle before all the bugs have been worked out of the programming.
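A minimal sketch of that "best decision at the moment" idea, assuming the planner already has a rough harm estimate for each candidate maneuver. Everything here (the Maneuver class, the numbers, the choose_maneuver function) is invented for illustration; a real autonomous-driving stack is nothing like this simple:

```python
from dataclasses import dataclass


@dataclass
class Maneuver:
    name: str
    expected_injuries: float  # assumed, pre-computed rough harm estimate
    feasible: bool            # can the car physically execute it in time?


def choose_maneuver(options: list[Maneuver]) -> Maneuver:
    """Pick the feasible maneuver with the lowest expected harm."""
    feasible = [m for m in options if m.feasible]
    if not feasible:
        raise RuntimeError("no feasible maneuver available")
    return min(feasible, key=lambda m: m.expected_injuries)


if __name__ == "__main__":
    options = [
        Maneuver("continue straight into pedestrians", 3.0, True),
        Maneuver("swerve off the cliff", 1.0, True),
        Maneuver("hard brake", 0.3, True),
        Maneuver("brake and glance off the wall", 0.1, True),
    ]
    print(choose_maneuver(options).name)  # -> "brake and glance off the wall"
```

The point of the sketch is only that, if the car's prediction is good enough, the "moral" choice collapses into ordinary cost minimization over whatever options it can still execute, which is the commenter's argument.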
When getting into a car, everybody hopes not to be involved in an accident with deadly consequences during the trip...
However, accidents happen and people die in them; even though it is rather improbable, a residual risk remains that an accident will occur and someone will die.
Still, people use cars while accepting this small chance of dying... In Germany in 2014, 3,377 people died in traffic accidents...
On the other hand, people take part in lotteries with a much smaller chance of winning, as far fewer people win 1,000,000 bucks or more each year.
In my opinion, self-driving cars should increase the safety of traffic, but getting into a self-driving car will still carry the residual risk of the car killing you or someone else in an accident.
Finally, I would not like the car to decide who is going to die...
Forget self-driving: I'd be happy with a car that can change its own spark plugs, because some idiot engineer at Ford decided it would be cute to put one of my Escort's spark plugs (and there are only 4 of them!) right under the alternator, so you can't get to that one plug without first removing the alternator, the belt, the brackets, and the electrical plugs that go into the alternator. Plus... that spark plug is accessed through a narrow trench of sharp metal corners from the head cover, so no matter what you do, you WILL cut your knuckles on those corners and bleed all over the engine bay.
I hate you, idiot engineer at Ford.
The "smart cars" should preserve the life of it's passengers then itself. There should be no morality beyond that.
(serious part of this post ends here)

If there are no passengers and no witnesses then it can proceed on its killing spree!