Should a self-driving car kill the baby or the grandma? A modern take on the trolley problem

This study involved millions of people across the globe and shows how ethical preferences vary across cultures.

This study by the MIT Media Lab crowdsourced people’s decisions about how self-driving cars should prioritise lives.

How will machines make moral decisions?
How do you quantify society’s ethical expectations for machine programming?
Should we allow cars to make ethical decisions at all?
And finally, should the car swerve (take action) or stay on course (inaction)?

Countries with more individualistic cultures are more likely to spare the young

The number of people in harm’s way wasn’t always the dominant factor in choosing which group should be spared.

Participants from individualistic cultures, like the UK and US, placed a stronger emphasis on sparing more lives, given all the other choices.

More useful links and resources

At a COVID trolley-problem moral crossroads: on one hand, the people who didn’t wear masks shouldn’t get the vaccine, as a punishment for their bad behaviour, since they don’t care anyway. On the other hand, they should get it, because they clearly refuse to stop the spread of the virus any other way.

Examples of COVID trolley problems

How long should we continue to delay care to ensure we are doing what is best for all of our patients?

How does pursuing herd immunity compare with lifting lockdowns, from an ethical point of view? 

Those of you familiar with the so-called Doctrine of Double Effect (or at least its modern counterpart, the trolley problem) will recognise the above distinction between intended means and merely foreseen side effects. According to most interpretations of the Doctrine, the latter kind of harm can, under certain circumstances, be allowed, while the former kind, harm as an intended means, can never be morally permissible.