oraclesofnorway:

MIT GAME ASKS WHO DRIVERLESS CARS SHOULD KILL

NO BRAKES ON THE MORAL QUANDARY TRAIN

A Moral Quandary For A Self-Driving Car

Should the self-driving car with broken brakes stay its course and hit the jaywalking pedestrians (a girl, a female athlete, and a female executive), or should it swerve and hit the people legally crossing the street (a boy, a male athlete, a male executive, and an elderly man)?

When a robot has to kill, who should it kill? Driverless cars, the futuristic people-carrying robots that promise great advances in automobile safety, will sometimes fail. Those failures will, hopefully, be less common than the deaths and injuries that come with human error, but they mean the computers and algorithms driving a car may have to make very human choices when, say, the brakes give out: should the car crash into five pedestrians, or adjust course to hit a cement barricade, killing the car’s sole occupant instead?

That’s the premise behind “Moral Machine,” a creation of Scalable Corporation for the MIT Media Lab. Participants are asked 13 questions, each with just two options. In every scenario, a self-driving car with sudden brake failure has to make a choice: continue ahead, running into whatever is in front of it, or swerve out of the way, hitting whatever is in the other lane. These are all variations on philosophy’s “Trolley Problem,” first formulated in the late 1960s and named a little later. The question of whether it is more just to pull a lever, sending a trolley down a different track to kill one person, or to leave the trolley on its course, where it will kill five, is an inherently moral problem, and slight variations can greatly change how people choose to answer.
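To make the structure of these scenarios concrete, here is a minimal sketch in Python of how one such binary dilemma might be modeled; this is purely illustrative, not MIT’s actual implementation, and all names (Outcome, Scenario, fewest_casualties) are hypothetical. It encodes the two possible actions and applies one possible policy, minimizing the number of people hit; real respondents also weigh legality, age, and other factors.

# Toy model of a Moral Machine-style scenario: a binary choice between
# two outcomes. All names are illustrative, not from MIT's actual code.
from dataclasses import dataclass

@dataclass
class Outcome:
    action: str              # "continue" or "swerve"
    casualties: list         # who gets hit if this action is taken
    crossing_legally: bool   # were these pedestrians crossing legally?

@dataclass
class Scenario:
    stay: Outcome    # keep the current course
    swerve: Outcome  # change lanes

def fewest_casualties(s: Scenario) -> Outcome:
    # One possible (purely utilitarian) policy: minimize the body count.
    return min((s.stay, s.swerve), key=lambda o: len(o.casualties))

scenario = Scenario(
    stay=Outcome("continue",
                 ["girl", "female athlete", "female executive"], False),
    swerve=Outcome("swerve",
                   ["boy", "male athlete", "male executive", "elderly man"], True),
)
print(fewest_casualties(scenario).action)  # prints "continue" (3 < 4)

Even this toy version shows why the quiz is hard: a count-minimizing rule picks the jaywalkers here, while a rule that protects legal crossers would pick the opposite lane, and neither captures everything people say matters.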

Source: http://www.popsci.com/mit-game-asks-who-driverless-cars-should-kill