The Moral Machine: Decide Who Lives in Tough Situations
Adam Rifkin stashed this in Self-driving Cars
Source: http://moralmachine.mit.edu
Stashed in: Awesome, Ethics, MIT TR, Morals, AI, Artificial Intelligence, Self Test
Top Reddit comment:
This has been posted before and has several shortcomings.
If you approach it systematically with a series of rules, it will credit you with "valuing" demographic attributes that had no effect on your decision. For example, if you choose to swerve into a second crosswalk showing "no walking" in order to save people in a "walking" area, you will be scored based on the makeup of the "no walking" group, even though that makeup played no part in your decision.
For this test to be accurate, it would need to present each traffic situation with every possible group composition, so that rule-based choices could be distinguished from demographic preferences.
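The confound the comment describes can be sketched in a few lines. In this hypothetical example (scenario data invented for illustration), a responder follows one fixed rule, always sparing the legally crossing group, yet a naive per-trial tally still "discovers" an age preference, because age happened to correlate with which group was jaywalking:

```python
from collections import Counter

# Each scenario: (legal-crossing group, jaywalking group), described by age.
# Makeups are arbitrary; note the jaywalkers just happen to skew elderly.
scenarios = [
    (["young", "young"], ["elderly", "elderly"]),
    (["elderly", "young"], ["elderly", "young"]),
    (["young", "elderly"], ["elderly", "elderly"]),
]

killed = Counter()
for legal, jaywalking in scenarios:
    # Fixed rule: swerve into the jaywalking group, whoever they are.
    killed.update(jaywalking)

print(dict(killed))  # {'elderly': 5, 'young': 1}
# A naive scorer attributes these deaths to an age preference,
# even though age never entered the decision rule.
assert killed["elderly"] > killed["young"]
```

Unless every scenario is shown with every group composition, the tally cannot separate "devalues the elderly" from "always punishes jaywalking."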
11:20 PM Oct 04 2016