Self-driving cars will likely have to deal with the harsh reality of deciding who lives and who dies.
Adam Rifkin stashed this in Ethics
Stashed in: Will Smith, Self-driving Cars, Tesla!, Self-driving Trucks
Top Reddit comment:
No programmer would ever have the car make utility calculations in a life-or-death scenario, because if you don't have enough information to avoid the situation, you don't have enough information to act proactively during it. And that's assuming no software bugs!
You would never program a vehicle to swerve off a cliff, because what if there's a bug in the code and it triggers accidentally when a bird shits on the camera? Now you've just randomly murdered a family.
The car will always try to "just stop."
Swerving into legal empty space while braking is ok. That still falls under "just stop." The article is talking about the car deciding between hitting teenagers or elderly people, or between hitting people crossing against the light vs people crossing legally, or about throwing yourself off a cliff to avoid hitting a group of people. These situations are patently ridiculous.
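To make the commenter's logic concrete, here is a minimal sketch of what a "just stop" policy looks like. Everything in it is hypothetical and invented for illustration (the LaneOption type, the emergency_plan function, the field names), not anything from a real vendor's stack: the only decisions are how hard to brake (always fully) and whether a legal, verifiably empty space exists to shift into while braking.

```python
from dataclasses import dataclass

@dataclass
class LaneOption:
    """An adjacent space the car could shift into while braking (hypothetical)."""
    is_legal: bool  # e.g. an open shoulder or an empty adjacent lane
    is_clear: bool  # sensors report no objects occupying the space

def emergency_plan(adjacent: list[LaneOption]) -> dict:
    """Illustrative 'just stop' planner: brake at full force, and shift
    laterally only into space that is both legal and verifiably empty.
    No utility calculation over who or what the obstacle is ever happens;
    on any doubt, the default is straight-line braking.
    """
    plan = {"brake": 1.0, "swerve_to": None}  # default: brake hard, hold the lane
    for i, lane in enumerate(adjacent):
        if lane.is_legal and lane.is_clear:
            plan["swerve_to"] = i  # still "just stop", merely in safer space
            break
    return plan

# One clear, legal shoulder available -> brake fully and shift into it.
print(emergency_plan([LaneOption(is_legal=True, is_clear=True)]))
# {'brake': 1.0, 'swerve_to': 0}

# Nothing both legal and clear -> brake fully in a straight line.
print(emergency_plan([LaneOption(is_legal=False, is_clear=True)]))
# {'brake': 1.0, 'swerve_to': None}
```

The design point is that the failure mode is bounded: a sensor glitch can at worst keep the car braking in its own lane, never steer it off a cliff.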
This simple truck design, the side underride guards required on trucks in Europe, might have prevented the fatal Tesla Model S crash:
Source: http://bgr.com/2016/07/05/tesla-model-s-crash-truck-design-europe/
Comments:
https://reddit.com/r/Futurology/comments/4rioxn/this_simple_truck_design_could_have_prevented_the/
10:12 PM Jul 11 2016