I saw a question on the internet asking whether people would agree to implement an efficient life-saving system that makes decisions by itself. Most people agreed, but they did not want to be the ones subject to that system's decisions. It is the same kind of choice posed in Will Smith's movie I, Robot.
So I think the problem with our decision is that we don't want to be killed ruthlessly or emotionlessly in that situation. What if the robot felt the same pain of losing a loved one, or felt the weight of killing a human? Would the system be acceptable then?
So what are we looking for when we die? And under what conditions is the efficient choice acceptable?