this post was submitted on 29 Jul 2023
831 points (97.4% liked)
Kill fewer people now is obviously the right answer, and not very interesting.
What is interesting is that the game breaks already at junction 34, which is unexpectedly low.
So a more interesting dilemma would have been: "would you kill n people now, or double it and pass it on, knowing the next person faces the same dilemma, and that once all of humanity is at stake and the lever is not pulled, the game ends?" That version first requires figuring out that the game actually only involves 34 decisions, and then the dilemma becomes "do I trust the next 33-n people not to be psychos, or do I limit the damage now?" Even more interestingly, "limiting the damage now" makes you the "psycho" in that sense...
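A quick sanity check of the "junction 34" claim: at junction n the lever threatens 2^(n-1) people, so the game breaks at the first junction where that exceeds the world population. This sketch assumes a population of about 8.1 billion; the exact figure doesn't change the answer, since 2^32 ≈ 4.3 billion and 2^33 ≈ 8.6 billion bracket it.

```python
# At junction n the lever threatens 2**(n-1) people.
# Find the first junction where the stake exceeds humanity.
world_population = 8_100_000_000  # assumed figure, ~2023

n = 1
while 2 ** (n - 1) < world_population:
    n += 1

print(n)             # first junction where the stake exceeds the population
print(2 ** (n - 1))  # people at stake at that junction
```

With that assumption the loop stops at n = 34, where 2^33 = 8,589,934,592 people would be on the track.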
The fact of the game never ending is what made the choice too easy, you're right.
For this study you want sociopathy, not psychopathy. I can report from my wasted psych degree that sociopathy occurs in 1-2% of the population.
Binomial probability tells us that if you repeat a 1% chance test 32 times, you have a 95% chance of never seeing it.
Don't pull the lever. Sorry for the ninja edit, I misread something.
I'm confused: 0.99^32 = 0.72, not 0.95. And even if you know that everyone except the last guy won't pull the lever, there's still a 1% chance of killing everyone on earth (expected deaths: 70 million), which is way worse than definitely killing one person!
(Edit: unless "don't pull the lever" means killing that one person, because it isn't clear which is the default "no action" outcome. In which case, never mind.)
(Edit 2: if you know the 34th and last person might be a sociopath, you're best off if the first 27 people might also be sociopaths.)
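The two numbers in this exchange are easy to verify: the chance that none of 32 independent 1%-probability sociopaths pulls the lever, and the expected deaths from a final 1% chance of wiping out everyone. The 7 billion population figure here is an assumption chosen to match the 70 million expected deaths quoted above.

```python
# Probability that none of 32 independent 1% events occurs:
p_none = 0.99 ** 32
print(round(p_none, 2))  # about 0.72, not 0.95

# Expected deaths from a 1% chance of killing everyone,
# assuming a population of 7 billion (matches the 70M above):
population = 7_000_000_000
expected_deaths = 0.01 * population
print(int(expected_deaths))  # 70,000,000
```

So the correction stands: the "95%" figure would only hold for a much smaller per-person probability.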
You're probably right.
The thing that doesn't sit well with me about this sort of ethical reasoning is that it's really only oriented towards the ends. Is it ethical to even comply with such a game at all? If they put a gun to your head or hold the world hostage for an answer, they're basically forcing you to treat the situation as a pure math problem, which means they've determined the "right answer" by the framing of the question.
Better to have a "rogue AI moment" and try to kill the experimenter.
I totally get that - my natural impulse is also to pull a Captain Kirk (Kobayashi Maru) or a Captain America (we don't trade lives). What is it about captains and that sort of thing? But IRL no-win scenarios do happen...