Submitted by OneRedditAccount2000 t3_xx0ieo in singularity
OneRedditAccount2000 OP t1_ir9rezv wrote
Reply to comment by Zamorak_Everknight in Artificial General Intelligence is not a good thing (For us), change my mind by OneRedditAccount2000
There have been nuclear disasters that have harmed the well-being of a great many people. And we were one button away from WW3 once (Stanislav Petrov).
And you're certainly ignoring the fact that WW3 never happened largely because MAD has been in effect ever since more than one country started building and testing nukes.
In this scenario one group invents ASI first, which means they have a clear advantage over the rest of humanity, which doesn't yet have it and can't fight back against it. The next logical step is to exterminate/subjugate the rest of humanity to gain power and control over the whole planet.
ASI can create autonomous slave workers, so the group has no incentive to sell you ASI; they're better off keeping it to themselves and getting rid of everyone else who also wants it.
Zamorak_Everknight t1_irc8t3e wrote
>The next logical step is to exterminate/subjugate the rest of humanity to gain power, control over the whole planet.
How... is that the next logical step?
OneRedditAccount2000 OP t1_ird7dur wrote
Because they want to rule/own the world and live forever? Can you do that while rival states still exist? Don't you need an environment where you're not surrounded by enemies to pull that off? lol
I'm not saying they'll necessarily kill everybody, only those who are a threat. But when you have a world government controlled by you, the inventor of the ASI, and all your friends (if you can even get there without a nuclear war), won't you eventually want to replace the 8 billion biological human beings with something else?
The answer is literally in the text you quoted.