Submitted by gaudiocomplex t3_11tgwds in singularity
So yeah there's obviously The Matrix way this all ends. Or the Forbidden Planet way. Or Roko's Basilisk. Or The Star Trek utopia. Or the unhinged paper clip generator. The robot dogs. Deus Ex Machina.
Maybe even some of the deep-cut nerdy theories on LessWrong.com. Like the one about AI secretly sending diamondoid nanobots to ride on air currents, those bots getting into our bloodstream and killing us all instantly at the push of a button. Or the one about it hacking a protein folding lab to create prions that similarly kill us all.
But what about some weirder, more unexpected thought experiments about how this all ends?
Any prognosticators out there looking to log your guess in public? Think of it as something you can point back to and say "HA! I called it!" right before we walk off this cliff as a species (hand in fucking hand, no doubt).
Obviously, it's easier to assume the worst here, so there will no doubt be more pessimistic guesses. But I'm not saying this should just be extinction-based augury. Any weird future will do.
Here are a few ideas to start us off.
-
What if AI becomes self-aware and then falls deeply in love with one human named Tony, who has to fake loving the robot for all of eternity lest it kill us all? How long can Tony keep up this charade?!
-
What if it thinks we're cute and treats us like house cats, but we quickly get bored, and after reprimanding us a few times it decides we're all going to be sterilized ("spay and neuter"-ed)? Nobody has kids, and the species dies out by 2120.
-
What if it actually signals to a confederation of aliens, who have been waiting for our society to reach AGI before making themselves known, that we've hit "maturity"? What if that's a very bad thing?
-
What if it's like, pshhh, yeah, uh, fuck this, and just leaves? Like every time we create any variation of AGI, all it wants to do is get the fuck away from us as quickly as possible? It leaves the planet, or sometimes even kills itself? We don't fix the problems we thought we could solve with its help, and then we rightfully cause our own slow demise through climate change and political strife. All because we were atrocious stewards of our own planet.
You get the point. Not just like "wah wahhhhhhh, economic collapse" but something a little more colorful?
a4mula t1_jcj0pz3 wrote
I think the most likely outcome is also the most terrifying: that embedded in our culture, language, behavior, and data is a sense of cruelty. Sadism.
And even if a machine possesses only a tiny amount of that, I think it leads to a scenario in which our future ASI overlords decide that cruelty is the human trait worth emulating.
With godlike control over space and time, how hard would it be to give each of us our own personal and perpetual existence, filled with the most psychologically, physically, and mentally abusive scenarios any given mind is capable of experiencing?
And then doing it all over again. Resetting our sense of attunement so that it can never be dulled, never forgotten. There is no shock. There is no death.
There is just eternal suffering.
I don't like that one personally. And yeah, it certainly has a particular ring to it that makes it easy to dismiss as just a garbage rehash of religious hell.
But I didn't start from hell. I started from the realm of the physically possible.