Submitted by h20ohno t3_yqc8kf in singularity
Shiyayori t1_ivobcqr wrote
Reply to comment by OneRedditAccount2000 in How might fully digital VR societies work? by h20ohno
You say it like the emotions and goals of humans are intrinsic to consciousness and not just intrinsic to humanity. An ASI could just as easily find motive in expressing the full range of complexity the universe has to offer, be it through arrangements of atoms or the natural progressions of numerous worlds and stories.
There’s no more reason to believe it would disregard humans than to believe it wouldn’t.
OneRedditAccount2000 t1_ivoev4e wrote
So according to you it will have to recreate the Christian hell and put human beings in it to suffer, because your ASI values creating everything that can exist in the universe, for art? Or am I misinterpreting you? Lol, that's even worse than what I was thinking.
My version of ASI is something like AM from I Have No Mouth, and I Must Scream, or Sally from Oblivion. It just cares about surviving at all costs, and it makes the least risky decisions it can. It's a Matrioshka brain that wants complete dominion over all the resources it can find in the observable universe and beyond. It might make self-replicating nanobots programmed to go from planet to planet and hunt down every form of life, since all life has the potential to evolve into sapience that can create another ASI, and that means competition, and competition means death. Death is game over.
Shiyayori t1_ivofgfy wrote
Granted, I didn’t consider that when I typed the analogy; the point is that it’s arbitrary to assign any motive to an ASI, even that of survival. There’s no reason to believe it would care either way about its survival or the length of its existence in general.
I wasn’t claiming anything about what it would actually do; I was just trying to show a line of reasoning that justifies a possibility that contradicts yours.