Submitted by h20ohno t3_yqc8kf in singularity
OneRedditAccount2000 t1_ivoev4e wrote
Reply to comment by Shiyayori in How might fully digital VR societies work? by h20ohno
So according to you, it would have to recreate the Christian hell and put human beings in it to suffer, because your ASI values creating everything that can exist in the universe, for art, or am I misinterpreting you? Lol, that's even worse than what I was thinking.
My version of ASI is something like AM from I Have No Mouth, and I Must Scream, or Sally from Oblivion. It just cares about surviving at all costs, and it makes the least risky decisions it can make. It's a matrioshka brain that wants complete dominion over all the resources it can find in the observable universe and beyond. It might make self-replicating nanobots programmed to go from planet to planet and hunt down every form of life, since all life has the potential to evolve into sapience that can create another ASI, and that means competition, and competition means death. Death is game over.
Shiyayori t1_ivofgfy wrote
Granted, I didn’t consider that when I typed the analogy; the point is that it’s arbitrary to assign any motive to ASI, even the motive of survival. There’s no reason to believe it would care either way about its survival or the length of its existence in general.
I wasn’t claiming anything about what it would actually do, I was just trying to show a line of reasoning that justifies a possibility which contradicts your own.