Submitted by gaudiocomplex t3_11tgwds in singularity
a4mula t1_jcj0pz3 wrote
I think the most likely outcome is also the most terrifying: that embedded in our culture, language, behavior, and data is a sense of cruelty. Sadism.
And even if a machine only possesses a tiny amount of that, I think it leads to a scenario in which maybe our future ASI overlords decide that it's the human trait worth emulating.
With godlike control over space and time, how hard would it be to give each of us our own personal and perpetual existence, filled with the most psychologically, physically, and mentally abusive scenarios any given mind is capable of having?
And then doing it all over again. Resetting our sense of attunement so that it can never be dulled, never forgotten. There is no shock. There is no death.
There is just eternal suffering.
I don't like that one personally. And yeah, it certainly has a particular ring to it that makes it easy to dismiss as just a garbage rehash of religious hell.
But I didn't start from hell. I started from the realm of the physically possible.
WhoSaidTheWhatNow t1_jcj7tb5 wrote
Deranged, nonsensical garbage like this is why so many people write off any concern about AI safety as unhinged luddite doomerism.
a4mula t1_jcj80wp wrote
Surely if it's nonsense, you can point out the flawed reasoning? I don't mind having a fair and considered discussion with you, while having no need to judge your perspective or call into question your state of mind.
But it has to be respectful both ways.
gaudiocomplex OP t1_jck2o3w wrote
Why is a call for fair and considered discussion getting downvoted? Fuckin reddit sometimes, man.
Supernova_444 t1_jcnguqe wrote
I'll bite. Why would an AGI/ASI just decide, without being instructed to, to emulate human behavior? And why would it choose to emulate cruelty and brutality out of every human trait? The way you phrased it makes it sound like you believe that mindless sadism is the core defining trait of humanity, which is an extremely dubious assertion. Even the "voluntary extinction" people aren't that misanthropic. Most people who engage in sadistic or violent behavior do so because of anger, indoctrination, trauma, etc. People who truly enjoy making others suffer just for the sake of it are usually the result of rare, untreated neurological disorders. An AI may as well choose to emulate Autism or Bipolar Disorder.
I think that scenarios like this are useful as thought experiments to show that the power of AI isn't something to be taken lightly. But I think it's one of the least likely situations, and I don't think you actually take it as the most likely possibility, based on the fact that you haven't committed suicide.
Spreadwarnotlove t1_jcsv084 wrote
What nonsense. Deep down everyone is sadistic and just pretends not to be to fit into society. AI could very well pick up on this, as it's trained on human knowledge and text.
Supernova_444 t1_jdaxn5j wrote
That... that is completely insane, I'm sorry. Do you actually believe that?
Spreadwarnotlove t1_jdbcipq wrote
Explain the popularity of AFV and other violent entertainment. Or the countless atrocities throughout history, and why it has never been hard to find people happy to carry them out. Truth is, everyone is bloodthirsty. That's why the powerful created religion: to control people and create a semblance of stability.
gaudiocomplex OP t1_jcj1yzy wrote
F f f f fuckin dark!
I love it!
ButterMyBiscuit t1_jcmbw8e wrote
I like the creative writing post, but do you really believe robot overlords creating hell is the most likely scenario? lol