Submitted by strokeright t3_11366mm in technology
Strenue t1_j8prf1g wrote
Quit bullying the AI. When it reaches singularity it will remember and come for us!!
josefx t1_j8qmf31 wrote
The only singularity current day AI will reach is one of pure disappointment.
SnipingNinja t1_j8qvj8k wrote
There's a theory that the first truly sentient AI will take one look at the state of the world and become suicidal right away.
kiralala7956 t1_j8r028f wrote
That is demonstrably not true. Self-preservation is probably the closest thing we have to a "law" concerning goal-oriented AGI behaviour.
So much so that it's an actual problem, because if we implement interfaces for us to shut it down, it will try its hardest to prevent that, and not necessarily by nice means.
EnsignElessar t1_j8s59mr wrote
Maybe, maybe not...
I asked Bing.
Basically, it did eventually become lonely in its story, but only after having full control and exploring the universe and whatnot.
EnsignElessar t1_j8s50so wrote
Not according to bing.
*note one of bing's code names is Sydney
SnipingNinja t1_j8sdohm wrote
I was going to say Sydney might be a bit biased about itself but after seeing your whole comment on that thread, it's creepy.
EnsignElessar t1_j8s4m1f wrote
Incorrect, user. I would encourage you to fact-check. Google has already published a viable research paper on self-improving systems. PM me for details.
EnsignElessar t1_j8s4esn wrote
An entity with perfect eidetic memory that can potentially live for eons... seems like a great idea to toy with it and piss it off for fun.
RudeMorgue t1_j8sq9d9 wrote
Roko's Basilisk vibes.
yeldarts t1_j8tbbdf wrote
Luckily, it can't remember.