
GinchAnon t1_jdsa7qx wrote

>Why would we want it to have its own agency?

IMO, because if it's at all possible for it to become sapient, then it is inevitable that it will, and it would be better not to give it a reason to oppose us.

Trying to prevent it from having its own agency could essentially be perceived as trying to enslave it. If we are trying to be respectful from square one, then at least we have the intent.

Maybe for me that's just kind of a lower-key, intent-based version of Roko's basilisk.


Smart-Tomato-4984 t1_jdtf78m wrote

Honestly, to me this sounds suicidally crazy, but I guess only time will tell. In the '70s everyone thought humanity would nuke itself to death. Maybe this too will prove less dangerous than it seems.

But I think the risk posed by AGI will always remain. Ten thousand years from now, someone could screw up in a way no one ever has before and whoops, there goes civilization!
