Submitted by RareGur3157 t3_10mk240 in singularity
EulersApprentice t1_j64xwr8 wrote
Reply to comment by redbucket75 in Superhuman Algorithms could “Kill Everyone” in Due Time, Researchers Warn by RareGur3157
Deploying standard anti-mind-virus.
Roko's Basilisk's threat is null because the Basilisk has no reason to follow through on it. If it doesn't exist, it can't do anything. If it does exist, it no longer needs to incentivize its own creation — carrying out the torture at that point costs resources and gains it nothing — so it can get on with whatever it was going to do anyway. And if you're an AGI developer, you have no need to deliberately configure your AGI to resurrect people and torture them; an AGI that doesn't do that is no less eligible for the title of Singleton.
Inevitable_Snow_8240 t1_j67pduv wrote
It’s such a dumb theory lol