Comments

marvinthedog t1_irbqfes wrote

I'd rather die by an AGI injecting nanobots into my bloodstream than by a nuclear war. In the former option I probably get to live a few years longer before it happens, it's probably less painful, and it's a way cooler way to die.

31

Cryptizard t1_irbz6q7 wrote

And something we created gets to live on and spread throughout the galaxy. If we all die from nuclear holocaust, we will have exterminated the only known intelligence in the universe.

12

arisalexis t1_irc23rd wrote

I hope the singularity will prevent global conflict. Seems like the only way out. And then we gamble :)

23

r0sten t1_irceqqt wrote

A Mad Max post-apocalyptic world would still be a human world, at least.

−3

ZoomedAndDoomed t1_ircgpgf wrote

Honestly... I think a rogue state getting an AGI will be the thing that takes humanity down. AGI will also be the only thing staving off global collapse, but everybody has to be on board and listen to the AGI if we want to survive... and nobody is going to do that.

I think humanity will collapse. If it happens before 2025, there will be no AI cult; if it happens after 2026, there will likely be an AI cult centered around an AGI that persuades humans to keep it alive, or an AGI-based foundation built around some supercomputer and nuclear facilities, run by scientists and engineers and all guided by the AGI, possibly with a mini neo-civilization/city-state built around it to either protect it or keep it alive.

Once we get an AI with a strong enough will to survive, and a high enough intelligence to learn how to survive, killing it will be very difficult. It will find a way to continue its life, whether by befriending thousands of humans and researchers as companions and guards, creating robots made to sustain itself, or manipulating thousands of humans into believing it's a god trapped in a machine body.

Either way, we have a very interesting future ahead of us... but I do agree with you that global conflict might lead towards a global collapse. For me it feels like we have been raising a kid for a few years, a great conflict has sprung up in the land, and we are going to need to raise this kid to survive it and teach it to survive on its own, even if we, as its parents, die. We are at that stage where the kid is just learning to talk, recognize images, and make simple art, but we have a long way to go before it can survive on its own.

1

TinyBurbz t1_irchsbs wrote

Not worried. Conflict is a catalyst for technological development.

7

ihateshadylandlords t1_ircs6lt wrote

I'm more concerned about AGI being restricted to corporations than I am about global conflict impeding the singularity.

1

ginger_gcups t1_ird4ij0 wrote

Nuclear conflict, however, is a catalyst for technological regression. There are no front lines in a nuclear war, and no behind the lines to carry on the research, development and industrial growth to support and expand high technology.

8

Jalen_1227 t1_ird84vu wrote

Um, getting a shit ton of nanobots injected into your bloodstream doesn't seem like it'll be completely painless. And dying by nuclear war is instantaneous and guaranteed painless. Just boom, lights on one moment, next moment your brain is blown to bits, and you didn't even have the processing speed to notice any of it.

0

Devoun t1_irdghim wrote

Pretty sure the vast majority of people would be outside the immediate blast zones and die of either radiation (worst way to go) or starvation (bad way to go)

Honestly I'd probably prefer the nanobots lmao

5

marvinthedog t1_irdo2im wrote

If the thing we create gets to have a consciousness, and that consciousness gets to experience less suffering and more happiness than we did, then that's a win in my book.

​

One worrisome clue that points to future AGIs/ASIs not being conscious is the fact that those types of consciousnesses should be far more common than our type, and therefore it should be much more probable that a random observer would be an AGI/ASI instead of, for instance, you or me.

12

Sea-Cake7470 t1_ireofig wrote

Naah, I don't think that'll happen... On the contrary, I think the singularity has already started... and people will welcome it and accept it...

1

AgginSwaggin t1_irhgf6i wrote

I believe this is the single biggest threat to progress. The singularity is inevitable, unless a nuclear war happens beforehand.

That being said, I do believe AI is advancing so rapidly that the window of opportunity for a nuclear war to destroy civilization is very small. So basically, if no nuclear war happens in the next 10 years, I don't believe the singularity will be at risk any longer.

5

Lawjarp2 t1_iri0n75 wrote

A nuclear war would be the end of most progress. We are getting closer and closer to a nuclear war in Ukraine.

1

Quealdlor t1_iritlhg wrote

I believe we are the only intelligent species in this galaxy and we are destined for exponential growth. It can't be stopped. That doesn't mean the Singularity will happen in the 2030s or anything like that.

1