
Darustc4 OP t1_je9m2ce wrote

I don't consider myself part of the EY cult, but I must admit that AI progress is getting out of hand and we really do NOT have a plan. Creating a super-intelligent entity with fingers in every pie in the world, and humans having absolutely no control over it, is straight up crazy to me. It could end up working out somehow, but it could also very well devolve into the complete destruction of society.

1

SkyeandJett t1_je9makr wrote

Yeah I'm MUCH more worried about being blown up in WW3 over AI dominance than a malevolent ASI deciding to kill us all.

7

acutelychronicpanic t1_je9mzk9 wrote

The problem is that it's impossible, literally impossible, to enforce this globally unless you actively desire a world war plus an authoritarian surveillance state.

Compact models running on consumer PCs obviously aren't as powerful as SOTA models, but they're getting much better very rapidly. Any group with a few hundred graphics cards may be able to build an AGI at some point in the coming decades.

6

Embarrassed-Bison767 t1_je9y6v3 wrote

If AI doesn't collapse civilization, the combination of climate change and rapidly diminishing resources leading to a WW III will. Those two things combined have a 100% chance of destroying civilization. AI has a less than 100% chance of doing so. It's the better thing to aim for even with 99.9% certainty of destruction, because destruction under the status quo is guaranteed.

5