
supernerd321 t1_it46l21 wrote

I think the singularity will happen in a completely unpredictable and unexpected way as well; however, I predict it will also be anticlimactic, almost boring, when it happens - "it" being AGI that can self-improve, leading to ASI.

Why? Because I think we'll be the first ASI. We will augment our working-memory capacity, most likely through biotechnology advances that come about as byproducts of narrow AI, and will immediately increase our fluid intelligence in proportion to what we think of as ASI.

So the AGI will be trivial for us to create at that point. We'll realize we can never exceed ourselves by creating something more intelligent, because we'll always be able to keep up by augmenting our brains, the same way we've augmented verbal memory through search and smartphones. It'll create a runaway race condition between organic and synthetic life forms whose end is never attainable.

The singularity, by definition, will never be realizable, because our intelligence will increase proportionally in a way that is inseparable from AGI itself.

23

AdditionalPizza OP t1_it49ija wrote

I think systems that are close enough to AGI in almost every aspect will cause large-scale disruption to society and humanity. AGI will probably be claimed before true, full AGI is developed, and at that point it probably won't matter whether something is fully AGI. I think these proto-AGIs will arrive much sooner than we'll be augmenting ourselves - 5 years maybe, possibly 3 or 4. My answer will probably change in 6 months to a year.

24

mrpimpunicorn t1_it67xpb wrote

There's a hard physical limit to the amount of information processing that can happen in a given volume of space. I'm fairly certain the optimal arrangement of matter within that space will not be biological in nature (at least not eukaryotic), so there is a hard limit to intelligence for humans who want to remain made out of meat.
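For a sense of scale, the kind of limit being referenced here is something like the Bekenstein bound, which caps the bits storable in a region by its radius and energy content. A rough back-of-the-envelope sketch, assuming very approximate brain figures (~0.1 m radius, ~1.4 kg of mass-energy):

```python
import math

# Bekenstein bound: I_max = 2*pi*R*E / (hbar * c * ln 2) bits
# for a sphere of radius R (meters) containing total energy E (joules).
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s

def bekenstein_bits(radius_m, mass_kg):
    energy = mass_kg * c**2  # treat all mass as energy via E = m c^2
    return 2 * math.pi * radius_m * energy / (hbar * c * math.log(2))

# Assumed, very rough human-brain figures: 0.1 m radius, 1.4 kg mass.
bits = bekenstein_bits(0.1, 1.4)
print(f"{bits:.2e}")  # on the order of 1e42 bits
```

That ~10^42-bit ceiling is astronomically far above anything a biological brain uses, so the practical argument is about achievable density, not the ultimate bound.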

8

AdditionalPizza OP t1_it6w0ya wrote

I don't love talking about tech I have no idea about, but I'd argue that in that case a sort of cloud computing could be possible - expanding our brains with a "server" or something.

But that's way beyond anything I know about.

3

Plouw t1_it6x0y8 wrote

>I'm fairly certain the optimal arrangement of matter within that space will not be biological in nature

I guess that's the real question, though. We currently don't know the answer - whether brains are actually close to using that space optimally in a way classical bit-based computers cannot. We also don't know whether quantum computers are physically capable of it, at least for all the operations that classical, quantum, and biological computers perform.

It may be that a symbiotic relationship between all three is needed for optimal performance across the different areas where each type excels. I'm also aware that this might be me romanticizing/spiritualizing the brain's capabilities, but at least it can't be ruled out, since we don't know the answer.

1

supernerd321 t1_it7sajd wrote

Extremely cool

Any estimate for what the deviation IQ would be for such a limit?

Given IQ isn't a linear scale, my guess would be something like 1000, which I can't comprehend.
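To put a number like that in perspective: deviation IQ is defined on a normal curve with mean 100 and SD 15, so "IQ 1000" would be 60 standard deviations above the mean. A quick sketch of how rare that would be (using the standard asymptotic approximation for the normal tail, since the probability underflows double precision):

```python
import math

# Deviation IQ: mean 100, standard deviation 15 on a normal curve.
def upper_tail_log10(z):
    # log10 of P(Z > z) via the asymptotic tail approximation phi(z)/z,
    # accurate for large z where math.erfc would underflow to 0.
    return (-z * z / 2 - math.log(z * math.sqrt(2 * math.pi))) / math.log(10)

z = (1000 - 100) / 15  # = 60 standard deviations
print(z)                           # 60.0
print(round(upper_tail_log10(z)))  # -784: about 1 person in 10^784
```

In other words, IQ 1000 isn't just rare on the current scale - the scale stops being meaningful long before that, since it's normed on an actual human population.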

1