
Silly_Awareness8207 t1_j9s6n8z wrote

If we have AGI (or HLMI, as the article calls it), then I think we have machines smart enough to build the next generation of AI. So AGI is enough to trigger the singularity.

3

Mrkvitko t1_j9t4z0f wrote

Is an average or below-average human smart enough to make the next generation of AI?

1

Z1BattleBoy21 t1_j9t7n73 wrote

Imagine a legion of average humans trained as ML researchers who never make human errors and work 24/7; I think they could.

1

BenjaminJamesBush t1_j9tf53e wrote

Oh, totally, yes. Imagine an average human who is willing to learn and work on their goals 24/7. Now, as u/Z1BattleBoy21 said, imagine an army of such average humans.

1

mobitymosely t1_j9tvcdq wrote

That assumes ASI is even possible at all. We already have a network of 8 billion people collaborating on projects, and they have one big advantage: access to the real world (eyes, hands, labs, factories). It MIGHT be that there are sharply diminishing returns if all you can do is model our brains and then scale them up in size, speed, and number.

1

Representative_Pop_8 t1_j9vm63x wrote

No, that's not true. To make the next generation of computers you need the cumulative efforts of thousands of engineers, scientists, businessmen, etc. You could have an AI as smart as two very bright humans, and it is still unlikely it would develop a better AI on its own.

1

Silly_Awareness8207 t1_j9x17cb wrote

Just have it read books the same way humans do. If it truly is as smart as an average human, then it can "stand on the shoulders of giants" just like humans do.

1