Submitted by 96suluman t3_zp5the in singularity
Ortus12 t1_j0swkua wrote
Yes.
AGI is a relative term, since no intelligence is completely general, not even human intelligence. But when people use it, they usually mean intelligence that is at least as capable and general as most humans.
Once AGI gets to this point, it will quickly become vastly more intelligent than humans because of all the advantages computers already have over human brains. It will be able to rewrite its own code, perform tasks for pay and use that money to buy more server farms and solar farms, optimize existing hardware, and design more cost-effective chips and solar farms.
That is what we refer to as the "intelligence explosion" singularity: a feedback loop that starts once AGI reaches human-level capability.
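To make the shape of that loop concrete, here's a toy simulation in Python. Every number in it (growth rates, reinvestment fraction, starting values) is an arbitrary assumption chosen purely for illustration, not a forecast; the point is only that when capability, revenue, and compute each feed the others, growth compounds rather than staying linear.

```python
# Toy model of the feedback loop described above: capability earns revenue,
# revenue buys compute, and more compute (plus self-improvement) raises
# capability. All constants are made-up assumptions for illustration.

capability = 1.0   # 1.0 = roughly human-level (arbitrary units)
compute = 1.0      # hardware owned (arbitrary units)

for year in range(10):
    revenue = capability * compute              # paid tasks scale with both
    compute += 0.5 * revenue                    # reinvest earnings in hardware
    capability *= 1.0 + 0.1 * compute ** 0.5    # better code + more compute
    print(f"year {year}: capability={capability:.2f}, compute={compute:.2f}")
```

Run it and capability crawls for the first few iterations, then takes off, which is the qualitative behavior the "explosion" framing points at, however unrealistic the specific constants are.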