Submitted by flowday t3_10gxy2t in singularity
AsuhoChinami t1_j56ir0n wrote
>AGI occurs across a spectrum: from ‘error prone’, not-too smart or ‘savant’-like (sub-human reliability or intelligence or of limited scope)
This makes the definition so lenient as to render it kind of meaningless, I think. Might as well say that SmarterChild was AGI for being smarter than a newborn baby. I think the "floor" for AGI should be something that's competent at almost all intellectual tasks (maybe not fully on par with humans, but competent) and is generally about as smart as a human who's at minimum on the lower end of average. (I think we'll get there during 2023-2024, plz don't kill me techno-skeptics)
Akimbo333 t1_j58h4r0 wrote
Don't worry, I won't kill ya lol! That is very interesting, though! We don't even have a multimodal AI yet.
Desperate_Excuse1709 t1_j59bcqj wrote
AGI 2050.
AsuhoChinami t1_j59bi38 wrote
Okay... thanks.