ReasonablyBadass t1_itts24o wrote
The current transformer architecture may need a few more tweaks for AGI to work, but I'd say it's close already.
porcenat_k t1_itu9urc wrote
Indeed. The few tweaks I'd say are needed are continual learning and longer short-term memory; both are active research subfields. Beyond that, all that's left is to scale model size, which I consider far more important than scaling data. Human beings understand basic concepts without needing to read the entire internet, because we have evolved bigger brains.
ReasonablyBadass t1_itubdlz wrote
>Human beings understand basic concepts and don’t need to read the entire internet for that.
We have years of training data coming in through multiple high-bandwidth input channels before we reach that level, though.
elonmusk12345_ OP t1_itvxck8 wrote
There is a convincing argument that many of the most fundamental, first-principles things we understand about the world are ingrained in us at birth.
A good article: https://www.scientificamerican.com/article/born-ready-babies-are-prewired-to-perceive-the-world/