porcenat_k t1_itu9urc wrote
Reply to comment by ReasonablyBadass in Where does the model accuracy increase due to increasing the model's parameters stop? Is AGI possible by just scaling models with the current transformer architecture? by elonmusk12345_
Indeed. The main tweaks I'd say are continual learning and longer short-term memory; both are active research subfields. All that's left after that is scaling model size, which I consider far more important than scaling data. Human beings understand basic concepts without needing to read the entire internet, because we evolved bigger brains.
ReasonablyBadass t1_itubdlz wrote
>Human beings understand basic concepts and don’t need to read the entire internet for that.
We have years of training data via multiple high-bandwidth input channels before we reach that level, though.
elonmusk12345_ OP t1_itvxck8 wrote
There is a convincing argument that many of the most fundamental things we understand about the world are ingrained in us at birth.
A good article: https://www.scientificamerican.com/article/born-ready-babies-are-prewired-to-perceive-the-world/