Submitted by Vegetable-Skill-9700 t3_121a8p4 in MachineLearning
londons_explorer t1_jdo4kj3 wrote
Reply to comment by farmingvillein in [D] Do we really need 100B+ parameters in a large language model? by Vegetable-Skill-9700
Think how many hard drives there are in the world...
All of that data is potential training material.
I think a lot of companies and individuals might give up 'private' data in bulk for ML training if they got a viable benefit from it. For example, having a version of ChatGPT with perfect knowledge of all my friends and neighbours — what they like and do, etc. — would be handy.