Submitted by Tea_Pearce t3_10aq9id in MachineLearning
BarockMoebelSecond t1_j48mepq wrote
Reply to comment by RandomCandor in [D] Bitter lesson 2.0? by Tea_Pearce
Which is and has been the status quo for the entire history of computing; I don't see how that's a new development.
currentscurrents t1_j490rvn wrote
It's meaningful right now because there's a threshold where LLMs become awesome, but getting there requires expensive specialized GPUs.
I'm hoping in a few years consumer GPUs will have 80GB of VRAM or whatever and we'll be able to run them locally. While datacenters will still have more compute, it won't matter as much, since there's a point beyond which larger models would require more training data than exists.
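For a rough sense of where that VRAM threshold sits, here's a back-of-the-envelope sketch. It counts weights-only memory (ignoring KV cache, activations, and framework overhead, so real requirements run higher), using the standard bytes-per-parameter for each precision:

    # Rough VRAM estimate for hosting an LLM's weights locally.
    # Weights-only: ignores KV cache and activation memory, so these
    # numbers are lower bounds, not exact requirements.
    
    BYTES_PER_PARAM = {"fp16": 2, "int8": 1, "int4": 0.5}
    
    def weights_vram_gb(n_params_billion: float, precision: str = "fp16") -> float:
        """Approximate GB of VRAM needed just to hold the weights."""
        return n_params_billion * 1e9 * BYTES_PER_PARAM[precision] / 1e9
    
    for size in (7, 13, 70, 175):
        print(f"{size}B params: "
              f"fp16 ~{weights_vram_gb(size, 'fp16'):.0f} GB, "
              f"int4 ~{weights_vram_gb(size, 'int4'):.0f} GB")

By this estimate a 70B-parameter model needs ~140 GB at fp16 (beyond even an 80 GB card) but only ~35 GB at int4, which is why quantization matters so much for running these things on consumer hardware.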