Submitted by Tea_Pearce t3_10aq9id in MachineLearning
gdiamos t1_j4a96pu wrote
Reply to comment by mugbrushteeth in [D] Bitter lesson 2.0? by Tea_Pearce
Currently we have exascale computers: roughly 1e18 FLOPS at around 50e6 watts (50 MW).
The power output of the sun is about 4e26 watts. That's roughly 19-20 orders of magnitude of power on the table.
This paper (Lloyd, "Ultimate physical limits to computation") argues that the energy cost of computation can theoretically be reduced by another 22 orders of magnitude: https://arxiv.org/pdf/quant-ph/9908043.pdf
So physics (on our current understanding) seems to allow learning machines at least ~42 orders of magnitude bigger (computationally) than current-generation foundation models, without leaving this solar system, and without converting mass into energy...
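As a quick sanity check of the arithmetic, here is a minimal sketch using the figures quoted above (the 50 MW and 4e26 W values are the comment's rough numbers, and the 22-orders efficiency figure is taken from the linked paper, not computed here):

```python
import math

# Rough figures from the comment above
exascale_power_w = 50e6   # ~50 MW for a ~1e18 FLOPS machine
solar_output_w = 4e26     # total power output of the sun

# Headroom in raw power, in orders of magnitude
power_headroom = math.log10(solar_output_w / exascale_power_w)
print(round(power_headroom, 1))  # ~18.9, which the comment rounds up to 20

# Lloyd (arXiv:quant-ph/9908043) argues energy per operation could
# theoretically fall by about 22 orders of magnitude
efficiency_headroom = 22.0

total_headroom = power_headroom + efficiency_headroom
print(round(total_headroom, 1))  # ~40.9, i.e. the "~42 orders" claim
```

The exact power ratio is ~8e18 (just under 19 orders of magnitude), so the combined figure is closer to 41 than 42, but at this scale the conclusion is the same.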