Submitted by fangfried t3_11alcys in singularity
Tavrin t1_j9tns0j wrote
Reply to comment by nul9090 in What are the big flaws with LLMs right now? by fangfried
If this is true, the context window of GPT is about to take a big leap forward (a 32k-token context window instead of the usual 4k, or now 8k). Still, I agree with you that current transformers don't feel like they will be the ones taking us all the way to AGI (though there's a lot of progress that can still be made with them even without more computing power, and I'm sure we'll see them used for more and more crazy and useful stuff).
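As a toy illustration of why the window size matters: the model can only attend to the most recent window of tokens, so anything earlier in a long conversation simply gets dropped. This sketch uses whitespace splitting as a stand-in for a real tokenizer (actual GPT models use BPE, so real token counts differ):

```python
def truncate_to_window(text: str, window: int) -> str:
    """Keep only the last `window` tokens of `text`.

    Mimics how history beyond the context window is invisible to the model.
    Whitespace tokens are a rough stand-in for real BPE tokens.
    """
    tokens = text.split()
    return " ".join(tokens[-window:])

history = "turn1 turn2 turn3 turn4 turn5 turn6 turn7 turn8"
# With a 4-token window, the model never sees turns 1-4.
print(truncate_to_window(history, 4))  # → turn5 turn6 turn7 turn8
```

An 8x larger window (4k → 32k) just pushes that truncation point much further back, which is why it's such a practical improvement even without any architectural change.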