
Tavrin t1_j9tns0j wrote

If this is true, GPT's context window is about to take a big leap forward (a 32k-token context window instead of the usual 4k, or now 8k). Still, I agree with you that current transformers don't feel like they'll be the ones taking us all the way to AGI (there's still a lot of progress that can be made with them even without more compute, and I'm sure we'll see them used for more and more crazy and useful stuff). A rough sketch of what that bigger window changes in practice is below.
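For anyone curious what "32k tokens" means concretely: here's a minimal sketch using OpenAI's `tiktoken` tokenizer to check whether a prompt fits in a given window. The 32k figure is the rumored one from this thread; the `cl100k_base` encoding choice, the `fits_in_context` helper, and the reply budget are my own assumptions for illustration.

```python
import tiktoken

# Rumored GPT-4 window from this thread; older models use ~4k or 8k.
CONTEXT_WINDOW = 32_000

# Encoding name is an assumption; pick the one matching your model.
enc = tiktoken.get_encoding("cl100k_base")

def fits_in_context(prompt: str, reserved_for_reply: int = 1_000) -> bool:
    """Return True if the prompt plus a reply budget fits in the window."""
    n_tokens = len(enc.encode(prompt))
    return n_tokens + reserved_for_reply <= CONTEXT_WINDOW

# Example: a 32k window fits roughly a 50-page document plus the reply,
# where a 4k window forces you to chunk and summarize.
print(fits_in_context("Summarize the following report..."))
```

Roughly speaking, 32k tokens is on the order of 20k+ English words, so whole papers or codebases start fitting in a single prompt instead of needing chunking tricks.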
