GPT-5entient t1_j9hk7td wrote
Reply to comment by hydraofwar in A German AI startup just might have a GPT-4 competitor this year. It is 300 billion parameters model by Dr_Singularity
32k tokens would mean approximately 150 kB of text — that is a decent-sized code base! With this much context memory, the known context-saving tricks would also work much better, so this could theoretically be used to create code bases of virtually unlimited size.
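For the curious, here is a rough back-of-the-envelope version of that estimate. It assumes ~4-5 characters per token for English text (a common rule of thumb; the actual ratio depends on the tokenizer and the content):

```python
# Estimate how much plain text fits in a 32k-token context window.
# Assumption: ~4.7 characters per token on average (rule of thumb,
# not a property of any specific tokenizer).

def tokens_to_kb(tokens: int, chars_per_token: float = 4.7) -> float:
    """Estimate plain-text size in kB for a given token budget."""
    return tokens * chars_per_token / 1000

print(f"{tokens_to_kb(32_000):.0f} kB")  # -> 150 kB at ~4.7 chars/token
```

At a more conservative 4 characters per token you still get 128 kB, so "approximately 150 kB" is in the right ballpark either way.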
This amazes me and (being a software dev) also scares me...
But, as they say, what a time to be alive!