Southern-Trip-1102 t1_itwyp8o wrote
Reply to comment by AuspiciousApple in [D] What's the best open source model for GPT3-like text-to-text generation on local hardware? by AuspiciousApple
A bit. As far as I can tell, the 176B model is on par with GPT-3, though I haven't done much testing or comparison. From what I read, it was also trained on 13 programming languages and 59 natural languages.
AuspiciousApple OP t1_itx00gv wrote
Thanks! Even a qualitative, subjective judgement of rough parity is quite encouraging. I might need DeepSpeed or similar offloading to get it to run on my 8 GB GPU, but if the quality is even comparable, that's very cool.
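For context on why offloading would be needed at all, here is a back-of-the-envelope sketch of the weight memory for a 176B-parameter model (illustrative arithmetic only; activations, KV cache, and framework overhead add more on top):

```python
# Rough memory estimate for model weights alone: shows why a
# 176B-parameter model cannot fit in 8 GB of VRAM without
# offloading (e.g. DeepSpeed ZeRO-Inference) or a smaller checkpoint.

def model_memory_gb(n_params: float, bytes_per_param: int) -> float:
    """Approximate weight memory in GB (1 GB = 1e9 bytes)."""
    return n_params * bytes_per_param / 1e9

weights_fp16 = model_memory_gb(176e9, 2)  # fp16/bf16: 2 bytes per parameter
weights_int8 = model_memory_gb(176e9, 1)  # int8 quantization: 1 byte per parameter

print(f"fp16 weights: {weights_fp16:.0f} GB")  # 352 GB
print(f"int8 weights: {weights_int8:.0f} GB")  # 176 GB
```

Even aggressively quantized, the weights are more than an order of magnitude beyond 8 GB, so the working set has to be streamed from CPU RAM or disk during inference.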