JohnLawsCarriage t1_j9wcxns wrote
Reply to comment by norbertus in The Job Market Apocalypse: We Must Democratize AI Now! by Otarih
A big NVIDIA card? You'll need at the very least 8, and even then you're not coming close to something like ChatGPT. The computational power required is eye-watering. Check out this open-source GPT-2 bot that uses a decentralized network of many people's GPUs. I don't know exactly how many GPUs are on the network, but it's more than 8, and look how slow it is. Remember, this is only GPT-2, not GPT-3 like ChatGPT.
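For scale, here's a rough back-of-envelope sketch of why one card isn't enough. This counts model weights only at fp16 (2 bytes per parameter) and ignores activations, KV cache, and batching overhead, so real requirements are higher; the parameter counts are the commonly cited figures for each model.

```python
import math

A100_VRAM_GB = 80  # NVIDIA A100 80GB card

def gpus_needed(n_params: float, bytes_per_param: int = 2) -> int:
    """Minimum A100s to just *hold* the weights in fp16."""
    weight_gb = n_params * bytes_per_param / 1e9
    return math.ceil(weight_gb / A100_VRAM_GB)

# GPT-2 (1.5B params) fits on one card; GPT-3 (175B) needs 350 GB
# of weights alone, i.e. at least 5 A100s before any overhead.
print("GPT-2:", gpus_needed(1.5e9), "GPU(s)")
print("GPT-3:", gpus_needed(175e9), "GPU(s)")
```

And that's just holding the model in memory for inference; training needs gradients and optimizer state on top, which multiplies the footprint several times over.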
JohnLawsCarriage t1_j9xchqo wrote
Reply to comment by _Bl4ze in The Job Market Apocalypse: We Must Democratize AI Now! by Otarih
Oh shit, I just found out how many GPUs they used to train this model: 288 NVIDIA A100 80GB Tensor Core GPUs.
Fuck