FirstOrderCat t1_jcsjgws wrote
Reply to comment by thesupernoodle in Best GPUs for pretraining roBERTa-size LLMs with a $50K budget, 4x RTX A6000 v.s. 4x A6000 ADA v.s. 2x A100 80GB by AngrEvv
They don't offer the A6000 Ada yet.
thesupernoodle t1_jcsll6u wrote
Sure, but the broader point is that they can optimize their choice with some cheap testing: is the model big enough that it actually needs the extra RAM of an 80 GB A100?
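A minimal sketch of that kind of cheap test, assuming PyTorch and Hugging Face transformers are installed: run a single training step of a RoBERTa-size model on whatever GPU you already have and check peak memory. The batch size and sequence length below are illustrative assumptions, not values from the thread.

```python
import torch
from transformers import RobertaConfig, RobertaForMaskedLM

# Default config is roughly RoBERTa-base (~125M parameters).
config = RobertaConfig()
model = RobertaForMaskedLM(config).cuda()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# Assumed workload; vary these to probe where memory tops out.
batch_size, seq_len = 32, 512
input_ids = torch.randint(0, config.vocab_size, (batch_size, seq_len)).cuda()

# One full step (forward, backward, optimizer) drives peak allocation.
torch.cuda.reset_peak_memory_stats()
loss = model(input_ids=input_ids, labels=input_ids).loss
loss.backward()
optimizer.step()

peak_gib = torch.cuda.max_memory_allocated() / 2**30
print(f"Peak GPU memory for one step: {peak_gib:.1f} GiB")
```

If the peak at your target batch size stays well under 48 GB, the A6000-class cards suffice; if it pushes past that, the 80 GB A100s start to earn their price.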