brainhack3r t1_jctctn2 wrote
Reply to comment by thesupernoodle in Best GPUs for pretraining roBERTa-size LLMs with a $50K budget, 4x RTX A6000 v.s. 4x A6000 ADA v.s. 2x A100 80GB by AngrEvv
This is the right answer. Don't guess, test (hey, that rhymed!).
Just make sure your testing mirrors what it would look like to scale up.
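For anyone wondering what that test could look like, here's a minimal tokens/sec sketch. It assumes PyTorch plus Hugging Face `transformers` and a single CUDA GPU, and uses synthetic data as a stand-in for your real corpus, so treat the numbers as a rough proxy, not a final answer:

```python
import time
import torch
from transformers import RobertaConfig, RobertaForMaskedLM

device = torch.device("cuda")
# roberta-base-sized model with random weights; swap in your own config.
model = RobertaForMaskedLM(RobertaConfig()).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# Match the batch size / sequence length you actually plan to scale to.
batch_size, seq_len = 32, 512
input_ids = torch.randint(
    0, model.config.vocab_size, (batch_size, seq_len), device=device
)

# Warm up so one-time CUDA kernel setup doesn't skew the timing.
for _ in range(3):
    loss = model(input_ids=input_ids, labels=input_ids).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

torch.cuda.synchronize()
start = time.time()
steps = 20
for _ in range(steps):
    loss = model(input_ids=input_ids, labels=input_ids).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
torch.cuda.synchronize()

elapsed = time.time() - start
print(f"{steps * batch_size * seq_len / elapsed:,.0f} tokens/sec")
```

Run the same script on each candidate setup (and across multiple GPUs with your planned parallelism strategy) so the comparison actually reflects how you'd train at scale.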