
thesupernoodle t1_jcsj2iw wrote

For maybe a few hundred bucks, you can test out the exact configurations you want to buy:

https://lambdalabs.com/service/gpu-cloud

You may even decide that you’d rather just cloud compute, as opposed to spending all that money upfront. It would only cost you about $19K to run 2xA100 in the cloud 24/365 for a solid year. And that also includes electricity costs.
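If you want to sanity-check that figure against your own situation, here's a minimal back-of-the-envelope sketch. The hourly rate, build price, power draw, and electricity rate below are assumptions for illustration, not quotes; plug in whatever numbers you're actually looking at:

```python
# Rough break-even sketch: renting cloud GPUs vs. buying hardware outright.
# All figures below are assumptions for illustration.
HOURLY_RATE_PER_GPU = 1.10      # USD/hr per A100 on-demand (assumed)
NUM_GPUS = 2
HOURS_PER_YEAR = 24 * 365

cloud_per_year = HOURLY_RATE_PER_GPU * NUM_GPUS * HOURS_PER_YEAR
print(f"Cloud, {NUM_GPUS}x GPU, 24/365: ${cloud_per_year:,.0f}/yr")  # ~$19K

upfront_build = 15_000            # USD for a comparable local build (assumed)
power_kw, price_kwh = 0.8, 0.15   # system draw and electricity rate (assumed)
electricity_per_year = power_kw * HOURS_PER_YEAR * price_kwh

# Years of continuous use before owning beats renting.
breakeven_years = upfront_build / (cloud_per_year - electricity_per_year)
print(f"Owning breaks even after ~{breakeven_years:.1f} years of 24/365 use")
```

The break-even point shifts a lot if you're not actually running the GPUs around the clock, which is exactly why testing your real utilization first is worth it.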

9

brainhack3r t1_jctctn2 wrote

This is the right answer. Don't guess, test (hey, that rhymed!)

Just make sure your testing mirrors what it would look like to scale up.
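For example, a quick throughput-and-memory test on the rented instance tells you most of what you need. This is only a sketch with a placeholder model and batch size; swap in your real training step so the test actually mirrors the full workload:

```python
# Time a fixed number of training steps and report peak GPU memory.
import time
import torch

device = torch.device("cuda")
model = torch.nn.Sequential(  # placeholder; use your actual model
    torch.nn.Linear(4096, 4096), torch.nn.ReLU(), torch.nn.Linear(4096, 4096)
).to(device)
opt = torch.optim.AdamW(model.parameters())
x = torch.randn(64, 4096, device=device)  # placeholder batch

# Warm up before timing.
for _ in range(10):
    opt.zero_grad(); model(x).sum().backward(); opt.step()
torch.cuda.synchronize()

n_steps = 100
start = time.time()
for _ in range(n_steps):
    opt.zero_grad(); model(x).sum().backward(); opt.step()
torch.cuda.synchronize()
print(f"{n_steps / (time.time() - start):.1f} steps/sec, "
      f"peak mem {torch.cuda.max_memory_allocated() / 2**30:.1f} GiB")
```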

3

FirstOrderCat t1_jcsjgws wrote


They don't have the A6000 Ada yet.

1

thesupernoodle t1_jcsll6u wrote

Sure; but the broader point is that they can optimize for their needs with some cheap testing: is the model big enough that it wants the extra RAM of an 80 GB A100?
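You can get a rough answer to that before renting anything. A minimal sketch, assuming standard mixed-precision training with Adam (fp16 weights and gradients plus fp32 master weights and two fp32 optimizer states, roughly 16 bytes per parameter, activations not included):

```python
# Back-of-the-envelope: does the model "want" an 80 GB card?
def training_mem_gb(n_params: float, bytes_per_param: float = 16.0) -> float:
    """Approximate GB for weights + grads + Adam states (activations extra)."""
    return n_params * bytes_per_param / 2**30

for billions in (1, 3, 7, 13):
    print(f"{billions}B params: ~{training_mem_gb(billions * 1e9):.0f} GB")
```

Anything much past a few billion parameters blows past 48 GB for naive single-GPU training, which is where the 80 GB A100 (or sharding/offloading tricks) starts to matter.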

2