Submitted by somebodyenjoy t3_z6kr2n in deeplearning
I currently have a 1080 Ti GPU. I took slightly more than a year off from deep learning and boom, the market has changed so much. I'm looking for advice on whether it'd be better to buy two 3090s or a single 4090. I sometimes run into memory limits when training big CNN architectures, but I've always dropped the batch size to compensate. I'll also be doing hyperparameter tuning, so speed is a factor. How much slower would training smaller models be on two 3090s? If I can get away with lower batch sizes, am I better off with a 4090? Are there any better options I've missed?
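For what it's worth, the usual trick for making a small batch behave like a big one is gradient accumulation: stepping the optimizer only every few mini-batches. A minimal PyTorch sketch with a dummy model and loader, nothing from my real setup:

```python
import torch
from torch import nn

# Dummy model and data, just to show the accumulation pattern.
model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10)).cuda()
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(torch.randn(1024, 512),
                                   torch.randint(0, 10, (1024,))),
    batch_size=32,  # what actually fits in VRAM
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

accum_steps = 4  # effective batch size = 32 * 4 = 128

optimizer.zero_grad()
for step, (x, y) in enumerate(loader):
    x, y = x.cuda(), y.cuda()
    loss = loss_fn(model(x), y) / accum_steps  # scale so gradients average correctly
    loss.backward()                            # grads accumulate across mini-batches
    if (step + 1) % accum_steps == 0:
        optimizer.step()                       # one update per effective batch
        optimizer.zero_grad()
```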
I haven't done multi-GPU training yet, so would I be able to keep my current card in the mix for when I do need more memory? That way I could buy a 4090 and just fall back on the 1080 Ti whenever a model needs more VRAM.
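On the multi-GPU question: the simplest setup I've read about is PyTorch's nn.DataParallel, which splits each batch across all visible GPUs. A toy sketch with a made-up model, not my actual code, and DistributedDataParallel is apparently what people use for serious runs:

```python
import torch
from torch import nn

# Toy model; the DataParallel wrapping is the point here.
model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10))

# DataParallel slices each input batch across all visible GPUs,
# runs them in parallel, and gathers the outputs on the first GPU.
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)
model = model.cuda()

x = torch.randn(64, 512).cuda()  # with two GPUs this runs as two 32-sample shards
out = model(x)                   # shape: (64, 10)
```

One caveat I've seen mentioned: with mismatched cards (say a 1080 Ti next to a 4090), every step waits for the slower card, so the pairing mostly buys memory, not speed.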
This is my current PC build; I'm guessing I'll have to upgrade almost everything.
--dany-- t1_iy1zq6n wrote
The 3090 has an NVLink bridge for connecting two cards so their memory can be pooled. Theoretically you'd get 2x the compute and 48GB of VRAM for the job. If VRAM size is what's holding back your big models and you have a beefy PSU, this is the way to go. Otherwise just go with a 4090 (which dropped NVLink support).
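To be clear, the pooling isn't fully automatic — your framework still has to place pieces of the model on each card, and NVLink just makes the hop between them cheaper than plain PCIe. A rough PyTorch sketch of that kind of split, with toy layers and assuming the two cards show up as cuda:0 and cuda:1:

```python
import torch
from torch import nn

class TwoGPUNet(nn.Module):
    """Naive model parallelism: first stage on cuda:0, second on cuda:1,
    so the model and activations draw on the combined VRAM of both cards."""
    def __init__(self):
        super().__init__()
        self.stage1 = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1),
                                    nn.ReLU()).to("cuda:0")
        self.stage2 = nn.Sequential(nn.Flatten(),
                                    nn.Linear(64 * 32 * 32, 10)).to("cuda:1")

    def forward(self, x):
        x = self.stage1(x.to("cuda:0"))
        return self.stage2(x.to("cuda:1"))  # activations cross the GPU link here

model = TwoGPUNet()
out = model(torch.randn(8, 3, 32, 32))  # output tensor lives on cuda:1
```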
If you don't need to train models frequently, Colab or a paid GPU rental service might be easier on your wallet and power bill. For example, some rentals charge only about $2 per hour for 4x RTX A6000.