Submitted by GhostingProtocol t3_11k59br in deeplearning
Hi r/deeplearning. I'm very interested in AI and computer vision, and I'm building a workstation so I can start doing my own projects. My problem is VRAM. I wanted to buy an AMD card because it's more bang for your buck and you get 16GB of VRAM, but I found out that AMD's software stack sucks for deep learning, so Nvidia seems like the way to go (almost required, apparently).
The 3070 Ti (8GB VRAM) was my choice before I found out how important VRAM is, so the 3060 Ti/3070/3070 Ti are out of the question. The 4000 series seems very overpriced, and anything older than the 3000 series seems outdated. That leaves me with the 3060 (12GB/3584 CUDA cores) and the 3080 (10GB/8704 CUDA cores). The RTX 3080 Ti and up fall outside my budget.
My thinking here: the 3080 has well over twice the CUDA cores, but only 10GB of VRAM (the 12GB version is very hard to find in my country). The 3060 has more VRAM and is a lot cheaper, but again, it has under half the CUDA cores of the 3080.
The obvious answer to me seems to be to buy the 3080 now, and if VRAM ever ends up being a limitation, I can buy a riser and a 3060, and have a combined 22GB of memory with the 3060 running externally. Does this seem like a good way of solving the problem?
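For anyone wondering what "combining" would actually look like: as I understand it, two cards don't show up as one pooled 22GB device; a framework like PyTorch sees them separately, so you'd have to split the model across them yourself. A minimal model-parallel sketch of that idea (layer sizes are made up, and I'm assuming cuda:0 is the 3080 and cuda:1 is the 3060):

```python
import torch
import torch.nn as nn

class TwoGPUNet(nn.Module):
    def __init__(self):
        super().__init__()
        # First chunk of the network lives on the 3080 (cuda:0)...
        self.part1 = nn.Sequential(
            nn.Linear(1024, 4096), nn.ReLU()
        ).to("cuda:0")
        # ...second chunk on the 3060 (cuda:1).
        self.part2 = nn.Sequential(
            nn.Linear(4096, 10)
        ).to("cuda:1")

    def forward(self, x):
        x = self.part1(x.to("cuda:0"))
        # Activations get copied between cards over PCIe here,
        # which is slow compared to staying on one device.
        return self.part2(x.to("cuda:1"))

model = TwoGPUNet()
out = model(torch.randn(8, 1024))
print(out.device)  # cuda:1
```

Data parallelism (e.g. DistributedDataParallel) is the other route, but that replicates the whole model on each card, so the model still has to fit in the smaller card's VRAM.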
transducer t1_jb5zdo8 wrote
You should be able to do your trade-off analysis after reading this:
https://timdettmers.com/2023/01/30/which-gpu-for-deep-learning/