Submitted by Numerous_Talk7940 t3_11w9hkj in deeplearning
wally1002 t1_jcx6232 wrote
For deep learning, higher VRAM is always preferable. 12/16 GB limits the kinds of models you can run or infer with. With LLMs getting democratised, it's better to be future-proof.
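(A rough back-of-envelope sketch of why the VRAM ceiling matters, assuming the common rule of thumb that inference needs roughly parameters × bytes-per-parameter just for the weights, before activations or KV cache:)

```python
# Rough VRAM estimate for holding model weights during inference only.
# Activations and KV cache add extra memory on top of this.
def vram_gib(params_billion: float, bytes_per_param: float) -> float:
    """Approximate GiB needed just to store the weights."""
    return params_billion * 1e9 * bytes_per_param / (1024 ** 3)

for params in (7, 13, 30):
    print(f"{params}B params: "
          f"fp16 ~{vram_gib(params, 2):.1f} GiB, "
          f"8-bit ~{vram_gib(params, 1):.1f} GiB, "
          f"4-bit ~{vram_gib(params, 0.5):.1f} GiB")
```

By this estimate a 7B model in fp16 already needs about 13 GiB, so a 12 GB card forces quantisation, and a 13B model in fp16 wants roughly 24 GiB.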
mrcet007 t1_jcxgyl7 wrote
12/16 GB is already hitting the limit of what's available on the market for consumer gaming GPUs. The only GPU for deep learning with more than 16 GB is the 4090, which is already out of range for most individuals at $1,500.
timelyparadox t1_jcy4x02 wrote
Yep, you can get a 3090 for $700.