Submitted by Zondartul t3_zrbfcr in MachineLearning
GoofAckYoorsElf t1_j12t661 wrote
A small car. I just bought a new 3090Ti with 24GB VRAM for as little as 1300€. I don't find that overly expensive.
yashdes t1_j1420uo wrote
He's probably referring to Quadros; those things are stupidly expensive even in comparison to the 3090/4090
GoofAckYoorsElf t1_j14bxe6 wrote
True, but who needs a Quadro if a 3090 Ti is entirely sufficient?
yashdes t1_j14h1mm wrote
100% agree, love my 3090s, but hope they keep coming down in price so I can get more :D
BelialSirchade t1_j174112 wrote
More VRAM, probably, but you can just hook up two 3090 Tis at half the price
Though for LLMs you'd probably need ten 3090 Tis, and even then it's probably not enough
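The "ten 3090 Tis" figure checks out as a rough order of magnitude. A minimal sketch of the back-of-envelope math, using illustrative numbers of my own (fp16 weights, 24 GB per card, ignoring activations and KV cache):

```python
import math

def min_gpus(params_billion: float, vram_per_gpu_gb: float = 24.0,
             bytes_per_param: int = 2) -> int:
    """Minimum GPUs needed just to hold the model weights.

    Assumes fp16 (2 bytes/param) and ignores activations, KV cache,
    and any optimizer state -- real requirements are higher.
    """
    weights_gb = params_billion * bytes_per_param  # 1e9 params * bytes / 1e9 = GB
    return math.ceil(weights_gb / vram_per_gpu_gb)

# A GPT-3-sized model (175B params) on 24 GB cards:
print(min_gpus(175))  # -> 15
```

So even for the weights alone, a 175B-parameter model needs about 15 of these cards, which matches the comment's ballpark.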