Submitted by Zondartul t3_zrbfcr in MachineLearning
yashdes t1_j1420uo wrote
Reply to comment by GoofAckYoorsElf in [D] Running large language models on a home PC? by Zondartul
He's probably referring to Quadros; those things are stupid expensive even compared to the 3090/4090
GoofAckYoorsElf t1_j14bxe6 wrote
True, but who needs a Quadro if a 3090 Ti is entirely sufficient?
yashdes t1_j14h1mm wrote
100% agree, I love my 3090s, but I hope they keep coming down in price so I can get more :D
BelialSirchade t1_j174112 wrote
More VRAM, probably, but you can just hook up two 3090 Tis for half the price of a Quadro
Though for a large LLM you'd probably need ten 3090 Tis, and even then it's probably not enough
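A quick back-of-envelope sketch of that last claim, assuming a GPT-3-scale model of 175B parameters held as fp16 weights (2 bytes per parameter) on 24 GB 3090 Tis; the model size is an illustrative assumption, and activations plus KV cache would only add to the total:

    import math

    # Illustrative assumptions: GPT-3-scale model, fp16 weights, 3090 Ti VRAM.
    params = 175e9          # parameter count (GPT-3 scale, assumed for this sketch)
    bytes_per_param = 2     # fp16/bf16 weights
    vram_per_gpu_gb = 24    # an RTX 3090 Ti has 24 GB of VRAM

    weights_gb = params * bytes_per_param / 1e9
    gpus_needed = math.ceil(weights_gb / vram_per_gpu_gb)

    print(f"Weights alone: {weights_gb:.0f} GB")             # -> 350 GB
    print(f"3090 Tis needed (weights only): {gpus_needed}")  # -> 15

So ten cards (240 GB) wouldn't even hold the weights, which matches the comment above; 8-bit or 4-bit quantization would roughly halve or quarter the requirement.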