Submitted by Zondartul t3_zrbfcr in MachineLearning
GoofAckYoorsElf t1_j14bxe6 wrote
Reply to comment by yashdes in [D] Running large language models on a home PC? by Zondartul
True, but who needs a Quadro if a 3090 Ti is entirely sufficient?
yashdes t1_j14h1mm wrote
100% agree, love my 3090s, but hope they keep coming down in price so I can get more :D
BelialSirchade t1_j174112 wrote
More VRAM, probably, but you can just hook up two 3090 Tis at half the price.
Though for an LLM you'd probably need ten 3090 Tis, and even then it's probably not enough.
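A rough back-of-the-envelope sketch of why even ten 24 GB cards may fall short: just holding the weights of a GPT-3-sized model takes more than that. This is only a weights-size estimate; activations, KV cache, and framework overhead are ignored, so real requirements are higher. The function name and parameter counts here are illustrative assumptions, not from the thread.

```python
import math

# Back-of-the-envelope VRAM estimate for LLM inference.
# Counts only the model weights; activations, KV cache, and
# framework overhead are ignored, so real needs are higher.

def cards_needed(n_params: float, bytes_per_param: int = 2,
                 vram_per_card_gb: int = 24) -> int:
    """Minimum number of cards just to hold the weights."""
    weights_gb = n_params * bytes_per_param / 1e9
    return math.ceil(weights_gb / vram_per_card_gb)

# A 175B-parameter model in fp16 (2 bytes/param) on 24 GB cards:
print(cards_needed(175e9))  # 350 GB of weights -> 15 cards
```

So even before counting inference overhead, a 175B model in fp16 needs 15 of the 24 GB cards, which lines up with "ten isn't enough"; smaller models or quantized weights (1 byte/param or less) change the math considerably.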