Submitted by Zondartul t3_zrbfcr in MachineLearning
BelialSirchade t1_j174112 wrote
Reply to comment by GoofAckYoorsElf in [D] Running large language models on a home PC? by Zondartul
More VRAM, probably, but you can just hook up two 3090 Tis at half the price.
Though for a large language model you'd probably need ten 3090 Tis, and even then it's probably not enough.
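A rough way to sanity-check that claim: weight memory alone scales with parameter count times bytes per parameter. The sketch below is a back-of-the-envelope estimate, assuming a hypothetical GPT-3-scale model (175B parameters) stored in fp16 (2 bytes/param) and a 3090 Ti's 24 GB of VRAM; it ignores activations, KV cache, and any optimizer state, which would need even more memory.

```python
# Back-of-the-envelope VRAM estimate for holding model weights only.
# Ignores activations, KV cache, and optimizer state, which add more.

def estimate_weight_vram_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Weight memory in GB: billions of params times bytes per param."""
    return params_billions * bytes_per_param

# Assumed example: a 175B-parameter model in fp16.
weights_gb = estimate_weight_vram_gb(175, 2)  # 350 GB of weights

# Ten RTX 3090 Ti cards, 24 GB each:
cluster_gb = 10 * 24  # 240 GB total VRAM

print(f"weights: {weights_gb} GB, ten 3090 Tis: {cluster_gb} GB")
print("fits:", weights_gb <= cluster_gb)
```

Even before counting activations, 350 GB of fp16 weights exceeds the 240 GB that ten 3090 Tis provide, which is consistent with "probably not enough." Quantizing to 8-bit or 4-bit shrinks the weight footprint proportionally.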