Submitted by head_robotics t3_1172jrs in MachineLearning
EuphoricPenguin22 t1_j9ceqy4 wrote
Reply to comment by catch23 in [D] Large Language Models feasible to run on 32GB RAM / 8 GB VRAM / 24GB VRAM by head_robotics
Yeah, and DDR4 DIMMs are fairly inexpensive compared to upgrading a GPU for more VRAM.
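For anyone sizing up the RAM-vs-VRAM tradeoff in the thread title, here's a back-of-the-envelope sketch (not from the original comment; the helper name and precision table are my own assumptions) of how much memory a model's weights alone occupy at different precisions:

```python
# Rough memory-footprint estimate for loading LLM weights.
# Bytes-per-parameter values are the standard sizes for common precisions;
# this ignores activations and the KV cache, which add more on top.
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

def weights_gib(n_params_billion: float, precision: str) -> float:
    """Approximate GiB needed just to hold the weights."""
    bytes_total = n_params_billion * 1e9 * BYTES_PER_PARAM[precision]
    return bytes_total / 2**30

# A 13B-parameter model in fp16 needs roughly 24 GiB for weights alone,
# while int8 quantization halves that to about 12 GiB.
print(f"{weights_gib(13, 'fp16'):.1f} GiB")
```

That's why offloading to system RAM is attractive: a 32 GB DDR4 kit costs far less than a GPU with 24 GB of VRAM, even if inference runs slower from system memory.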