Submitted by head_robotics t3_1172jrs in MachineLearning
Baeocystin t1_j9e6s12 wrote
Reply to comment by Last-Belt-4010 in [D] Large Language Models feasible to run on 32GB RAM / 8 GB VRAM / 24GB VRAM by head_robotics
The tl;dr for all GPU questions is that CUDA is the answer. There are no other even 'kinda' contenders.
I'm not happy about the monopoly, but that's where we're at, and there's nothing on the horizon suggesting it will change, either.
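In practice, "CUDA is the answer" just means: check whether you actually have an NVIDIA GPU before committing to a runtime path. Here's a minimal sketch of such a check (the `nvidia-smi` probe is only a heuristic I'm using for illustration; frameworks like PyTorch expose `torch.cuda.is_available()` directly):

```python
import shutil
import subprocess

def cuda_gpu_available() -> bool:
    """Rough heuristic: probe for an NVIDIA GPU via nvidia-smi.

    A framework-level check (e.g. torch.cuda.is_available()) is more
    reliable, but this avoids importing a heavyweight dependency.
    """
    smi = shutil.which("nvidia-smi")  # is the NVIDIA driver toolchain on PATH?
    if smi is None:
        return False
    try:
        # `nvidia-smi -L` lists detected GPUs, one per line starting with "GPU"
        result = subprocess.run([smi, "-L"], capture_output=True, timeout=10)
    except (OSError, subprocess.TimeoutExpired):
        return False
    return result.returncode == 0 and b"GPU" in result.stdout

if __name__ == "__main__":
    print("CUDA-capable GPU detected:", cuda_gpu_available())
```

On a machine without an NVIDIA card this simply returns `False`, which is exactly the signal you'd use to fall back to a CPU-only path (with the RAM limits discussed in the thread title).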