Submitted by imgonnarelph t3_11wqmga in MachineLearning
Educational-Net303 t1_jd2rsax wrote
Reply to comment by 42gether in [Project] Alpaca-30B: Facebook's 30b parameter LLaMa fine-tuned on the Alpaca dataset by imgonnarelph
My whole point is that it will take years before we get consumer GPUs with 48 GB of VRAM. You just proved my point again without even reading it.