Gatensio t1_jd6rixk wrote on March 22, 2023 at 6:27 AM
Reply to comment by KerfuffleV2 in [D] Running an LLM on "low" compute power machines? by Qwillbehr
Doesn't a 7B-parameter model require something like 12-26 GB of RAM depending on precision? How do you run the 30B one?
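For scale, here is a minimal back-of-the-envelope sketch (my own, not from the thread) that assumes weight memory is roughly parameter count times bytes per weight, and ignores KV cache, activations, and runtime overhead:

```python
def approx_weight_gb(n_params_billion: float, bits_per_weight: float) -> float:
    """Approximate RAM needed just to hold the weights, in GiB."""
    n_bytes = n_params_billion * 1e9 * bits_per_weight / 8
    return n_bytes / 2**30

# Compare common model sizes at different precisions.
for name, params in [("7B", 7), ("13B", 13), ("30B", 30)]:
    for bits in (32, 16, 8, 4):
        print(f"{name} @ {bits}-bit: ~{approx_weight_gb(params, bits):.1f} GiB")
```

Under these assumptions, a 7B model needs roughly 26 GiB at fp32 but only about 3-4 GiB at 4-bit, and a 30B model drops to somewhere around 15-20 GiB when quantized to 4 bits, which is broadly how people run it on consumer RAM.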