Submitted by Qwillbehr t3_11xpohv in MachineLearning
ambient_temp_xeno t1_jd7fm8a wrote
Reply to comment by Gatensio in [D] Running an LLM on "low" compute power machines? by Qwillbehr
I have the 7B 4-bit alpaca.cpp running on my CPU (in virtualized Linux), plus this browser open, and 12.3/16 GB is free. So realistically, to use it without it taking over your computer, I'd say 16 GB of RAM is needed; 8 GB wouldn't cut it. It might apparently fit in 8 GB of system RAM, especially if it's running natively on Linux, but I haven't tried that. I tried to load the 13B and couldn't.
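For a sense of why 16 GB is comfortable and 8 GB is tight, here's a rough back-of-envelope sketch of the weight footprint. It assumes about 4.5 effective bits per weight to cover the per-block scale factors that 4-bit quantization stores alongside the weights (an assumption, not an exact figure for any particular quant format), and real alpaca.cpp usage adds the KV cache and scratch buffers on top of this:

```python
# Rough lower-bound RAM estimate for 4-bit quantized LLaMA/Alpaca weights.
# Treat the result as weights only; runtime buffers come on top.

def weight_footprint_gib(params_billion: float, bits_per_weight: float = 4.5) -> float:
    """Approximate weight memory in GiB.

    bits_per_weight ~4.5 is an assumed average that folds in the
    quantization block scale factors; it is not tied to one quant type.
    """
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / (1024 ** 3)

for name, params in [("7B", 7.0), ("13B", 13.0)]:
    print(f"{name}: ~{weight_footprint_gib(params):.1f} GiB for weights alone")

# Approximate output:
# 7B: ~3.7 GiB for weights alone
# 13B: ~6.8 GiB for weights alone
```

On that estimate, 13B weights alone are close to 7 GiB, which lines up with it failing to load when the VM only sees half of 16 GB.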
ambient_temp_xeno t1_jdcpvhv wrote
*Turns out WSL2 gives itself half your RAM by default. **13B seems, weirdly, to be not much better (and possibly worse) than 7B by some accounts anyway.
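If that default cap is the limiting factor, it can be raised with a `.wslconfig` file in the Windows user profile; the 12GB value below is just an illustrative choice, not a recommendation:

```ini
# %UserProfile%\.wslconfig
[wsl2]
memory=12GB   # example cap; recent WSL2 builds default to roughly half of physical RAM
```

Run `wsl --shutdown` and reopen the distro for the new limit to take effect.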