ambient_temp_xeno t1_jdcpvhv wrote
Reply to comment by ambient_temp_xeno in [D] Running an LLM on "low" compute power machines? by Qwillbehr
*Turns out WSL2 uses half your RAM size by default. **13B seems to be weirdly not much better/possibly worse by some accounts anyway.
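If you want to give the WSL2 VM more than that default half, a `.wslconfig` in your Windows user folder does it. Minimal sketch only; the 12GB figure is just an example, set whatever fits your machine:

```ini
# %UserProfile%\.wslconfig -- limits apply to the whole WSL2 VM
[wsl2]
memory=12GB   # default is 50% of host RAM; raise it so bigger models fit
```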
ambient_temp_xeno t1_jd7fm8a wrote
Reply to comment by Gatensio in [D] Running an LLM on "low" compute power machines? by Qwillbehr
I have the 7B 4-bit alpaca.cpp running on my CPU (on virtualized Linux) and also this browser open, with 12.3/16 GB free. So realistically, to use it without taking over your computer, I guess 16 GB of RAM is needed; 8 GB wouldn't cut it. I mean, it might fit in 8 GB of system RAM apparently, especially if it's running natively on Linux, but I haven't tried it. I tried to load the 13B and I couldn't.
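Rough numbers, if anyone wants a back-of-envelope for what fits. This is just a sketch: real llama.cpp/alpaca.cpp usage adds context buffers and runtime overhead on top of the weights, so the 1 GB overhead figure is a guess.

```python
# Ballpark RAM estimate for 4-bit quantized models (weights + assumed overhead).

def estimate_ram_gb(n_params_billion: float, bits_per_weight: float = 4.0,
                    overhead_gb: float = 1.0) -> float:
    """Approximate resident memory in GB for a quantized model."""
    weight_bytes = n_params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes / 1e9 + overhead_gb

for size in (7, 13):
    print(f"{size}B @ 4-bit: ~{estimate_ram_gb(size):.1f} GB")

# 7B  @ 4-bit: ~4.5 GB -> fits in the ~8 GB a default WSL2 VM gets on a 16 GB machine
# 13B @ 4-bit: ~7.5 GB -> too tight once the OS and a browser take their share
```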
ambient_temp_xeno t1_jdmdh2i wrote
Reply to comment by Nyanraltotlapun in [R] Reflexion: an autonomous agent with dynamic memory and self-reflection - Noah Shinn et al 2023 Northeastern University Boston - Outperforms GPT-4 on HumanEval accuracy (0.67 --> 0.88)! by Singularian2501
There has been work done on how to even start interacting with an extraterrestrial civilization, and that would probably be vastly harder than dealing with whatever intelligence is contained in a human-data-filled, human-trained model. https://www.nasa.gov/connect/ebooks/archaeology_anthropology_and_interstellar_communication.html
That said, it's the closest thing we have to that, so you're not 'wrong'.