Submitted by simpleuserhere t3_11usq7o in MachineLearning
pkuba208 t1_jcy717u wrote
Reply to comment by Art10001 in [Research] Alpaca 7B language model running on my Pixel 7 by simpleuserhere
I know, but Android uses 3-4 GB of RAM itself. I run it myself, so I know it uses 6-7 GB of RAM for the smallest model currently, even with 4-bit quantization.
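(A rough back-of-envelope sketch of where a figure like 6-7 GB could come from, assuming a 7B-parameter model and a generic 4-bit layout with per-block scaling metadata; every number in the sketch is an assumption for illustration, not a measurement from this thread.)

```python
# Rough memory estimate for a 4-bit quantized 7B model.
# All figures are illustrative assumptions, not measurements from this thread.

PARAMS = 7e9            # ~7 billion parameters (Alpaca/LLaMA 7B)
BITS_PER_WEIGHT = 4.5   # 4 bits per weight plus per-block scaling metadata (assumed)

weights_gb = PARAMS * BITS_PER_WEIGHT / 8 / 1e9
print(f"quantized weights alone: ~{weights_gb:.1f} GB")   # ~3.9 GB

# The 6-7 GB reported in the comment would then be weights plus the KV cache,
# activation buffers, and general runtime overhead on top of that.
android_gb = 3.5         # Android itself (the comment cites 3-4 GB)
reported_model_gb = 6.5  # midpoint of the 6-7 GB figure from the comment
print(f"model + OS: ~{reported_model_gb + android_gb:.1f} GB "
      f"-> more than 8 GB of RAM, hence the need for swap")
```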
Art10001 t1_jcy7rqs wrote
Yes, that's why it was tried on a Pixel 7, which has 8 GB of RAM and maybe even swap.
pkuba208 t1_jcy83gf wrote
I use swap too. For now, it can only run on flagships, though. You need at least 8 GB of RAM: running it on, say, 3 GB of free RAM (with another 3 GB used by the system) plus 3-5 GB of swap may not even be possible, and even if it is, it will be very slow and prone to crashing.
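(A quick budget check for that scenario, using the ~6-7 GB working-set figure from the earlier comment; the exact numbers are assumptions for illustration only.)

```python
# Memory budget for the scenario described: ~3 GB of RAM free after the system,
# plus 3-5 GB of swap. The required footprint is assumed from the ~6-7 GB figure
# reported above for the 4-bit 7B model.

required_gb = 6.5   # assumed working-set size of the quantized model
free_ram_gb = 3.0   # RAM left after Android's own usage
swap_gb = 4.0       # midpoint of the 3-5 GB swap mentioned

if free_ram_gb >= required_gb:
    print("fits in RAM")
elif free_ram_gb + swap_gb >= required_gb:
    # Fits only by spilling several GB into swap; flash-backed swap is far slower
    # than RAM, so inference crawls and the process risks being killed.
    print(f"fits only with ~{required_gb - free_ram_gb:.1f} GB in swap: "
          "very slow and crash-prone")
else:
    print("does not fit at all")
```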