wywywywy t1_j9b2kqu wrote
Reply to comment by xrailgun in [D] Large Language Models feasible to run on 32GB RAM / 8 GB VRAM / 24GB VRAM by head_robotics
So, not scientific at all, but I've noticed that checkpoint file size * 0.6 is pretty close to the actual VRAM requirement for an LLM.
But you're right, it'd be nice to have a table handy.
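For reference, a minimal sketch of that rule of thumb in Python, assuming a hypothetical helper name `estimate_vram_gb` and the standard `os.path.getsize`; the 0.6 factor is the anecdotal estimate from this comment, not a measured constant, and real usage will vary with precision, context length, and framework overhead:

```python
import os

def estimate_vram_gb(checkpoint_path: str, factor: float = 0.6) -> float:
    """Rough VRAM estimate (in GiB) from the checkpoint file size.

    The 0.6 factor is the rule of thumb from this thread; treat the
    result as a ballpark figure, not a guarantee.
    """
    size_bytes = os.path.getsize(checkpoint_path)
    return size_bytes / (1024 ** 3) * factor

# Example (hypothetical path): a ~13 GiB checkpoint would estimate to ~7.8 GiB of VRAM.
# print(f"{estimate_vram_gb('model.safetensors'):.1f} GiB")
```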