WH7EVR t1_jbngk56 wrote

It took roughly 120 GPU-years (A100 80GB) to train LLaMA, so training it from scratch would cost you an enormous amount of money and/or time. That said, you can fine-tune LLaMA as-is. There's no real point in recreating it.
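For a rough sense of scale, here's the back-of-envelope arithmetic. The cloud rate per A100-hour is an assumption for illustration, not a quoted price:

```python
# Back-of-envelope cost of ~120 A100-years of compute.
# $1.50/GPU-hour is an assumed on-demand cloud rate, not a real quote.
gpu_years = 120
rate_per_hour = 1.50

gpu_hours = gpu_years * 365 * 24       # 1,051,200 GPU-hours
cost = gpu_hours * rate_per_hour       # ~$1.58M at the assumed rate

print(f"{gpu_hours:,} GPU-hours, ~${cost / 1e6:.2f}M")
```

Even at discounted reserved pricing, you're looking at seven figures in compute alone, which is why fine-tuning the released weights is the sensible path.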