[D] Which AI model for RTX 3080 10GB? Submitted by SomeGuyInDeutschland on March 23, 2023 at 8:13 PM in MachineLearning
Civil_Collection7267 wrote on March 24, 2023 at 1:25 AM You can use 4-bit LLaMA 13B or 8-bit LLaMA 7B with the Alpaca LoRA; both are very good. If you need help, this guide explains everything.
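As a rough illustration of the 8-bit LLaMA 7B + Alpaca LoRA setup the comment describes, here is a minimal sketch using transformers, bitsandbytes, and peft. The repo names (`decapoda-research/llama-7b-hf`, `tloen/alpaca-lora-7b`) and the prompt format are assumptions, not something the commenter specified; substitute whatever weights and adapter you actually have.

```python
# Sketch: load LLaMA 7B in 8-bit and apply an Alpaca LoRA adapter (fits in ~10 GB VRAM).
# The model/adapter repo names below are assumptions for illustration only.
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

base_repo = "decapoda-research/llama-7b-hf"  # assumed base weights repo
lora_repo = "tloen/alpaca-lora-7b"           # assumed Alpaca LoRA adapter repo

tokenizer = LlamaTokenizer.from_pretrained(base_repo)
model = LlamaForCausalLM.from_pretrained(
    base_repo,
    load_in_8bit=True,        # 8-bit quantization via bitsandbytes
    torch_dtype=torch.float16,
    device_map="auto",        # place layers on the GPU automatically
)
model = PeftModel.from_pretrained(model, lora_repo)  # attach the LoRA weights

# Alpaca-style instruction prompt (format is an assumption based on common usage)
prompt = "### Instruction:\nExplain LoRA in one sentence.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The 4-bit LLaMA 13B route mentioned in the comment uses GPTQ-style quantization instead of `load_in_8bit`, which needs a separate toolchain.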