
LetterRip t1_j3l42en wrote

You can use GPT-J-6B in 8-bit and do finetuning on a single GPU with 11 GB of VRAM.

https://huggingface.co/hivemind/gpt-j-6B-8bit

You could probably do a fine-tune and test fairly cheaply using Google Colaboratory or Colaboratory Pro ($9.99/month).
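For illustration, here is a minimal sketch of one way to get GPT-J-6B running in 8-bit on a single GPU. It loads the stock EleutherAI checkpoint with bitsandbytes quantization through the transformers library rather than the pre-converted hivemind checkpoint linked above, so the model id and loading flags here are assumptions, not the exact recipe from that repo (which ships its own conversion code; see its model card):

```python
# Minimal sketch: GPT-J-6B in 8-bit on a single GPU.
# Assumes transformers, accelerate, and bitsandbytes are installed.
# Uses the stock EleutherAI checkpoint rather than hivemind/gpt-j-6B-8bit,
# which provides its own pre-quantized weights and loading code.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/gpt-j-6B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # place layers on the available GPU
    load_in_8bit=True,   # bitsandbytes 8-bit weights, roughly 6-7 GB of VRAM
)

inputs = tokenizer("The meaning of life is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```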

8

learningmoreandmore OP t1_j3l4cu0 wrote

Thanks! Would this be able to scale and handle more computation, or is it only suitable for personal use? I wonder why most people aren't using this version if it's so efficient computation-wise.

3

LetterRip t1_j3meu7o wrote

Same license as the 32-bit version, so commercial usage is fine (Apache-2.0; see the page for details). It should give similar results and scaling (according to the link above, inference is 1-10% slower).

2