LetterRip t1_j3l42en wrote
Reply to comment by learningmoreandmore in [D] I want to use GPT-J-6B for my story-writing project but I have a few questions about it. by learningmoreandmore
You can use the 8-bit version of GPT-J-6B and fine-tune it on a single GPU with 11 GB of VRAM.
https://huggingface.co/hivemind/gpt-j-6B-8bit
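Just as a rough sketch of what loading it looks like: the linked hivemind checkpoint ships its own loading code on its model card, but the general 8-bit loading pattern with the transformers + bitsandbytes libraries looks roughly like this (model name and sampling settings below are just examples, not anything specific to that checkpoint):

```python
# Minimal sketch: load GPT-J in 8-bit and generate a story continuation.
# Assumes the transformers, accelerate, and bitsandbytes packages are installed.
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-j-6B",
    device_map="auto",   # place layers on the available GPU automatically
    load_in_8bit=True,   # quantize weights to 8-bit so the model fits in ~11 GB of VRAM
)

prompt = "The old lighthouse keeper lit the lamp one last time,"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(inputs.input_ids, max_new_tokens=100, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```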
You could probably do a fine-tune and test fairly cheaply using Google Colaboratory or Colab Pro ($9.99/month).
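For the fine-tuning part, the hivemind model card links its own training notebook, so treat the following only as the general parameter-efficient pattern (8-bit frozen base weights plus small trainable adapters) using the peft and datasets libraries; the dataset file name is hypothetical, and it continues from the `model`/`tokenizer` in the sketch above:

```python
# Rough sketch of LoRA-style fine-tuning on top of an 8-bit base model.
# Assumes the peft and datasets packages; "my_stories.txt" is a placeholder training file.
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training
from transformers import Trainer, TrainingArguments, DataCollatorForLanguageModeling
from datasets import load_dataset

model = prepare_model_for_kbit_training(model)   # freeze quantized weights, enable grads where needed
lora_config = LoraConfig(
    r=8, lora_alpha=32, lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],          # GPT-J attention projections
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)        # only the small adapter weights get trained

dataset = load_dataset("text", data_files={"train": "my_stories.txt"})
def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)
tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="gptj-stories",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,   # keeps memory use low enough for an 11 GB card
        num_train_epochs=1,
        fp16=True,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```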
learningmoreandmore OP t1_j3l4cu0 wrote
Thanks! Would this be able to scale and handle more computation, or is this only for personal use? I wonder why most people aren't using this version if it's so computationally efficient.