Submitted by austintackaberry t3_120usfk in MachineLearning
throwaway2676 t1_jdl0y80 wrote
Reply to comment by mxby7e in [R] Hello Dolly: Democratizing the magic of ChatGPT with open models by austintackaberry
Alpaca was only trained on ~52k instructions, right? A large group of grad students or a forum like Reddit could construct that many manually in a couple of weeks. I'm surprised they even had to resort to using ClosedAI.
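(Rough back-of-the-envelope sketch of that claim below; the contributor count and per-person throughput are assumptions for illustration, not numbers from the thread.)

```python
# Back-of-the-envelope check: could volunteers hand-write ~52k
# instruction/response pairs in a couple of weeks?
# All inputs below are assumptions, not measured data.

contributors = 500            # assumed size of a grad-student group or active subreddit
examples_per_person_day = 5   # assumed pairs each person writes per day
days = 14                     # "a couple of weeks"
target = 52_000               # approximate size of the Alpaca instruction set

total = contributors * examples_per_person_day * days
needed_rate = target / (contributors * days)

print(f"Estimated examples written: {total:,}")          # 35,000 under these assumptions
print(f"Rate needed to hit target: {needed_rate:.1f}/day") # ~7.4 per person per day
```

With those assumed numbers the target is within reach but not trivial, which is roughly the point being debated.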
mxby7e t1_jdl18t6 wrote
Maybe. Open Assistant (the LAION-led project) is doing this type of manual dataset collection. The training data and the model weights are supposed to be released once training is complete.