dwarfarchist9001 t1_j9lb1wl wrote
Reply to comment by gelukuMLG in What. The. ***k. [less than 1B parameter model outperforms GPT 3.5 in science multiple choice questions] by Destiny_Knight
Yes, but how many parameters do you actually need to store all the knowledge you realistically require? Maybe a few billion parameters are enough to store the basics of every concept known to man, while more specific details could be kept in an external file that the neural net accesses via API calls.
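A minimal sketch of the idea in that comment: a small model keeps only general knowledge, while specific facts live in an external store that is queried on demand (this is essentially retrieval augmentation). All names here (`KnowledgeStore`, `lookup`, `answer`) are hypothetical, not from any particular library.

```python
class KnowledgeStore:
    """External key-value store standing in for a database or API endpoint."""

    def __init__(self):
        self._facts = {}

    def add(self, topic, detail):
        self._facts[topic] = detail

    def lookup(self, topic):
        # In a real system this would be an API call or a vector search,
        # not an in-memory dict lookup.
        return self._facts.get(topic, "no stored detail")


def answer(question_topic, store):
    """Compose a reply from retrieved detail rather than model weights."""
    detail = store.lookup(question_topic)
    return f"{question_topic}: {detail}"


store = KnowledgeStore()
store.add("boiling point of water", "100 C at 1 atm")
print(answer("boiling point of water", store))  # prints "boiling point of water: 100 C at 1 atm"
```

The point of the design is that the store can grow without retraining or enlarging the model; only the lookup interface has to stay fixed.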
gelukuMLG t1_j9lfp3j wrote
You mean like a LoRA?