Submitted by ortegaalfredo t3_11kr20f in MachineLearning
ReginaldIII t1_jb9goco wrote
Reply to comment by ortegaalfredo in [R] Created a Discord server with LLaMA 13B by ortegaalfredo
Link to your code? It needs to be GPLv3 to be compliant with LLaMA's licensing.
How are you finding the quality of the output? I've had a little play around with the model but wasn't overly impressed. That said, a model with this many parameters is a nice test bed for looking at things like pruning methods.
abnormal_human t1_jb9kyzr wrote
Actually, it doesn't. GPLv3 only requires that if OP distributes a binary to someone, the source used to produce that binary is also made available. With server-side code, the binary isn't being distributed, so there's no obligation to distribute the source.
ReginaldIII t1_jb9xlil wrote
Fair enough, I didn't realize that hosting a publicly available service isn't the same as distributing.
markasoftware t1_jbdbgm9 wrote
See the AGPL, which is closer to what you were imagining: it extends the source-availability requirement to software accessed over a network.
ortegaalfredo OP t1_jbi81mn wrote
I posted the GitHub repo in the original post. The output was bad because Meta's original generator is quite bad. I upgraded it today and it's much better now. Still not ChatGPT, though.