
ReginaldIII t1_jb9goco wrote

Link to your code? It needs to be GPLv3 to be compliant with LLaMA's licensing.

How are you finding the quality of the output? I've had a little play around with the model but wasn't overly impressed. That said, a nice big parameter set like this is a nice test bed for looking at things like pruning methods.

−4

abnormal_human t1_jb9kyzr wrote

Actually, it doesn't. GPLv3 only requires that if OP distributes a binary to someone, the source used to produce that binary is also made available. With server-side code, no binary is being distributed, so there's no obligation to distribute the source.

13

ReginaldIII t1_jb9xlil wrote

Fair enough, I didn't realize that hosting a publicly available service isn't the same as distributing.

3

ortegaalfredo OP t1_jbi81mn wrote

I posted the GitHub repo in the original post. The output is bad because Meta's original generator is quite bad. I upgraded it today and it's much better now. Still not ChatGPT.

1