Comments

2muchnet42day t1_jd7upsm wrote

It's awesome. Thank you for your work.

I'd like to know why you didn't take the LoRA approach to fine-tuning LLaMA (sketched below). Is full fine-tuning better?

2
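
For context on the question above, here is a minimal sketch of what the LoRA approach typically looks like with Hugging Face's peft library. The checkpoint name and hyperparameters are illustrative assumptions, not the author's actual setup:

```python
# Minimal LoRA fine-tuning setup with Hugging Face peft.
# The checkpoint path and hyperparameters below are assumptions for illustration.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model, TaskType

# Hypothetical LLaMA checkpoint; substitute whatever base weights you have.
base = AutoModelForCausalLM.from_pretrained("decapoda-research/llama-7b-hf")

config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                                   # rank of the low-rank update matrices
    lora_alpha=16,                         # scaling factor for the LoRA updates
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],   # attention projections to adapt
)

model = get_peft_model(base, config)
model.print_trainable_parameters()  # typically well under 1% of the full model
```

Because only the small adapter matrices are trained while the base weights stay frozen, LoRA usually needs far less GPU memory than full fine-tuning, which is why the question comes up.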

Nice_Cod7781 t1_jd8avf1 wrote

Why release this without the weights? It just forces people to spend extra time and energy reproducing something that could have been provided up front. That's bad from a cooperative standpoint, and it doesn't help the environment either.

You're not commercializing this, so it's not like you'd get into any legal trouble for releasing the model.

16

radi-cho t1_jd8gdzt wrote

Great, congratulations! I was planning to attempt basically the same thing, so thanks for open-sourcing it :)

1