Submitted by Business-Lead2679 t3_1271po7 in MachineLearning
yehiaserag t1_jed4dee wrote
I'm lost, it says open-source... and I can't see any mention of the weights, a download link, or a Hugging Face repo.
On the website it says "We plan to release the model weights by providing a version of delta weights that build on the original LLaMA"
Please no LoRA for that; LoRA is always associated with degraded inference quality.
anothererrta t1_jedvpu5 wrote
If you read the blog post, you will actually see the weights mentioned.
gliptic t1_jee0fbk wrote
Delta weights don't mean LoRA. They're just the difference (e.g. an XOR) between their new weights and the original weights.
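To illustrate the point, here is a minimal NumPy sketch of the XOR flavor of delta weights (the function names are hypothetical, not from any actual release script; real releases may use additive deltas instead). Floats are reinterpreted as raw integer bit patterns, so the XOR is lossless and exactly invertible:

```python
import numpy as np

def make_delta(original: np.ndarray, finetuned: np.ndarray) -> np.ndarray:
    """Delta = bitwise XOR of the two weight tensors' bit patterns."""
    return np.bitwise_xor(original.view(np.uint16), finetuned.view(np.uint16))

def apply_delta(original: np.ndarray, delta: np.ndarray) -> np.ndarray:
    """Recover the fine-tuned weights: original XOR delta == finetuned, bit-exact."""
    return np.bitwise_xor(original.view(np.uint16), delta).view(np.float16)

# Toy stand-ins for the original LLaMA weights and the fine-tuned weights
original = np.random.rand(4, 4).astype(np.float16)
finetuned = original + np.float16(0.01)

delta = make_delta(original, finetuned)  # distributable without the originals
assert np.array_equal(apply_delta(original, delta), finetuned)
```

The delta alone reveals nothing usable without the original LLaMA weights, which is why it sidesteps the license restriction: only users who already obtained LLaMA can reconstruct the new model.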
light24bulbs t1_jeeb9cx wrote
Nice way to get around the license problem.
Is LoRA really associated with a quality loss? I thought it worked pretty well.
yehiaserag t1_jegqni6 wrote
There are lots of comparisons that show this; that's why people created Alpaca Native, to reach the quality described in the original paper.