Submitted by Business-Lead2679 t3_1271po7 in MachineLearning
gliptic t1_jee0fbk wrote
Reply to comment by yehiaserag in [P] Introducing Vicuna: An open-source language model based on LLaMA 13B by Business-Lead2679
Delta weights don't mean LoRA. They're just the difference (e.g. an XOR or subtraction) between the new weights and the original weights.
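A minimal sketch of the delta-weight idea, assuming simple element-wise subtraction (function names here are illustrative, not Vicuna's actual release script): the project publishes only the difference, and anyone who already has the licensed base weights can add it back to reconstruct the fine-tuned model.

```python
import numpy as np

def make_delta(tuned_w: np.ndarray, base_w: np.ndarray) -> np.ndarray:
    # Publish only the difference; the base weights stay under their original license.
    return tuned_w - base_w

def apply_delta(base_w: np.ndarray, delta: np.ndarray) -> np.ndarray:
    # Holders of the base weights reconstruct the fine-tuned weights locally.
    return base_w + delta

# Toy example with a single weight tensor.
base = np.array([0.5, -1.2, 3.0])
tuned = np.array([0.7, -1.0, 2.5])

delta = make_delta(tuned, base)
recovered = apply_delta(base, delta)
assert np.allclose(recovered, tuned)
```

The same trick works with XOR on the raw bytes instead of floating-point subtraction; either way the published file is useless without the original weights.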
light24bulbs t1_jeeb9cx wrote
Nice way to get around the license problem.
Is LoRA really associated with a quality loss? I thought it worked pretty well.
yehiaserag t1_jegqni6 wrote
There are lots of comparisons that show this; that's why people created Alpaca Native, to reach the quality described in the original paper.