Submitted by seraphaplaca2 t3_122fj05 in MachineLearning
[deleted] t1_jdq3nn5 wrote
[deleted]
tdgros t1_jdqarbq wrote
What's the connection between LoRA and the question about merging weights here?

edit: weird, I saw a notification for an answer from you, but can't see the message...

LoRA is a parameter-efficient fine-tuning method: it keeps the pretrained weights frozen and learns a low-rank update to each weight matrix for a single task. It does not merge separate models or their weights.
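To make the distinction concrete, here is a minimal NumPy sketch of the LoRA idea (hypothetical shapes and names, not any library's API): the frozen weight `W` is left untouched, a low-rank product `B @ A` carries the task-specific update, and that update can later be folded into `W` for inference. This is the only sense in which LoRA "merges" anything, and it is not merging two independent models.

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r = 8, 16, 2  # rank r is much smaller than d_out, d_in

W = rng.standard_normal((d_out, d_in))       # frozen pretrained weight
# LoRA factors (in practice B starts at zero and both are trained;
# random values here just to make the demo nontrivial)
B = rng.standard_normal((d_out, r)) * 0.1
A = rng.standard_normal((r, d_in)) * 0.1

x = rng.standard_normal(d_in)

# Forward pass with the adapter applied on the side of the frozen weight.
y_adapter = W @ x + B @ (A @ x)

# After training, the low-rank update can be folded into W, so inference
# costs nothing extra. Note this merges an adapter into its OWN base model,
# not two different models into one.
W_merged = W + B @ A
y_merged = W_merged @ x

assert np.allclose(y_adapter, y_merged)
```

The assertion holds because `(B @ A) @ x == B @ (A @ x)` up to floating-point error; folding the adapter in changes nothing about the layer's function.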