[deleted] t1_jdq3nn5 wrote

[deleted]

tdgros t1_jdqarbq wrote

What's the connection between LoRA and the question about merging weights here?

edit: weird, I saw a notification for an answer from you, but can't see the message...

LoRA (Low-Rank Adaptation) is a parameter-efficient fine-tuning technique: it freezes a model's weight matrices and learns small low-rank update matrices on top of them for a single task. It does not merge models or weights.
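Since LoRA often gets conflated with weight merging, here's a minimal sketch of the idea in PyTorch. It's illustrative only: the class name, init scheme, and hyperparameters (`r`, `alpha`) are my own assumptions, not any particular library's API.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """A frozen base linear layer plus a trainable low-rank update B @ A."""

    def __init__(self, in_features: int, out_features: int, r: int = 8, alpha: int = 16):
        super().__init__()
        # The original weight matrix W stays frozen during fine-tuning.
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad_(False)
        # Only the low-rank factors are trained: r * (in + out) parameters
        # instead of in * out. B is zero-initialized so the layer starts
        # out identical to the base model.
        self.A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_features, r))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Effective weight is W + scaling * (B @ A): a per-task update to
        # one model, not a merge of weights from different models.
        return self.base(x) + (x @ self.A.T @ self.B.T) * self.scaling

layer = LoRALinear(768, 768, r=8)
y = layer(torch.randn(2, 768))  # shape: (2, 768)
```

After training, the update can be folded into the base weight (W + scaling * B @ A) for inference, but that just bakes one task's adaptation into a single model; it's not a method for merging two separately trained models.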
