tdgros t1_jdqarbq wrote

What's the connection between LoRA and the question about merging weights here?

edit: weird, I saw a notification for an answer from you, but can't see the message...

LoRA is a parameter-efficient fine-tuning method: it freezes the pretrained weights and learns low-rank updates to the weight matrices for a single task. It does not merge different models or their weights.
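
A minimal sketch of the idea in PyTorch, with assumed layer sizes and rank (not the paper's actual code):

```python
import torch

d, k, r = 768, 768, 8           # hypothetical layer dimensions and LoRA rank
W = torch.randn(d, k)           # frozen pretrained weight, never updated
A = torch.randn(r, k) * 0.01    # trainable low-rank factor
B = torch.zeros(d, r)           # init to zero so the update starts as a no-op

x = torch.randn(k)
h = W @ x + B @ (A @ x)         # forward pass: base output plus low-rank update
```

So the only "merging" involved is folding an adapter back into its own base model (W + B @ A), which is not the same thing as merging two separately trained models.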
