Submitted by seraphaplaca2 t3_122fj05 in MachineLearning
tdgros t1_jdqarbq wrote
Reply to comment by [deleted] in Is it possible to merge transformers? [D] by seraphaplaca2
What's the connection between LoRA and the question about merging weights here?
edit: weird, I saw a notification for an answer from you, but I can't see the message...
LoRA is a parameter-efficient fine-tuning method: it freezes the pretrained weights and learns a low-rank update (delta_W = B @ A) on top of them for a single task. It does not merge different models or their weights.
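For reference, here's a minimal sketch of the distinction (dimensions, the layer, and the averaging step are illustrative, not from the thread):

```python
import torch

d, k, r = 768, 768, 8           # input dim, output dim, LoRA rank (r << d, k)
W = torch.randn(k, d)           # frozen pretrained weight of one linear layer

# LoRA: learn a low-rank update delta_W = B @ A for a single task.
# Only A and B are trained; W stays frozen.
A = torch.randn(r, d) * 0.01    # Gaussian init, per the LoRA paper
B = torch.zeros(k, r)           # zero init, so B @ A starts at 0

def lora_forward(x):
    # Effective weight is W + B @ A; W itself is never modified.
    return x @ (W + B @ A).T

# Merging, by contrast, combines the weights of *different* models,
# e.g. a plain average of two fine-tuned checkpoints (hypothetical):
W_task1, W_task2 = torch.randn(k, d), torch.randn(k, d)
W_merged = 0.5 * (W_task1 + W_task2)
```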
[deleted] t1_jdqc0ax wrote
[removed]