
nicholsz t1_j728g2l wrote

OS. Kernel. Bus. Processor. Transistor. p-n junction

15

juanigp t1_j73a6z4 wrote

That was my small contribution: self-attention is just a bunch of matrix multiplications. With 12 layers of the same block, it makes sense to understand why QK^T looks the way it does (see the sketch below). If the question had been how to understand maskrcnn, the answer would have been different.

Edit: 12 layers in ViT base / BERT base
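For illustration only, a minimal NumPy sketch of one scaled dot-product self-attention layer; the function and variable names are mine, not from the thread:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Project the same input into queries, keys, and values
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    # QK^T gives pairwise similarity scores between tokens
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax turns scores into attention weights, which mix the values
    return softmax(scores, axis=-1) @ V

# Toy example (hypothetical sizes): 4 tokens, model dim 8
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

A transformer encoder block just stacks this (plus an MLP and residual connections) 12 times in ViT-Base / BERT-Base, so the whole forward pass is essentially repeated matrix multiplications.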

0