
drewfurlong t1_j0a3mbn wrote

Have you come across any excellent explanations of how the various attention layers work? Ideally with worked examples and graphics.

After reading the Wikipedia article on the topic, and getting six pages into "Attention Is All You Need", I'm thoroughly confused. It's tough to keep track of what's a key, query, and value, where the recurrent layer goes, etc.
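For what it's worth, here's a minimal numpy sketch of scaled dot-product attention as I currently understand it (toy shapes and names are my own, not from the paper): each query is scored against every key, the scores become softmax weights, and the output is a weighted mix of the values.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention.
    Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v).
    Returns (n_queries, d_v)."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # how well each query matches each key
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V                   # weighted average of the values

# toy example: 2 queries attending over 3 key/value pairs
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = attention(Q, K, V)
print(out.shape)  # (2, 4)
```

No recurrence involved in this part, which is what tripped me up: the paper replaces recurrent layers with stacks of these attention blocks.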

1