
Erosis t1_j72rzdl wrote

You'll probably be fine learning transformers directly, but a better understanding of RNNs might make some of the NLP tutorials/papers containing transformers more easily comprehensible.

Attention is a very important component of transformers, but attention can be applied to RNNs, too.
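For intuition, here's a minimal sketch (my own illustration, not from any particular paper) of dot-product attention over a sequence of RNN hidden states, using the last hidden state as the query:

```python
import numpy as np

def attention(hidden_states, query):
    """Dot-product attention over RNN outputs.

    hidden_states: (T, d) array of per-timestep RNN outputs.
    query: (d,) vector, e.g. the final hidden state.
    Returns a (d,) context vector and the (T,) attention weights.
    """
    scores = hidden_states @ query            # (T,) similarity per timestep
    weights = np.exp(scores - scores.max())   # softmax (shifted for stability)
    weights /= weights.sum()
    context = weights @ hidden_states         # weighted sum of hidden states
    return context, weights

# Toy example: 5 timesteps, hidden size 4
rng = np.random.default_rng(0)
h = rng.normal(size=(5, 4))
context, w = attention(h, h[-1])
print(w.sum())  # weights form a distribution over timesteps
```

A transformer does essentially this, but with learned query/key/value projections and no recurrence, so every position attends to every other in parallel.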

3

SAbdusSamad OP t1_j759v4v wrote

I agree that having a background in RNNs, and in attention as applied to RNNs, can make the learning process for transformers, and by extension ViT, much easier.

1