trnka t1_j3i3vk4 wrote
Reply to comment by throwaway2676 in [D] Simple Questions Thread by AutoModerator
If your input is only ever a single word, that's right.
Usually, though, people work with texts, i.e. sequences of words. The embedding layer maps the sequence of word indices to a sequence of embedding vectors. Equivalently, it could be implemented as a sequence of one-hot encodings, each multiplied by the same weight matrix W.
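Here's a minimal PyTorch sketch of that equivalence (the framework, vocabulary size, and embedding dimension are just illustrative assumptions, not something specified in the question): the lookup done by an embedding layer gives the same result as one-hot encoding each word and multiplying by the layer's weight matrix W.

```python
import torch
import torch.nn as nn

# Toy sizes for illustration (hypothetical, not from the original question)
vocab_size, embed_dim = 10, 4
torch.manual_seed(0)

# An embedding layer holding the weight matrix W of shape (vocab_size, embed_dim)
embedding = nn.Embedding(vocab_size, embed_dim)
W = embedding.weight

# A sequence of word indices, e.g. a 5-word sentence
word_ids = torch.tensor([1, 4, 2, 7, 4])

# View 1: the embedding layer looks up one row of W per word index
seq_embeddings = embedding(word_ids)                      # shape (5, embed_dim)

# View 2: one-hot encode each word, then multiply by the same W
one_hot = nn.functional.one_hot(word_ids, num_classes=vocab_size).float()
seq_embeddings_matmul = one_hot @ W                       # shape (5, embed_dim)

# Both views produce the same sequence of embedding vectors
assert torch.allclose(seq_embeddings, seq_embeddings_matmul)
```

In practice the lookup version is what libraries actually do, since multiplying by a mostly-zero one-hot matrix wastes computation, but conceptually they're the same linear map.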