Submitted by Thijs-vW t3_y9xjnh in MachineLearning
Thijs-vW OP t1_it82bgp wrote
Reply to comment by Travolta1984 in [Discussion] Categorical Encoding In Deep Learning by Thijs-vW
I looked into the embedding layer in Keras, but I was not impressed. Embedding layers are merely fancy lookup tables. That is fine when you want to encode sentences or the like, but my variable has only 51 categories. In that case, a dense layer applied to the one-hot encoded variable would achieve the same result, if I am not mistaken.
TheCloudTamer t1_itau2sm wrote
Embeddings are dense layers, just specialized for one-hot vector inputs.
Thijs-vW OP t1_itbb6xe wrote
Thanks for the clarification. So if I one-hot encode my categorical variable and feed it to a dense layer, I would achieve the same as with an embedding layer?
Thijs-vW OP t1_itbjlyh wrote
Apparently yes: https://stackoverflow.com/a/57807971/15589661
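The equivalence discussed above can be sketched with numpy: an embedding layer looks up row i of a weight matrix, and a bias-free dense layer applied to the one-hot encoding of i selects exactly the same row via matrix multiplication. This is a minimal sketch, not Keras code; the category count 51 comes from the thread, while the embedding dimension 8 and the index 17 are arbitrary choices for illustration.

```python
import numpy as np

# 51 categories (from the thread); embedding dimension 8 (arbitrary).
n_categories, embed_dim = 51, 8
rng = np.random.default_rng(0)
W = rng.normal(size=(n_categories, embed_dim))  # shared weight matrix

category = 17  # an example category index

# Embedding layer: a direct lookup of row `category` in W.
embedded = W[category]

# Dense layer without bias on a one-hot input: one_hot @ W
# zeroes out every row of W except row `category`.
one_hot = np.zeros(n_categories)
one_hot[category] = 1.0
dense_out = one_hot @ W

# Both paths yield the identical 8-dimensional vector.
assert np.allclose(embedded, dense_out)
```

The practical difference is efficiency: the lookup skips the multiplication by all the zeros, which is why embedding layers exist as a separate construct.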