Submitted by Constant-Cranberry29 t3_11mokqu in deeplearning
BamaDane t1_jbjhitr wrote
Reply to comment by neuralbeans in Can feature engineering avoid overfitting? by Constant-Cranberry29
I’m not sure I understand what your method does. If Y is the output, are you saying I should also include Y as an input? And if I manage to design my model so it doesn’t just select the Y input, then I’m not overfitting? It makes sense that this wouldn’t overfit, but doesn’t it also mean I’m dumbing down my model? Don’t I want my model to preferentially select the features that are most similar to the output?
neuralbeans t1_jbjizpw wrote
It's a degenerate case, not something anyone should actually do. If you include Y in your input, then "overfitting" will give the best generalisation, because all the model has to do is copy the Y feature. This shows that the choice of input does affect overfitting. In fact, the more similar the input is to the output, the simpler the model can be, and thus the less it can overfit.
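Here's a minimal sketch of that degenerate case, assuming a synthetic dataset and scikit-learn's LinearRegression (the data shapes and library choice are just for illustration, not anything from the thread). A model fit on pure noise features overfits the training set and generalises poorly; the same model with Y added as a feature fits the training data perfectly and also generalises perfectly, since it only needs to copy that column.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_train, n_test, n_noise = 50, 1000, 20

# Targets drawn independently of the noise features.
y_train = rng.normal(size=n_train)
y_test = rng.normal(size=n_test)

# Case 1: pure noise features -> high train R^2, near-zero (or negative) test R^2.
X_train_noise = rng.normal(size=(n_train, n_noise))
X_test_noise = rng.normal(size=(n_test, n_noise))
noise_model = LinearRegression().fit(X_train_noise, y_train)
print("noise features, train R^2:", noise_model.score(X_train_noise, y_train))
print("noise features, test  R^2:", noise_model.score(X_test_noise, y_test))

# Case 2: the same noise features plus Y itself as an extra column.
# Perfectly fitting the training data now coincides with perfect generalisation:
# the model just learns coefficient 1 on the Y column and ~0 everywhere else.
X_train_leak = np.column_stack([X_train_noise, y_train])
X_test_leak = np.column_stack([X_test_noise, y_test])
leak_model = LinearRegression().fit(X_train_leak, y_train)
print("Y included, train R^2:", leak_model.score(X_train_leak, y_train))
print("Y included, test  R^2:", leak_model.score(X_test_leak, y_test))
```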