Submitted by radi-cho t3_11izjc1 in MachineLearning
Chadssuck222 t1_jb15xxk wrote
Noob question: why title this research as ‘reducing under-fitting’ and not as ‘improving fitting of the data’?
Toast119 t1_jb16zt9 wrote
I think it's because dropout is usually seen as a method for reducing overfitting, and this paper is claiming (and providing evidence) that it can also reduce underfitting.
farmingvillein t1_jb18evq wrote
Yes. In the first two lines of the abstract:
> Introduced by Hinton et al. in 2012, dropout has stood the test of time as a regularizer for preventing overfitting in neural networks. In this study, we demonstrate that dropout can also mitigate underfitting when used at the start of training.
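For context, here's a minimal sketch (mine, not from the paper) of what "dropout used at the start of training" could look like in a PyTorch-style training loop; the drop rate, the cutoff iteration, and the toggling mechanism are assumptions for illustration, not details taken from the abstract:

```python
import itertools
import torch.nn as nn

def set_dropout_rate(model: nn.Module, p: float) -> None:
    # Adjust the drop probability of every nn.Dropout layer in place.
    for m in model.modules():
        if isinstance(m, nn.Dropout):
            m.p = p

def train_with_early_dropout(model, loader, optimizer, loss_fn,
                             total_iters=10_000, early_iters=2_000, p=0.1):
    # Hypothetical schedule: dropout is active only for the first
    # `early_iters` iterations, then switched off for the rest of training.
    set_dropout_rate(model, p)
    data = itertools.cycle(loader)          # iterate by steps, not epochs
    for it in range(total_iters):
        if it == early_iters:
            set_dropout_rate(model, 0.0)    # disable dropout from here on
        xb, yb = next(data)
        optimizer.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        optimizer.step()
```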
BrotherAmazing t1_jb37vx3 wrote
It’s sort of a “clickbait” title that I didn’t like, even if the paper itself is potentially interesting.
Usually we assume dropout helps prevent overfitting, not that it helps with underfitting. What I don’t like about the title is that it makes it sound like dropout helps with underfitting in general. It does not, and the authors don’t even claim it does: by the time you finish reading the abstract, you can tell they’re only saying dropout has been observed to help with underfitting in certain circumstances, when used in certain ways.
I can come up with low-dimensional counter-examples where dropout won’t help you when you’re underfitting, and where it is in fact the cause of the underfitting.
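To make that concrete, here's a rough sketch of the kind of counter-example I mean (my own toy setup, not from the paper): a noiseless 1-D linear regression that a single weight can fit exactly, where dropout on the input acts like a strong ridge penalty, shrinks the learned weight, and leaves the model underfitting.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Noiseless 1-D regression: y = 2x. A single linear weight fits it exactly.
x = torch.linspace(-1, 1, 256).unsqueeze(1)
y = 2 * x

def fit(p_drop):
    model = nn.Sequential(nn.Dropout(p=p_drop), nn.Linear(1, 1, bias=False))
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    for _ in range(2000):
        opt.zero_grad()
        loss = ((model(x) - y) ** 2).mean()
        loss.backward()
        opt.step()
    model.eval()  # dropout off at evaluation time
    with torch.no_grad():
        return model[1].weight.item(), ((model(x) - y) ** 2).mean().item()

for p in (0.0, 0.5):
    w, mse = fit(p)
    print(f"p={p}: learned weight ~ {w:.2f}, eval MSE ~ {mse:.3f}")
# With p=0.0 the weight converges to ~2 (perfect fit); with p=0.5 the
# multiplicative dropout noise shrinks the weight toward ~1, so the model
# underfits even this trivially learnable data.
```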
ElleLeonne t1_jb16hic wrote
Maybe it hurts generalization? I.e., causes overfitting?
There could even be a second paper in the works to address this question.