
farmingvillein t1_jb18evq wrote

Yes. In the first two lines of the abstract:

> Introduced by Hinton et al. in 2012, dropout has stood the test of time as a regularizer for preventing overfitting in neural networks. In this study, we demonstrate that dropout can also mitigate underfitting when used at the start of training.
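
For anyone curious what "dropout at the start of training" looks like in practice, here's a minimal PyTorch sketch of the early-dropout idea: keep dropout active for the first few epochs, then zero it out for the rest of training. The model, dropout rate, epoch cutoff, and dummy data are all illustrative assumptions, not the paper's exact recipe.

```python
import torch
import torch.nn as nn

# Sketch of "early dropout": dropout is active only during the first
# few epochs, then disabled. EARLY_DROPOUT_EPOCHS is a hypothetical
# cutoff, not a value from the paper.

model = nn.Sequential(
    nn.Linear(32, 64),
    nn.ReLU(),
    nn.Dropout(p=0.1),  # active only in the early phase
    nn.Linear(64, 10),
)

EARLY_DROPOUT_EPOCHS = 5

optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(20):
    # Once the early phase ends, turn every Dropout module off by
    # setting its drop probability to zero.
    if epoch == EARLY_DROPOUT_EPOCHS:
        for m in model.modules():
            if isinstance(m, nn.Dropout):
                m.p = 0.0

    x = torch.randn(16, 32)          # dummy batch
    y = torch.randint(0, 10, (16,))  # dummy labels
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```

The paper's pitch is the mirror image of standard practice: instead of dropout regularizing an overfitting model late in training, a brief dose of it early on helps an underfitting model.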
