Submitted by AutoModerator t3_11pgj86 in MachineLearning
josejo9423 t1_jcpu2pe wrote
Reply to comment by EcstaticStruggle in [D] Simple Questions Thread by AutoModerator
I would go with 1, but I wouldn't tune early stopping itself, just the number of estimators. XGBoost has an early-stopping option that halts training when the evaluation metric stops improving. If you plot the metric per iteration and see where training could have stopped, set the number of estimators to the value you consider safe before overfitting.
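A minimal sketch of that workflow. It uses scikit-learn's GradientBoostingClassifier as a stand-in for XGBoost (its `n_iter_no_change` parameter plays the same role as xgboost's `early_stopping_rounds`); the dataset and all parameter values are made up for illustration:

```python
# Sketch: let early stopping pick the tree count, then fix it.
# Uses scikit-learn as a stand-in for XGBoost; all values are
# illustrative, not the commenter's actual settings.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

# Step 1: fit with a deliberately large estimator budget and let
# early stopping cut training off once the internal validation
# score stalls for 10 consecutive rounds.
probe = GradientBoostingClassifier(
    n_estimators=1000,        # generous upper bound
    n_iter_no_change=10,      # stop after 10 rounds w/o improvement
    validation_fraction=0.2,
    random_state=0,
)
probe.fit(X_train, y_train)

# Step 2: read off how many trees were actually built before the
# stop fired, and use that as a fixed n_estimators for the final
# model, so later cross-validation tunes only the other knobs.
best_n = probe.n_estimators_
final = GradientBoostingClassifier(n_estimators=best_n, random_state=0)
final.fit(X_train, y_train)
```

With xgboost itself the equivalent move is passing `early_stopping_rounds` plus an `eval_set` to `fit`, then reading `best_iteration` from the trained booster.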
EcstaticStruggle t1_jcthdzz wrote
Thanks. This was something I tried earlier. I noticed that using the maximum number of estimators almost always led to the highest cross-validation score. I was worried there would be some overfitting as a result.