Submitted by MichelMED10 t3_y9yuza in MachineLearning
MichelMED10 OP t1_it9d28r wrote
Reply to comment by m98789 in [D][R] Staking XGBOOST and CNN/Transformer by MichelMED10
Yes, but the idea is to train the model end to end first. Then, if we feed the extracted features to an XGBoost, in the worst case the XGBoost will perform as well as the original classifier head. And we will still have trained our model end to end prior to freezing the encoder. So theoretically it seems better.
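For anyone skimming, here is a minimal sketch of that pipeline (PyTorch + xgboost; the architecture, data shapes, and hyperparameters are all illustrative, not the OP's actual setup): train a small CNN end to end, freeze the encoder, then fit XGBoost on the extracted features in place of the linear head.

```python
import numpy as np
import torch
import torch.nn as nn
import xgboost as xgb

# Hypothetical toy data: 256 grayscale 28x28 images, 10 classes.
X = torch.randn(256, 1, 28, 28)
y = torch.randint(0, 10, (256,))

encoder = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),   # -> (N, 16) feature vectors
)
head = nn.Linear(16, 10)
model = nn.Sequential(encoder, head)

# 1) Train end to end with a standard classification loss.
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for _ in range(5):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

# 2) Freeze the encoder and extract features.
encoder.eval()
with torch.no_grad():
    feats = encoder(X).numpy()

# 3) Fit XGBoost on the frozen features instead of the linear head.
booster = xgb.XGBClassifier(n_estimators=100, max_depth=4)
booster.fit(feats, y.numpy())
print("train accuracy:", booster.score(feats, y.numpy()))
```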
SlowFourierT198 t1_itbv7y2 wrote
XGBoost is not strictly better than NNs at classification. I can guarantee that, as I worked on a classification dataset where an NN performed significantly better than XGBoost. While your statement may well hold for small-sample datasets, I am pretty sure it will not hold for large datasets with complicated features.