MichelMED10 OP t1_it9d28r wrote

Yes, but the idea is to train the model end to end. If we then feed the extracted features to an XGBoost, in the worst case the XGBoost will perform as well as the classifier, and we will still have trained our model end to end prior to freezing the encoder. So theoretically it seems better.

−5
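
My reading of the proposal, as a minimal sketch: train a network end to end, then "freeze the encoder" by reusing its hidden-layer activations as features for a boosted-tree model. To keep it self-contained, scikit-learn's `GradientBoostingClassifier` stands in for XGBoost, an `MLPClassifier` stands in for the end-to-end model, and the data is synthetic; these stand-ins are my assumptions, not the OP's exact setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# 1) Train the network end to end (encoder + classifier head together).
net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
net.fit(X_tr, y_tr)

# 2) "Freeze the encoder": extract the hidden-layer activations as features.
def encode(model, X):
    # Forward pass through the first (hidden) layer only.
    hidden = X @ model.coefs_[0] + model.intercepts_[0]
    return np.maximum(hidden, 0)  # ReLU, MLPClassifier's default activation

# 3) Fit boosted trees on the frozen features (XGBoost stand-in).
gbt = GradientBoostingClassifier(random_state=0)
gbt.fit(encode(net, X_tr), y_tr)

print("NN test accuracy:             ", net.score(X_te, y_te))
print("GBT-on-features test accuracy:", gbt.score(encode(net, X_te), y_te))
```

Whether the tree model actually matches or beats the network's own head on top of the same features is an empirical question, which is what the reply below disputes.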

SlowFourierT198 t1_itbv7y2 wrote

XGBoost is not strictly better than NNs at classification. I can guarantee you that, as I worked on a classification dataset where a NN performed significantly better than XGBoost. While I am pretty sure your statement holds for small-sample datasets, it will not hold for large datasets with complicated features.

1