SlowFourierT198 t1_j02nin5 wrote

Depending on the problem, you may use Bayesian neural networks, where you fit a distribution over the weights; they are better calibrated but also expensive. There is some theory on lower-cost ways to make a model better calibrated / uncertainty-aware: one direction is Gaussian process approximations, another is, for example, PostNet. The overall topic to search for is uncertainty quantification.
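To make this concrete, here is a minimal sketch of one cheap uncertainty-quantification technique: a deep ensemble (not one of the methods named above, but a common low-cost baseline in this area). Several small models are trained on bootstrap resamples, and disagreement across members serves as an uncertainty proxy. Everything here (the logistic-regression "members", the toy data, the function names) is illustrative, not a reference implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_member(X, y, lr=0.1, epochs=200):
    """One ensemble member: logistic regression fit by gradient descent."""
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)
        w -= lr * (X.T @ (p - y)) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

# Toy 2D data: two Gaussian blobs (illustrative only).
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
y = np.concatenate([np.zeros(100), np.ones(100)])

# Train an ensemble, each member on a bootstrap resample of the data.
ensemble = []
for _ in range(10):
    idx = rng.integers(0, len(y), len(y))
    ensemble.append(train_member(X[idx], y[idx]))

def predict_with_uncertainty(x):
    """Mean prediction, plus std across members as an uncertainty proxy."""
    preds = np.array([sigmoid(x @ w + b) for w, b in ensemble])
    return preds.mean(), preds.std()

# A point deep inside a cluster should get a confident prediction;
# a point between the clusters should sit near 0.5.
mean_far, std_far = predict_with_uncertainty(np.array([2.0, 2.0]))
mean_mid, std_mid = predict_with_uncertainty(np.array([0.0, 0.0]))
```

A full Bayesian NN would instead place a distribution over every weight; the ensemble trades that rigor for a few independent training runs.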

SlowFourierT198 t1_itbv7y2 wrote

XGBoost is not strictly better than NNs in classification. I can guarantee you this, as I worked on a classification dataset where a NN performed significantly better than XGBoost. While your statement is probably correct for small-sample datasets, I am pretty sure it will not hold for large datasets with complicated features.