SlowFourierT198
SlowFourierT198 t1_iyjyu05 wrote
Reply to comment by picardythird in [R] Statistical vs Deep Learning forecasting methods by fedegarzar
By any chance do you have the name or a reference?
SlowFourierT198 t1_itbv7y2 wrote
Reply to comment by MichelMED10 in [D][R] Staking XGBOOST and CNN/Transformer by MichelMED10
XGBoost is not strictly better than NNs for classification. I can guarantee this, as I worked on a classification dataset where a NN performed significantly better than XGBoost. While your statement likely holds for small-sample datasets, I am fairly sure it does not hold for large datasets with complicated features.
SlowFourierT198 t1_j02nin5 wrote
Reply to [P] Are probabilities from multi-label image classification networks calibrated? by alkaway
Depending on the problem, you may use Bayesian Neural Networks, where you fit a distribution over the weights; they tend to be better calibrated but are also expensive. There is some theory on lower-cost ways to make a model better calibrated / uncertainty-aware: one direction uses Gaussian Process approximations, another is for example PostNet. The overall topic to search for is uncertainty quantification.
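Before reaching for any of those methods, it is worth measuring how miscalibrated the network actually is. A common metric for this is the Expected Calibration Error (ECE): bin predictions by confidence and compare each bin's average confidence to its empirical accuracy. A minimal NumPy sketch (the function name and toy data are my own, not from any particular library):

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    # Standard ECE: partition [0, 1] into equal-width confidence bins,
    # then sum |accuracy - mean confidence| per bin, weighted by the
    # fraction of samples falling in that bin.
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            acc = correct[mask].mean()    # empirical accuracy in bin
            conf = confidences[mask].mean()  # average confidence in bin
            ece += mask.mean() * abs(acc - conf)
    return ece

# Toy example: 80% accuracy at 0.8 confidence is perfectly calibrated.
conf = np.array([0.8, 0.8, 0.8, 0.8, 0.8])
corr = np.array([1, 1, 1, 1, 0])
print(expected_calibration_error(conf, corr))  # → 0.0
```

If the ECE is already small, the expensive Bayesian machinery may not buy you much; if it is large, the approaches above (or simple post-hoc recalibration) become worthwhile.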