Submitted by Alex-S-S t3_zh69o0 in MachineLearning
Unlikely-Video-663 t1_izktnhx wrote
You might be able to recast the problem by assuming the labels are actually drawn from some distribution, putting a simple likelihood function over it, and then learning the parameters of that distribution. This isn't fully theoretically sound: you won't capture any epistemic uncertainty, but you will capture most of the aleatoric uncertainty, so depending on your use case it might work.
In practice, use for example a Gaussian likelihood and, by training with a Gaussian NLL loss, learn the variance alongside the mean. As long as your samples stay within the training distribution, this can work okay-ish.
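The recipe above can be sketched with a minimal toy example. This is an illustration I'm adding, not code from the thread: it fits the mean and (log-)variance of a single Gaussian to noisy labels by gradient descent on the Gaussian negative log-likelihood, which is the same objective a network head trained with a Gaussian NLL loss optimizes per input.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: labels drawn from N(2.0, 0.5^2), i.e. aleatoric noise only.
y = rng.normal(loc=2.0, scale=0.5, size=1000)

# Learn mu and log_var by gradient descent on the Gaussian NLL
# (up to an additive constant):
#   NLL = 0.5 * (log_var + (y - mu)^2 / exp(log_var))
mu, log_var = 0.0, 0.0
lr = 0.1
for _ in range(500):
    var = np.exp(log_var)
    grad_mu = np.mean(-(y - mu) / var)
    grad_log_var = np.mean(0.5 * (1.0 - (y - mu) ** 2 / var))
    mu -= lr * grad_mu
    log_var -= lr * grad_log_var

print(mu, np.sqrt(np.exp(log_var)))  # converges near mu ≈ 2.0, std ≈ 0.5
```

Parameterizing the log of the variance keeps the predicted variance positive without constraints; in a real model you would output `mu` and `log_var` (or a softplus-transformed variance) per input and use a built-in Gaussian NLL loss from your framework instead of hand-written gradients.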
Otherwise, there are plenty of recalibration techniques to get better results.
Equivalent-Way3 t1_izkudc7 wrote
> In practice, use for example a Gaussian likelihood, learn wicht GauddianNLL Loss also the variance. Ax long ad you stay eithin distri yadaya this can work okish ..
You ok?