Does a deployed model have epistemic or aleatoric uncertainty?
Aleatoric uncertainty refers to the randomness in the outcome of an experiment that is due to inherently random effects.
Epistemic uncertainty refers to the ignorance of the decision-maker, due, for example, to a lack of data.
Aleatoric uncertainty is irreducible, while epistemic uncertainty can be reduced, for example by collecting more data.
When we deploy an ML model in production, can we distinguish between epistemic and aleatoric uncertainty?
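To make the question concrete, here is a minimal sketch of one common approach, the entropy-based decomposition over an ensemble: ensemble disagreement is read as epistemic uncertainty and the average per-member entropy as aleatoric uncertainty. It assumes an already-trained ensemble of classifiers; the function name `uncertainty_decomposition` and the array shapes are hypothetical.

```python
import numpy as np

def entropy(p, axis=-1, eps=1e-12):
    """Shannon entropy of a categorical distribution."""
    return -np.sum(p * np.log(p + eps), axis=axis)

def uncertainty_decomposition(member_probs):
    """Decompose predictive uncertainty for an ensemble (hypothetical setup).

    member_probs: array of shape (K, N, C) -- class probabilities from
    K ensemble members for N inputs and C classes.

    Returns per-input (total, aleatoric, epistemic), where
      total     = H[ mean_k p_k(y|x) ]   (predictive entropy)
      aleatoric = mean_k H[ p_k(y|x) ]   (expected entropy)
      epistemic = total - aleatoric      (mutual information / disagreement)
    """
    mean_probs = member_probs.mean(axis=0)            # (N, C)
    total = entropy(mean_probs)                       # entropy of the averaged prediction
    aleatoric = entropy(member_probs).mean(axis=0)    # average entropy of each member
    epistemic = total - aleatoric                     # what remains is disagreement
    return total, aleatoric, epistemic

# Toy usage: 5 ensemble members, 3 inputs, 4 classes
rng = np.random.default_rng(0)
logits = rng.normal(size=(5, 3, 4))
probs = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)
total, aleatoric, epistemic = uncertainty_decomposition(probs)
print(total, aleatoric, epistemic)
```

In this reading, inputs with high epistemic values are those where the members disagree (often out-of-distribution cases that more data could help with), while high aleatoric values mark inputs that every member already finds inherently ambiguous.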
Topic: uncertainty, statistics, machine-learning
Category: Data Science