Poster
HyperDDPM: Estimating Epistemic and Aleatoric Uncertainty with a Single Model
Matthew Chan · Maria Molina · Chris Metzler
East Exhibit Hall A-C #4102
Estimating and disentangling epistemic uncertainty (uncertainty that can be reduced with more training data) and aleatoric uncertainty (uncertainty inherent to the task at hand) is critically important when applying machine learning to high-stakes applications such as medical imaging and weather forecasting. Conditional diffusion models' breakthrough ability to accurately and efficiently sample from the posterior distribution of a dataset now makes uncertainty estimation conceptually straightforward: one need only train and sample from a large ensemble of diffusion models. Unfortunately, training such an ensemble becomes computationally intractable as the complexity of the model architecture grows. In this work we introduce a new approach to ensembling, HyperDDPM, which allows one to accurately estimate epistemic and aleatoric uncertainty with a single model. Unlike existing single-model uncertainty methods such as Monte Carlo dropout and Bayesian neural networks, HyperDDPM offers prediction accuracy on par with, and in some cases superior to, multi-model ensembles. Furthermore, our proposed approach scales to modern network architectures such as Attention U-Net and yields more accurate uncertainty estimates than existing methods. We validate our method on two distinct real-world tasks: X-ray computed tomography reconstruction and weather temperature forecasting.
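The ensemble-based recipe the abstract alludes to is commonly implemented via the law of total variance: epistemic uncertainty is the variance across ensemble members' posterior means, and aleatoric uncertainty is the average within-member posterior variance. The sketch below illustrates that standard decomposition on synthetic per-pixel samples; it is not the paper's HyperDDPM implementation, and the array names and shapes are illustrative assumptions.

```python
import numpy as np

# Hypothetical sample tensor: samples[m, n] is the n-th posterior sample
# drawn from ensemble member m, for a single scalar output (e.g. one pixel).
rng = np.random.default_rng(0)
M, N = 8, 64  # number of ensemble members, samples per member (assumed)
samples = rng.normal(loc=rng.normal(size=(M, 1)), scale=1.0, size=(M, N))

# Law-of-total-variance decomposition over the ensemble:
member_means = samples.mean(axis=1)  # E[y | member m]
member_vars = samples.var(axis=1)    # Var[y | member m]

epistemic = member_means.var()  # spread of the members' mean predictions
aleatoric = member_vars.mean()  # average within-member sample spread
total = samples.var()           # pooled variance = epistemic + aleatoric

print(f"epistemic={epistemic:.4f} aleatoric={aleatoric:.4f} total={total:.4f}")
```

With equal sample counts per member and population variances (`ddof=0`), the pooled variance equals the sum of the two terms exactly, which gives a cheap sanity check on any implementation of this decomposition.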