Invited Talk
in
Competition: FAIR Universe – The Challenge of Handling Uncertainties in Fundamental Science
1st Place Competition Milestone (Ensembles and Uncertainty Quantification)
Ibrahim Elsharkawy
This talk will cover two projects. The first shows how ensembles of normalizing flows can be used to build classifiers robust to systematic uncertainties, enabling simple uncertainty quantification for a given problem and strong performance on the Higgs Uncertainty Challenge. I will introduce a novel normalizing-flow loss, together with a procedure for training and building an analysis pipeline that accurately estimates both a desired measurement (for example, the signal fraction) and its 68% confidence interval.
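As a rough illustration of the ensemble idea (not the talk's actual method), the sketch below replaces the normalizing flows with simple Gaussian density estimators: each ensemble member fits class-conditional densities to bootstrap resamples of "signal" and "background" data and classifies via the likelihood ratio; the spread of predictions across members gives a cheap uncertainty estimate. All names and data here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a density-based classifier: each ensemble member fits
# 1-D Gaussian densities to signal and background samples (a real pipeline
# would use normalizing flows), then scores a point x via the likelihood
# ratio p_sig(x) / (p_sig(x) + p_bkg(x)).
def fit_member(sig, bkg):
    mu_s, sd_s = sig.mean(), sig.std()
    mu_b, sd_b = bkg.mean(), bkg.std()

    def predict(x):
        log_ps = -0.5 * ((x - mu_s) / sd_s) ** 2 - np.log(sd_s)
        log_pb = -0.5 * ((x - mu_b) / sd_b) ** 2 - np.log(sd_b)
        return 1.0 / (1.0 + np.exp(log_pb - log_ps))

    return predict

# Synthetic signal/background samples; each member trains on a bootstrap
# resample so the members disagree slightly.
sig = rng.normal(1.0, 1.0, 500)
bkg = rng.normal(-1.0, 1.0, 500)
members = [
    fit_member(rng.choice(sig, sig.size), rng.choice(bkg, bkg.size))
    for _ in range(10)
]

# Ensemble mean is the classifier output; ensemble spread is a simple
# per-point uncertainty estimate.
x = np.linspace(-3, 3, 7)
preds = np.stack([m(x) for m in members])   # shape: (n_members, n_points)
mean, spread = preds.mean(axis=0), preds.std(axis=0)
print(mean.round(3), spread.round(3))
```

The same averaging-and-spread pattern applies unchanged when the Gaussians are swapped for trained flows; only `fit_member` changes.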
The second project takes a different perspective: we tackle computing ensemble uncertainty directly, i.e. quantifying the variance of an ensemble of Deep Neural Networks (DNNs). We propose, in conjunction with infinite-width networks, to exploit scaling laws, an apparently ubiquitous phenomenon in deep learning in which the test loss of a DNN follows a power law in training-set size (with a task-dependent scaling exponent). We find a potential "invariant" in neural network ensemble statistics, in both infinite-width and finite-width networks, which may be directly useful for uncertainty quantification.
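To make the power-law claim concrete, a minimal sketch: if the test loss behaves as L(n) ≈ a · n^(−α) for training-set size n, then log L is linear in log n and the scaling exponent α can be read off with a least-squares fit. The data below are synthetic (a = 2, α = 0.5), standing in for measured ensemble losses; this is an illustration of the fitting procedure, not the talk's analysis.

```python
import numpy as np

# Synthetic test losses obeying an exact power law L(n) = a * n**(-alpha),
# with a = 2 and alpha = 0.5, at five training-set sizes.
n = np.array([1e3, 3e3, 1e4, 3e4, 1e5])
loss = 2.0 * n ** -0.5

# In log-log coordinates the power law is a straight line:
#   log L = log a - alpha * log n,
# so a degree-1 polynomial fit recovers both parameters.
slope, intercept = np.polyfit(np.log(n), np.log(loss), 1)
alpha, a = -slope, np.exp(intercept)
print(f"alpha ~ {alpha:.3f}, a ~ {a:.3f}")
```

With noisy measured losses the same fit applies, though one would then also propagate the fit's covariance into the uncertainty on α.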