Poster
Ensembling geophysical models with Bayesian Neural Networks
Ushnish Sengupta · Matt Amos · Scott Hosking · Carl Edward Rasmussen · Matthew Juniper · Paul Young

Wed Dec 09 09:00 AM -- 11:00 AM (PST) @ Poster Session 3 #966

Ensembles of geophysical models improve projection accuracy and express uncertainties. We develop a novel data-driven ensembling strategy for combining geophysical models using Bayesian Neural Networks, which infers spatiotemporally varying model weights and bias while accounting for heteroscedastic uncertainties in the observations. This produces more accurate and uncertainty-aware projections without sacrificing interpretability. Applied to the prediction of total column ozone from an ensemble of 15 chemistry-climate models, we find that the Bayesian neural network ensemble (BayNNE) outperforms existing ensembling methods, achieving a 49.4% reduction in RMSE for temporal extrapolation, and a 67.4% reduction in RMSE for polar data voids, compared to a weighted mean. Uncertainty is also well-characterized, with 90.6% of the data points in our extrapolation validation dataset lying within 2 standard deviations and 98.5% within 3 standard deviations.
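To illustrate the kind of combination the abstract describes, here is a minimal numpy sketch of how a trained BayNNE-style ensemble might form predictions: Monte Carlo samples from a Bayesian network provide per-location model weights (via a softmax), an additive bias, and a heteroscedastic noise scale, which are combined with the ensemble members' outputs. All array names, shapes, and the random placeholder values standing in for real model outputs and BNN samples are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    """Numerically stable softmax along the given axis."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical setup: M ensemble members predicting at N space-time points,
# and S Monte Carlo samples of the Bayesian network's outputs.
M, N, S = 15, 100, 50
model_preds = rng.normal(300.0, 20.0, size=(M, N))  # placeholder member outputs

# Each BNN sample yields, per location: logits for the model weights,
# an additive bias, and a heteroscedastic observation-noise scale.
logits = rng.normal(size=(S, N, M))
bias = rng.normal(0.0, 1.0, size=(S, N))
noise_scale = np.exp(rng.normal(0.0, 0.1, size=(S, N)))

# Softmax keeps the weights positive and summing to one, so each sample's
# combination sum_m w_m * pred_m + bias stays interpretable.
weights = softmax(logits, axis=-1)                       # (S, N, M)
mean_per_sample = np.einsum('snm,mn->sn', weights, model_preds) + bias

# Predictive mean, and total std from epistemic spread across samples
# plus the mean aleatoric (observation-noise) variance.
pred_mean = mean_per_sample.mean(axis=0)
epistemic_var = mean_per_sample.var(axis=0)
aleatoric_var = (noise_scale ** 2).mean(axis=0)
pred_std = np.sqrt(epistemic_var + aleatoric_var)
```

Averaging the weighted means over Monte Carlo samples gives the projection, while the two variance terms separate disagreement between weight samples from observation noise, which is what makes the uncertainty estimates interpretable.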

Author Information

Ushnish Sengupta (University of Cambridge)

I completed my bachelor's degree in Mechanical Engineering at the Indian Institute of Technology, Kharagpur, and my master's in computational science at RWTH Aachen University, Germany. As an undergraduate, I worked on the computational modeling of compartment fires and microcombustors. In my master's thesis, I analyzed data from molecular dynamics simulations, focusing on the automatic generation of hidden Markov models to help computational scientists derive a simple, concise "states and rates" picture from the massive amounts of data they generate. I am a Marie Sklodowska-Curie Early Stage Researcher in the MAGISTER consortium, which seeks to use machine learning to understand and predict thermoacoustic oscillations in rocket engines, aircraft engines, and gas turbines. My job, as I see it, is to serve as a liaison between the probabilistic machine learning group led by my PhD supervisor, Professor Carl Rasmussen, and the flow instability and adjoint optimization group led by my advisor, Professor Matthew Juniper. We are currently studying data from small-scale combustors in our lab, as well as large-scale combustor data shared by collaborators at the German Aerospace Center (DLR) Lampoldshausen, Rolls-Royce Aircraft Engines, and General Electric, to explore how ML techniques can enable both better designs and safer operation of these machines. I am interested in high-stakes applications of probabilistic machine learning. The consequences of failure for an ML algorithm that monitors the sensors of an aircraft engine are very different from those for one that recognizes faces in social media photos or recommends music. Such critical applications place a high premium on principled uncertainty estimates, applicability to limited datasets, and interpretability: qualities that probabilistic techniques like Gaussian processes can offer. Marrying these purely data-driven techniques with physical modeling, for more robust predictions and sensible extrapolations, also intrigues me.

Matt Amos (Lancaster University)
Scott Hosking (British Antarctic Survey)
Carl Edward Rasmussen (University of Cambridge)
Matthew Juniper (University of Cambridge)
Paul Young (Lancaster University)