Ensembles of geophysical models improve projection accuracy and express uncertainties. We develop a novel data-driven ensembling strategy for combining geophysical models using Bayesian Neural Networks, which infers spatiotemporally varying model weights and bias while accounting for heteroscedastic uncertainties in the observations. This produces more accurate and uncertainty-aware projections without sacrificing interpretability. Applied to the prediction of total column ozone from an ensemble of 15 chemistry-climate models, we find that the Bayesian neural network ensemble (BayNNE) outperforms existing ensembling methods, achieving a 49.4% reduction in RMSE for temporal extrapolation, and a 67.4% reduction in RMSE for polar data voids, compared to a weighted mean. Uncertainty is also well-characterized, with 90.6% of the data points in our extrapolation validation dataset lying within 2 standard deviations and 98.5% within 3 standard deviations.
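The abstract describes combining ensemble members with inferred, spatially and temporally varying weights and an additive bias, with a heteroscedastic observation noise. As a rough illustration of that kind of combination (not the authors' implementation; the function name, softmax weighting, and log-variance parameterization are assumptions for this sketch), a single-point version might look like:

```python
import numpy as np

def combine_ensemble(model_preds, raw_weights, bias, log_var):
    """Hypothetical sketch of a weighted ensemble combination.

    model_preds : (K,) predictions from K chemistry-climate models at one point
    raw_weights : (K,) unnormalised model weights produced by a network
    bias        : scalar additive bias inferred at this point
    log_var     : scalar log of the heteroscedastic observation variance
    """
    # A softmax keeps the model weights positive and summing to one,
    # which is one way to preserve interpretability of the combination.
    w = np.exp(raw_weights - raw_weights.max())
    w /= w.sum()
    mean = float(w @ model_preds + bias)
    std = float(np.exp(0.5 * log_var))
    return mean, std

# Toy example with two "models" predicting total column ozone (Dobson units):
mean, std = combine_ensemble(
    model_preds=np.array([300.0, 320.0]),
    raw_weights=np.array([0.0, 0.0]),  # equal raw weights -> simple average
    bias=-5.0,
    log_var=np.log(4.0),               # variance 4 -> standard deviation 2
)
print(mean, std)  # 305.0 2.0
```

In the paper's setting the weights, bias, and noise level would be outputs of a Bayesian neural network evaluated at each space-time location, so the predictive standard deviation also reflects parameter uncertainty rather than only the observation noise shown here.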
Author Information
Ushnish Sengupta (University of Cambridge)
I did my bachelor's in Mechanical Engineering at the Indian Institute of Technology, Kharagpur, and my master's in Computational Science at RWTH Aachen University, Germany. As a bachelor's student, I worked on the computational modeling of compartment fires and microcombustors. In my master's thesis, I analyzed data from molecular dynamics simulations, focusing on the automatic generation of hidden Markov models to help computational scientists effortlessly derive a simple, concise "states and rates" picture from the massive amounts of data they generate. I am a Marie Sklodowska-Curie Early Stage Researcher in the MAGISTER consortium, which seeks to use machine learning to understand and predict thermoacoustic oscillations in rocket engines, aircraft engines, and gas turbines. My job, as I see it, is to serve as a liaison between the probabilistic machine learning group led by my PhD supervisor, Professor Carl Rasmussen, and the flow instability and adjoint optimization group led by my advisor, Professor Matthew Juniper. We are currently examining data from small-scale combustors in our lab, as well as large-scale combustor data shared by collaborators at the German Aerospace Center (DLR) Lampoldshausen, Rolls-Royce Aircraft Engines, and General Electric, to explore how ML techniques can use these data to enable both better designs and safer operation of these machines. I am interested in high-stakes applications of probabilistic machine learning. The consequences of failure for an ML algorithm that monitors the sensors of an aircraft engine are very different from those for one that recognizes faces in social media photos or recommends music. These critical applications place a high premium on principled uncertainty estimates, applicability to limited datasets, and interpretability: qualities that probabilistic machine learning techniques such as Gaussian processes can offer.
Marrying these purely data-driven techniques with physical modeling, for more robust predictions and sensible extrapolations, is also something that intrigues me.
Matt Amos (Lancaster University)
Scott Hosking (British Antarctic Survey)
Carl Edward Rasmussen (University of Cambridge)
Matthew Juniper (University of Cambridge)
Paul Young (Lancaster University)
More from the Same Authors
- 2022: Identifying latent climate signals using sparse hierarchical Gaussian processes (Matt Amos · Thomas Pinder · Paul Young)
- 2022: Ice Core Dating using Probabilistic Programming (Aditya Ravuri · Tom Andersson · Ieva Kazlauskaite · William Tebbutt · Richard Turner · Scott Hosking · Neil Lawrence · Markus Kaiser)
- 2022: Active Learning with Convolutional Gaussian Neural Processes for Environmental Sensor Placement (Tom Andersson · Wessel Bruinsma · Efstratios Markou · Daniel C. Jones · Scott Hosking · James Requeima · Anna Vaughan · Anna-Louise Ellis · Matthew Lazzara · Richard Turner)
- 2022: Multi-fidelity experimental design for ice-sheet simulation (Pierre Thodoroff · Markus Kaiser · Rosie Williams · Robert Arthern · Scott Hosking · Neil Lawrence · Ieva Kazlauskaite)
- 2022: Gaussian Process parameterized Covariance Kernels for Non-stationary Regression (Vidhi Lalchand · Talay Cheema · Laurence Aitchison · Carl Edward Rasmussen)
- 2022: Bayesian parameter inference of a vortically perturbed flame model for the prediction of thermoacoustic instability (Max Croci · Joel Vasanth · Ushnish Sengupta · Ekrem Ekici · Matthew Juniper)
- 2022 Poster: Sparse Gaussian Process Hyperparameters: Optimize or Integrate? (Vidhi Lalchand · Wessel Bruinsma · David Burt · Carl Edward Rasmussen)
- 2021 Poster: Kernel Identification Through Transformers (Fergus Simpson · Ian Davies · Vidhi Lalchand · Alessandro Vullo · Nicolas Durrande · Carl Edward Rasmussen)
- 2021 Poster: Marginalised Gaussian Processes with Nested Sampling (Fergus Simpson · Vidhi Lalchand · Carl Edward Rasmussen)
- 2020: Combining variational autoencoder representations with structural descriptors improves prediction of docking scores (Miguel Garcia-Ortegon · Carl Edward Rasmussen · Hiroshi Kajino)
- 2017 Poster: Convolutional Gaussian Processes (Mark van der Wilk · Carl Edward Rasmussen · James Hensman)
- 2017 Oral: Convolutional Gaussian Processes (Mark van der Wilk · Carl Edward Rasmussen · James Hensman)
- 2017 Poster: Data-Efficient Reinforcement Learning in Continuous State-Action Gaussian-POMDPs (Rowan McAllister · Carl Edward Rasmussen)
- 2016 Poster: Understanding Probabilistic Sparse Gaussian Process Approximations (Matthias Bauer · Mark van der Wilk · Carl Edward Rasmussen)
- 2014 Poster: Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models (Yarin Gal · Mark van der Wilk · Carl Edward Rasmussen)
- 2014 Poster: Variational Gaussian Process State-Space Models (Roger Frigola · Yutian Chen · Carl Edward Rasmussen)
- 2013 Poster: Bayesian Inference and Learning in Gaussian Process State-Space Models with Particle MCMC (Roger Frigola · Fredrik Lindsten · Thomas Schön · Carl Edward Rasmussen)
- 2012 Poster: Active Learning of Model Evidence Using Bayesian Quadrature (Michael A Osborne · David Duvenaud · Roman Garnett · Carl Edward Rasmussen · Stephen J Roberts · Zoubin Ghahramani)
- 2011 Poster: Gaussian Process Training with Input Noise (Andrew McHutchon · Carl Edward Rasmussen)
- 2011 Poster: Additive Gaussian Processes (David Duvenaud · Hannes Nickisch · Carl Edward Rasmussen)
- 2009 Workshop: Probabilistic Approaches for Control and Robotics (Marc Deisenroth · Hilbert J Kappen · Emo Todorov · Duy Nguyen-Tuong · Carl Edward Rasmussen · Jan Peters)
- 2006 Tutorial: Advances in Gaussian Processes (Carl Edward Rasmussen)