
On the Effectiveness of Bayesian AutoML methods for Physics Emulators
Peetak Mitra · Niccolo Dal Santo · Majid Haghshenas · Shounak Mitra · Conor Daly · David Schmidt

The adoption of Machine Learning (ML) for building emulators of complex physical processes has risen exponentially in recent years. While ML models are good function approximators, optimizing a model's hyper-parameters to reach a global minimum is not trivial and often requires human knowledge and expertise. In this light, automatic ML (autoML) methods have gained considerable interest, as they automate the process of network hyper-parameter tuning. In addition, Neural Architecture Search (NAS) has shown promising outcomes for improving model performance. While autoML methods have grown in popularity for image, text, and other applications, their effectiveness for high-dimensional, complex scientific datasets remains to be investigated. In this work, a data-driven emulator for turbulence closure terms in the context of Large Eddy Simulation (LES) models is trained using Artificial Neural Networks, and an autoML framework based on Bayesian Optimization is proposed that incorporates priors to jointly optimize the hyper-parameters and conduct a full neural network architecture search, converging toward a global minimum. Additionally, the effects of different network weight initializations and of optimizers such as ADAM, SGDM, and RMSProp are explored. Weight- and function-space similarities along the optimization trajectory are investigated, and critical differences in the evolution of the learning process are noted and compared to theory. We observe that the ADAM optimizer and Glorot initialization consistently perform best, and that RMSProp outperforms SGDM, as the latter appears to become stuck at a local optimum. This autoML BayesOpt framework therefore provides a means to choose the best hyper-parameter settings for a given dataset.
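The Bayesian Optimization loop described above can be sketched in miniature: fit a Gaussian-process surrogate to the losses observed so far, then pick the next hyper-parameter setting by maximizing expected improvement. The sketch below is illustrative only; `objective` is a hypothetical 1-D stand-in for the emulator's validation loss (e.g. as a function of a scaled learning rate), and the RBF length scale, grid, and iteration counts are arbitrary assumptions, not the paper's actual search space or framework.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    # Hypothetical stand-in for validation loss vs. a scaled hyper-parameter;
    # the real emulator loss from the paper is not reproduced here.
    return np.sin(3.0 * x) + 0.5 * (x - 0.5) ** 2

def rbf(a, b, length=0.3):
    # Squared-exponential kernel between two 1-D point sets
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def gp_posterior(xt, yt, xq, noise=1e-6):
    # Standard GP regression posterior (mean and variance) at query points xq
    K = rbf(xt, xt) + noise * np.eye(len(xt))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, yt))
    Ks = rbf(xt, xq)
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = 1.0 - np.sum(v ** 2, axis=0)  # rbf(x, x) = 1 on the diagonal
    return mu, np.maximum(var, 1e-12)

def expected_improvement(mu, var, best):
    # EI for minimization: (best - mu) * Phi(z) + sigma * phi(z)
    sigma = np.sqrt(var)
    z = (best - mu) / sigma
    Phi = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2.0)))
    phi = np.exp(-0.5 * z ** 2) / math.sqrt(2.0 * math.pi)
    return (best - mu) * Phi + sigma * phi

# Candidate grid over the (scaled) hyper-parameter
grid = np.linspace(-1.0, 2.0, 200)

# Seed with 3 random evaluations, then run 10 BayesOpt iterations
xt = rng.uniform(-1.0, 2.0, size=3)
yt = objective(xt)
for _ in range(10):
    mu, var = gp_posterior(xt, yt, grid)
    x_next = grid[np.argmax(expected_improvement(mu, var, yt.min()))]
    xt = np.append(xt, x_next)
    yt = np.append(yt, objective(x_next))

print(f"best loss found: {yt.min():.4f} at x = {xt[np.argmin(yt)]:.3f}")
```

In the paper's setting the search space is of course multi-dimensional (layer widths, depth, learning rate, etc.) and the objective is a full network training run, but the surrogate-plus-acquisition structure is the same.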

Author Information

Peetak Mitra (Los Alamos National Laboratory)

Computational Physicist at Los Alamos National Laboratory working on advanced machine learning methods for modeling physics problems, including combustion and climate models. Fourth-year PhD student at the University of Massachusetts Amherst and co-founder of the industry-funded ICEnet consortium, which is supported by NVIDIA, MathWorks, SIEMENS, Cummins, Converge, and AVL.

Niccolo Dal Santo (MathWorks, Inc.)
Majid Haghshenas (UMass Amherst)
Shounak Mitra (MathWorks, Inc.)
Conor Daly (MathWorks)
David Schmidt (University of Massachusetts Amherst)

More from the Same Authors

  • 2020 : Machine Learning-based Anomaly Detection with Magnetic Data »
    Peetak Mitra · Denis Akhiyarov · Mauricio Araya-Polo · Daniel Byrd
  • 2022 : ClimFormer - a Spherical Transformer model for long-term climate projections »
    Salva Rühling Cachay · Peetak Mitra · Sookyung Kim · Subhashis Hazarika · Haruki Hirasawa · Dipti Hingmire · Hansi Singh · Kalai Ramea
  • 2022 Workshop: Tackling Climate Change with Machine Learning »
    Peetak Mitra · Maria João Sousa · Mark Roth · Jan Drgona · Emma Strubell · Yoshua Bengio
  • 2019 : Afternoon Coffee Break & Poster Session »
    Heidi Komkov · Stanislav Fort · Zhaoyou Wang · Rose Yu · Ji Hwan Park · Samuel Schoenholz · Taoli Cheng · Ryan-Rhys Griffiths · Chase Shimmin · Surya Karthik Mukkavili · Philippe Schwaller · Christian Knoll · Yangzesheng Sun · Keiichi Kisamori · Gavin Graham · Gavin Portwood · Hsin-Yuan Huang · Paul Novello · Moritz Munchmeyer · Anna Jungbluth · Daniel Levine · Ibrahim Ayed · Steven Atkinson · Jan Hermann · Peter Grönquist · Priyabrata Saha · Yannik Glaser · Lingge Li · Yutaro Iiyama · Rushil Anirudh · Maciej Koch-Janusz · Vikram Sundar · Francois Lanusse · Auralee Edelen · Jonas Köhler · Jacky H. T. Yip · Jiadong Guo · Xiangyang Ju · Adi Hanuka · Adrian Albert · Valentina Salvatelli · Mauro Verzetti · Javier Duarte · Eric Moreno · Emmanuel de Bézenac · Athanasios Vlontzos · Alok Singh · Thomas Klijnsma · Brad Neuberg · Paul Wright · Mustafa Mustafa · David Schmidt · Steven Farrell · Hao Sun