

Tutorial

Advances in Bayesian Optimization

Janardhan Rao Doppa · Virginia Aglietti · Jacob Gardner

Virtual

Abstract:

Many engineering, scientific, and industrial applications, including automated machine learning (e.g., hyper-parameter tuning), involve making design choices to optimize one or more expensive-to-evaluate objectives. Some examples include tuning the knobs of a compiler to optimize the performance and efficiency of a set of software programs; designing new materials to optimize strength, elasticity, and durability; and designing hardware to optimize performance, power, and area. Bayesian Optimization (BO) is an effective framework for solving black-box optimization problems with expensive function evaluations. The key idea behind BO is to build a cheap surrogate model (e.g., a Gaussian Process) from the real experimental data, and to employ it to intelligently select the sequence of function evaluations using an acquisition function, e.g., expected improvement (EI).
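As a rough illustration of the loop described above, the following is a minimal sketch of GP-based BO with the expected improvement acquisition function, written with the BoTorch library mentioned later in this abstract. The toy objective, bounds, and evaluation budget are placeholders for illustration only and are not part of the tutorial material.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from gpytorch.mlls import ExactMarginalLogLikelihood
from botorch.acquisition import ExpectedImprovement
from botorch.optim import optimize_acqf

# Hypothetical expensive black-box objective (here a cheap 1-D stand-in).
def objective(x: torch.Tensor) -> torch.Tensor:
    return -(x - 0.3) ** 2

bounds = torch.tensor([[0.0], [1.0]], dtype=torch.double)  # search space [0, 1]
train_x = torch.rand(5, 1, dtype=torch.double)              # initial design
train_y = objective(train_x)

for _ in range(10):  # placeholder evaluation budget
    # Fit the cheap GP surrogate to the data collected so far.
    model = SingleTaskGP(train_x, train_y)
    mll = ExactMarginalLogLikelihood(model.likelihood, model)
    fit_gpytorch_mll(mll)

    # Use the EI acquisition function to pick the next evaluation point.
    acqf = ExpectedImprovement(model, best_f=train_y.max())
    candidate, _ = optimize_acqf(
        acqf, bounds=bounds, q=1, num_restarts=5, raw_samples=32
    )

    # Evaluate the true objective and augment the data set.
    train_x = torch.cat([train_x, candidate])
    train_y = torch.cat([train_y, objective(candidate)])

best_x = train_x[train_y.argmax()]
```

This sketch only covers the vanilla continuous, single-objective setting; the tutorial topics below (discrete and hybrid spaces, high-dimensional inputs, causal BO) address settings where this basic loop is no longer sufficient.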

The goal of this tutorial is to present recent advances in BO by focusing on challenges, principles, algorithmic ideas and their connections, and important real-world applications. Specifically, we will cover recent work on acquisition functions, BO methods for discrete and hybrid spaces, BO methods for high-dimensional input spaces, causal BO, and key innovations in the BoTorch toolbox, along with a hands-on demonstration.
