Poster
Efficient Nonmyopic Bayesian Optimization via One-Shot Multi-Step Trees
Shali Jiang · Daniel Jiang · Maximilian Balandat · Brian Karrer · Jacob Gardner · Roman Garnett

Tue Dec 08 09:00 PM -- 11:00 PM (PST) @ Poster Session 2 #663

Bayesian optimization is a sequential decision making framework for optimizing expensive-to-evaluate black-box functions. Computing a full lookahead policy amounts to solving a highly intractable stochastic dynamic program. Myopic approaches, such as expected improvement, are often adopted in practice, but they ignore the long-term impact of the immediate decision. Existing nonmyopic approaches are mostly heuristic and/or computationally expensive. In this paper, we provide the first efficient implementation of general multi-step lookahead Bayesian optimization, formulated as a sequence of nested optimization problems within a multi-step scenario tree. Instead of solving these problems in a nested way, we equivalently optimize all decision variables in the full tree jointly, in a "one-shot" fashion. Combining this with an efficient method for implementing multi-step Gaussian process "fantasization," we demonstrate that multi-step expected improvement is computationally tractable and exhibits performance superior to existing methods on a wide range of benchmarks.
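The core "one-shot" idea described above — replacing a sequence of nested optimization problems with a single joint optimization over all decision variables in the tree — can be illustrated with a toy two-step problem. The objective below is a hypothetical stand-in, not the paper's actual multi-step expected improvement; it only shows that the nested and joint formulations attain the same optimum.

```python
import numpy as np

# Toy illustration of the "one-shot" trick (hypothetical objective, not the
# paper's acquisition function): a two-step value
#     V(x) = r(x) + max_y q(x, y)
# can be maximized either by nested optimization (an inner max over y for
# each candidate x) or jointly over the full decision vector (x, y).

def r(x):
    return -(x - 1.0) ** 2           # immediate reward at step 1

def q(x, y):
    return -(y - x) ** 2 + 0.5 * x   # downstream reward at step 2

xs = np.linspace(-2.0, 3.0, 501)
ys = np.linspace(-2.0, 3.0, 501)

# Nested: for each x, solve the inner problem, then take the outer max.
nested_opt = max(r(x) + max(q(x, y) for y in ys) for x in xs)

# One-shot: optimize (x, y) jointly over the product space.
X, Y = np.meshgrid(xs, ys, indexing="ij")
one_shot_opt = (r(X) + q(X, Y)).max()

# The two formulations agree at the optimum.
assert abs(nested_opt - one_shot_opt) < 1e-9
```

In the paper's setting, the inner decision variables correspond to fantasy candidates at each node of the multi-step scenario tree, so joint optimization avoids repeatedly solving the inner problems to convergence.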

Author Information

Shali Jiang (Facebook)
Daniel Jiang (Facebook)
Max Balandat (Facebook)
Brian Karrer (Facebook)
Jacob Gardner (University of Pennsylvania)
Roman Garnett (Washington University in St. Louis)
