Expected improvement and other acquisition functions widely used in Bayesian optimization rely on a "one-step" assumption: they value an objective function evaluation as if no future evaluations will be performed. Because we usually evaluate over multiple steps, this assumption may leave substantial room for improvement. Existing theory provides acquisition functions that look multiple steps into the future, but computing them requires solving a high-dimensional continuous-state, continuous-action Markov decision process (MDP), and fast exact solutions of this MDP remain out of reach of today's methods. As a result, previous two- and multi-step lookahead Bayesian optimization algorithms are either too expensive for most practical settings or rely on heuristics that may fail to fully realize the promise of two-step lookahead. This paper proposes a computationally efficient algorithm that provides an accurate solution to the two-step lookahead Bayesian optimization problem in seconds to at most several minutes of computation per batch of evaluations. In both single-threaded and batch experiments, the resulting acquisition function provides greater query efficiency and robustness than previous two- and multi-step lookahead methods, unlocking the value of two-step lookahead in practice. We demonstrate our algorithm with extensive experiments on synthetic test functions and real-world problems.
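To make the quantity at issue concrete: writing EI_n for expected improvement under the posterior after n observations (maximization convention) and f_n* for the best value observed so far, the two-step value of sampling at x is commonly written as TS_n(x) = EI_n(x) + E_n[ max_{x'} EI_{n+1}(x') ], where EI_{n+1} is computed under the posterior updated with the (random) observation at x. Below is a minimal, illustrative Monte Carlo sketch of this quantity, not the paper's algorithm; this naive nested estimate is exactly the expensive computation whose cost the paper's method is designed to avoid. The function names, the finite candidate set `X_cand` standing in for the inner maximization, and the fantasy count `n_fantasies` are all hypothetical choices made here for illustration.

```python
# Illustrative sketch only: a naive Monte Carlo estimate of the two-step
# lookahead value TS_n(x) = EI_n(x) + E_n[max_{x'} EI_{n+1}(x')],
# using scikit-learn's Gaussian process regressor. Not the paper's method.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def expected_improvement(gpr, X_cand, best):
    """One-step EI for maximization under the GP posterior."""
    mu, sigma = gpr.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-12)  # guard against zero posterior variance
    z = (mu - best) / sigma
    return (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

def two_step_value(gpr, x, X_train, y_train, X_cand, n_fantasies=16, seed=0):
    """Monte Carlo estimate of EI_n(x) + E_n[max_{x'} EI_{n+1}(x')]."""
    best = y_train.max()
    x = np.atleast_2d(x)
    ei_now = expected_improvement(gpr, x, best)[0]
    # Fantasize outcomes y ~ posterior at x; under each fantasy, refit the
    # GP with hyperparameters held fixed (optimizer=None) and take the best
    # one-step EI over a finite candidate grid, a crude stand-in for the
    # continuous inner maximization.
    y_fant = gpr.sample_y(x, n_samples=n_fantasies, random_state=seed).ravel()
    future = []
    for y_f in y_fant:
        gpr_f = GaussianProcessRegressor(kernel=gpr.kernel_, optimizer=None)
        gpr_f.fit(np.vstack([X_train, x]), np.append(y_train, y_f))
        future.append(expected_improvement(gpr_f, X_cand, max(best, y_f)).max())
    return ei_now + np.mean(future)
```

Even in this toy form the cost structure is visible: each candidate x requires `n_fantasies` posterior updates and inner maximizations, which is what makes direct optimization of the two-step acquisition expensive without the kind of efficient solution the paper proposes.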
Author Information
Jian Wu (Cornell University)
Peter Frazier (Cornell / Uber)
Peter Frazier is an Associate Professor in the School of Operations Research and Information Engineering at Cornell University, and a Staff Data Scientist at Uber. He received a Ph.D. in Operations Research and Financial Engineering from Princeton University in 2009. His research is at the intersection of machine learning and operations research, focusing on Bayesian optimization, multi-armed bandits, active learning, and Bayesian nonparametric statistics. He is an associate editor for Operations Research, ACM TOMACS, and IISE Transactions, and is the recipient of an AFOSR Young Investigator Award and an NSF CAREER Award.
More from the Same Authors
- 2021 Poster: Constrained Two-step Look-Ahead Bayesian Optimization
  Yunxiang Zhang · Xiangyu Zhang · Peter Frazier
- 2021 Poster: Multi-Step Budgeted Bayesian Optimization with Unknown Evaluation Costs
  Raul Astudillo · Daniel Jiang · Maximilian Balandat · Eytan Bakshy · Peter Frazier
- 2021 Poster: Bayesian Optimization of Function Networks
  Raul Astudillo · Peter Frazier
- 2020 Poster: Bayesian Optimization of Risk Measures
  Sait Cakmak · Raul Astudillo · Peter Frazier · Enlu Zhou
- 2017 Invited Talk: Knowledge Gradient Methods for Bayesian Optimization
  Peter Frazier
- 2017 Poster: Multi-Information Source Optimization
  Matthias Poloczek · Jialei Wang · Peter Frazier
- 2017 Spotlight: Multi-Information Source Optimization
  Matthias Poloczek · Jialei Wang · Peter Frazier
- 2017 Poster: Bayesian Optimization with Gradients
  Jian Wu · Matthias Poloczek · Andrew Wilson · Peter Frazier
- 2017 Oral: Bayesian Optimization with Gradients
  Jian Wu · Matthias Poloczek · Andrew Wilson · Peter Frazier
- 2016 Poster: The Parallel Knowledge Gradient Method for Batch Bayesian Optimization
  Jian Wu · Peter Frazier