Recent advances have extended the scope of Bayesian optimization (BO) to expensive-to-evaluate black-box functions with dozens of dimensions, aspiring to unlock impactful applications, for example, in the life sciences, neural architecture search, and robotics. However, a closer examination reveals that the state-of-the-art methods for high-dimensional Bayesian optimization (HDBO) suffer from degrading performance as the number of dimensions increases, or even risk failure if certain unverifiable assumptions are not met. This paper proposes BAxUS, which leverages a novel family of nested random subspaces to adapt the space it optimizes over to the problem at hand. This ensures high performance while removing the risk of failure, which we establish via theoretical guarantees. A comprehensive evaluation demonstrates that BAxUS achieves better results than the state-of-the-art methods on a broad set of applications.
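To make the idea of nested random subspaces concrete, the sketch below shows one way a family of sparse random embeddings can be grown by splitting "bins" of input dimensions, so that every point observed in the smaller subspace has an exact counterpart in the larger one and earlier evaluations remain valid. This is a minimal illustrative sketch under assumed details: the class `NestedEmbedding`, the method `increase_dims`, and the specific splitting rule are hypothetical names and simplifications, not the paper's exact algorithm or API.

```python
# Minimal sketch of a nested sparse random embedding (BAxUS-style idea).
# All names and the splitting rule are illustrative assumptions.
import numpy as np

class NestedEmbedding:
    """Sparse embedding: each input dimension is assigned to exactly one
    target (low-dim) dimension with a random sign, so a low-dim point y
    maps to the input space by copying y[bin] * sign per input dim."""

    def __init__(self, input_dim: int, target_dim: int, seed: int = 0):
        self.rng = np.random.default_rng(seed)
        self.input_dim = input_dim
        self.target_dim = target_dim
        # bins[i] = target dimension that input dimension i maps to
        self.bins = self.rng.integers(0, target_dim, size=input_dim)
        self.signs = self.rng.choice([-1.0, 1.0], size=input_dim)

    def project_up(self, y: np.ndarray) -> np.ndarray:
        """Map a low-dimensional point y (shape: target_dim) to the
        high-dimensional input space (shape: input_dim)."""
        return self.signs * y[self.bins]

    def increase_dims(self, new_target_dim: int) -> np.ndarray:
        """Split bins so the enlarged embedding is nested in the old one.
        Returns a matrix that lifts old low-dim points into the new
        low-dim space without changing their high-dim image."""
        assert new_target_dim >= self.target_dim
        lift = np.zeros((new_target_dim, self.target_dim))
        next_free = self.target_dim
        for old_bin in range(self.target_dim):
            lift[old_bin, old_bin] = 1.0  # old coordinate is kept
            members = np.where(self.bins == old_bin)[0]
            # move a random half of the bin's input dims to a fresh target dim
            if next_free < new_target_dim and len(members) > 1:
                moved = self.rng.choice(members, size=len(members) // 2,
                                        replace=False)
                self.bins[moved] = next_free
                lift[next_free, old_bin] = 1.0  # new coord copies old value
                next_free += 1
        self.target_dim = new_target_dim
        return lift

# Usage: start in a 2-D subspace of a 100-D problem, then grow to 4-D.
emb = NestedEmbedding(input_dim=100, target_dim=2, seed=42)
y_old = np.array([0.3, -0.7])
x_before = emb.project_up(y_old)
lift = emb.increase_dims(4)
y_new = lift @ y_old               # same point, expressed in the 4-D subspace
x_after = emb.project_up(y_new)
assert np.allclose(x_before, x_after)  # nestedness: observations carry over
```

The key design point this sketch illustrates is why nesting removes the failure risk of fixed low-dimensional embeddings: the subspace can keep growing toward the full input space, and the `lift` map lets all previously evaluated points be reused in the enlarged subspace instead of being discarded.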
Author Information
Leonard Papenmeier (Lund University)
Luigi Nardi (Lund University and Stanford University)
Matthias Poloczek (Amazon)
Matthias’ research interests lie at the intersection of machine learning and optimization, with a focus on Bayesian methods for "exotic" optimization problems arising in business applications and in the natural sciences. He is a Principal Scientist at Amazon. Previously, Matthias was a Senior Manager at Uber AI, where he founded Uber’s Bayesian optimization team and led the cross-org effort that built a company-wide service to tune ML models at scale. Matthias received his PhD in CS from Goethe University in Frankfurt in 2013 and then worked as a postdoc at Cornell with David Williamson and Peter Frazier from 2014 until 2017. He was an Assistant Professor in the Department of Systems and Industrial Engineering at the University of Arizona from 2017 until 2019.
More from the Same Authors
- 2022: PriorBand: HyperBand + Human Expert Knowledge
  Neeratyoy Mallik · Carl Hvarfner · Danny Stoll · Maciej Janowski · Edward Bergman · Marius Lindauer · Luigi Nardi · Frank Hutter
- 2023 Poster: PriorBand: Practical Hyperparameter Optimization in the Age of Deep Learning
  Neeratyoy Mallik · Carl Hvarfner · Edward Bergman · Danny Stoll · Maciej Janowski · Marius Lindauer · Luigi Nardi · Frank Hutter
- 2023 Poster: Self-Correcting Bayesian Optimization through Bayesian Active Learning
  Carl Hvarfner · Erik Hellsten · Frank Hutter · Luigi Nardi
- 2023 Poster: Bounce: a Reliable Bayesian Optimization Algorithm for Combinatorial and Mixed Spaces
  Leonard Papenmeier · Luigi Nardi · Matthias Poloczek
- 2022 Poster: Joint Entropy Search For Maximally-Informed Bayesian Optimization
  Carl Hvarfner · Frank Hutter · Luigi Nardi
- 2019 Poster: Scalable Global Optimization via Local Bayesian Optimization
  David Eriksson · Michael Pearce · Jacob Gardner · Ryan Turner · Matthias Poloczek
- 2019 Spotlight: Scalable Global Optimization via Local Bayesian Optimization
  David Eriksson · Michael Pearce · Jacob Gardner · Ryan Turner · Matthias Poloczek