Despite the recent progress in hyperparameter optimization (HPO), available benchmarks that resemble real-world scenarios consist of only a few, very large problem instances that are expensive to solve. This blocks researchers and practitioners not only from systematically running the large-scale comparisons needed to draw statistically significant conclusions, but also from reproducing experiments that were conducted before. This work proposes a method to alleviate these issues by means of a meta-surrogate model for HPO tasks trained on offline-generated data. The model combines a probabilistic encoder with a multi-task model so that it can generate inexpensive and realistic tasks for the class of problems of interest. We demonstrate that benchmarking HPO methods on samples from the generative model allows us to draw more coherent and statistically significant conclusions, which can be reached orders of magnitude faster than with the original tasks. We provide evidence of our findings for various HPO methods on a wide class of problems.
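A minimal sketch of the benchmarking idea described above, not the authors' implementation: a latent task vector stands in for the probabilistic encoder's output, a toy function stands in for the learned multi-task surrogate, and an HPO method (here plain random search) is evaluated on many cheap sampled tasks instead of a few expensive real ones. The names `sample_task`, `surrogate`, and `random_search` are illustrative assumptions.

```python
# Sketch only: toy surrogate benchmark, assuming a Gaussian prior over tasks.
import numpy as np

rng = np.random.default_rng(0)

def sample_task(latent_dim=2):
    """Draw a latent task vector h ~ p(h); here a standard normal prior."""
    return rng.standard_normal(latent_dim)

def surrogate(x, h):
    """Cheap stand-in for a learned multi-task surrogate f(x | h).
    x: hyperparameter configuration in [0, 1]^d, h: latent task vector."""
    # Toy quadratic whose optimum location is shifted by the task vector.
    opt = 0.5 + 0.1 * np.tanh(h)
    return float(np.sum((x - opt[: len(x)]) ** 2))

def random_search(objective, dim=2, budget=50):
    """Simple HPO baseline: evaluate random configurations, track the best."""
    best = np.inf
    for _ in range(budget):
        x = rng.uniform(size=dim)
        best = min(best, objective(x))
    return best

# Benchmark the HPO method across many sampled tasks and aggregate results.
results = [random_search(lambda x, h=sample_task(): surrogate(x, h))
           for _ in range(100)]
print(f"median best value over 100 sampled tasks: {np.median(results):.4f}")
```

Because each sampled task is a cheap function evaluation rather than a full model training run, many repetitions of an HPO experiment can be aggregated quickly, which is the source of the speed-up claimed in the abstract.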
Author Information
Aaron Klein (AWS Berlin)
Zhenwen Dai (Spotify Research)
Frank Hutter (University of Freiburg & Bosch)
Frank Hutter is a Full Professor for Machine Learning at the Computer Science Department of the University of Freiburg (Germany), where he was previously an assistant professor from 2013 to 2017. Before that, he spent eight years at the University of British Columbia (UBC) for his PhD and postdoc. Frank's main research interests lie in machine learning, artificial intelligence, and automated algorithm design. For his 2009 PhD thesis on algorithm configuration, he received the CAIAC doctoral dissertation award for the best thesis in AI in Canada that year, and with his coauthors, he received several best paper awards and prizes in international competitions on machine learning, SAT solving, and AI planning. Since 2016, he has held an ERC Starting Grant for a project on automating deep learning based on Bayesian optimization, Bayesian neural networks, and deep reinforcement learning.
Neil Lawrence (Amazon)
Javier González (Amazon.com)
More from the Same Authors
- 2020 Poster: BOSS: Bayesian Optimization over String Spaces »
  Henry Moss · David Leslie · Daniel Beck · Javier González · Paul Rayson
- 2020 Poster: Multi-task Causal Learning with Gaussian Processes »
  Virginia Aglietti · Theodoros Damoulas · Mauricio Álvarez · Javier González
- 2020 Spotlight: BOSS: Bayesian Optimization over String Spaces »
  Henry Moss · David Leslie · Daniel Beck · Javier González · Paul Rayson
- 2019 Workshop: Meta-Learning »
  Roberto Calandra · Ignasi Clavera Gilaberte · Frank Hutter · Joaquin Vanschoren · Jane Wang
- 2018 Workshop: NIPS 2018 Workshop on Meta-Learning »
  Joaquin Vanschoren · Frank Hutter · Sachin Ravi · Jane Wang · Erin Grant
- 2018 Poster: Maximizing acquisition functions for Bayesian optimization »
  James Wilson · Frank Hutter · Marc Deisenroth (he/him)
- 2018 Tutorial: Automatic Machine Learning »
  Frank Hutter · Joaquin Vanschoren
- 2017 Workshop: Workshop on Meta-Learning »
  Roberto Calandra · Frank Hutter · Hugo Larochelle · Sergey Levine
- 2016 Workshop: Bayesian Optimization: Black-box Optimization and Beyond »
  Roberto Calandra · Bobak Shahriari · Javier Gonzalez · Frank Hutter · Ryan Adams
- 2016 Poster: Bayesian Optimization with Robust Bayesian Neural Networks »
  Jost Tobias Springenberg · Aaron Klein · Stefan Falkner · Frank Hutter
- 2016 Oral: Bayesian Optimization with Robust Bayesian Neural Networks »
  Jost Tobias Springenberg · Aaron Klein · Stefan Falkner · Frank Hutter
- 2015 Poster: Efficient and Robust Automated Machine Learning »
  Matthias Feurer · Aaron Klein · Katharina Eggensperger · Jost Springenberg · Manuel Blum · Frank Hutter