Bayesian optimization has emerged as an exciting subfield of machine learning that is concerned with the global optimization of expensive, noisy, black-box functions using probabilistic methods. Systems implementing Bayesian optimization techniques have been successfully used to solve difficult problems in a diverse set of applications. Many recent advances in the methodologies and theory underlying Bayesian optimization have extended the framework to new applications and provided greater insights into the behaviour of these algorithms. Bayesian optimization is now increasingly being used in industrial settings, providing interesting new challenges that require fresh algorithms and theoretical insights.
Classically, Bayesian optimization has been used purely for expensive single-objective black-box optimization. However, as tasks and applications grow more complex, this paradigm is proving too restrictive. Hence, this year’s theme for the workshop will be “black-box optimization and beyond”. Among the recent trends that push beyond classical BO are:
- Adapting BO to not-so-expensive evaluations.
- “Opening the black box”: moving away from viewing the model as a way of simply fitting a response surface, and towards modelling for the purpose of discovering and understanding the underlying process. For instance, this so-called grey-box modelling approach could be valuable in robotic applications for optimizing the controller, while simultaneously providing insight into the mechanical properties of the robotic system.
- “Meta-learning”, where a higher level of learning is used on top of BO in order to control the optimization process and make it more efficient. Examples of such meta-learning include learning curve prediction, Freeze-thaw Bayesian optimization, online batch selection, multi-task and multi-fidelity learning.
- Multi-objective optimization, where multiple conflicting objectives are considered rather than a single one (e.g., prediction accuracy vs. training time).
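The classical single-objective loop that these trends extend can be sketched in a few lines: fit a Gaussian-process surrogate to the evaluations gathered so far, then choose the next query point by maximizing an acquisition function such as expected improvement. The sketch below is illustrative only, assuming a NumPy-only GP with a fixed RBF length-scale, a grid-based acquisition search, and a toy 1-D objective standing in for an expensive black-box function; none of these choices is prescribed by the workshop.

```python
import numpy as np
from math import erf

def rbf(a, b, ls=0.3):
    """Squared-exponential kernel between two sets of 1-D points."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """GP posterior mean and std at candidates Xs given data (X, y)."""
    L = np.linalg.cholesky(rbf(X, X) + noise * np.eye(len(X)))
    Ks = rbf(X, Xs)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - np.sum(v * v, axis=0), 1e-12, None)  # k(x, x) = 1 for RBF
    return Ks.T @ alpha, np.sqrt(var)

def expected_improvement(mu, sigma, best):
    """EI for minimization under a Gaussian posterior."""
    z = (best - mu) / sigma
    cdf = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))
    pdf = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
    return (best - mu) * cdf + sigma * pdf

def objective(x):
    # Hypothetical "expensive" black-box function (minimum near x ~ 1.5).
    return np.sin(3.0 * x) + 0.5 * x

Xs = np.linspace(0.0, 2.0, 200)   # candidate grid for the acquisition search
X = np.array([0.2, 1.8])          # small initial design
y = objective(X)
for _ in range(10):               # each iteration = one expensive evaluation
    mu, sigma = gp_posterior(X, y, Xs)
    x_next = Xs[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X, y = np.append(X, x_next), np.append(y, objective(x_next))
print("best x:", X[np.argmin(y)], "best y:", y.min())
```

Each of the trends above modifies some piece of this loop: grey-box methods replace the generic surrogate with a structured model of the process, meta-learning adds an outer loop that adapts the optimizer itself, and multi-objective variants swap the scalar acquisition for one defined over a Pareto front.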
The target audience for this workshop consists of both industrial and academic practitioners of Bayesian optimization as well as researchers working on theoretical and practical advances in probabilistic optimization. We expect that this pairing of theoretical and applied knowledge will lead to an interesting exchange of ideas and stimulate an open discussion about the long term goals and challenges of the Bayesian optimization community.
A further goal is to encourage collaboration between the diverse set of researchers involved in Bayesian optimization. This includes not only interchange between industrial and academic researchers, but also between the many different subfields of machine learning which make use of Bayesian optimization or its components. We are also reaching out to the wider optimization and engineering communities for involvement.
Author Information
Roberto Calandra (UC Berkeley)
Bobak Shahriari (University of British Columbia)
Javier Gonzalez (Amazon)
Frank Hutter (University of Freiburg)
Frank Hutter is a Full Professor for Machine Learning at the Computer Science Department of the University of Freiburg (Germany), where he was previously an assistant professor from 2013 to 2017. Before that, he was at the University of British Columbia (UBC) for eight years, for his PhD and postdoc. Frank's main research interests lie in machine learning, artificial intelligence and automated algorithm design. For his 2009 PhD thesis on algorithm configuration, he received the CAIAC doctoral dissertation award for the best thesis in AI in Canada that year, and with his coauthors, he received several best paper awards and prizes in international competitions on machine learning, SAT solving, and AI planning. Since 2016, he has held an ERC Starting Grant for a project on automating deep learning based on Bayesian optimization, Bayesian neural networks, and deep reinforcement learning.
Ryan Adams (Google Brain and Princeton University)
More from the Same Authors
- 2020 Workshop: Machine Learning for Engineering Modeling, Simulation and Design »
  Alex Beatson · Priya Donti · Amira Abdel-Rahman · Stephan Hoyer · Rose Yu · J. Zico Kolter · Ryan Adams
- 2020 Workshop: 3rd Robot Learning Workshop »
  Masha Itkina · Alex Bewley · Roberto Calandra · Igor Gilitschenski · Julien PEREZ · Ransalu Senanayake · Markus Wulfmeier · Vincent Vanhoucke
- 2020 Workshop: Meta-Learning »
  Jane Wang · Joaquin Vanschoren · Erin Grant · Jonathan Schwarz · Francesco Visin · Jeff Clune · Roberto Calandra
- 2020 Poster: On Warm-Starting Neural Network Training »
  Jordan Ash · Ryan Adams
- 2020 Poster: Task-Agnostic Amortized Inference of Gaussian Process Hyperparameters »
  Sulin Liu · Xingyuan Sun · Peter J Ramadge · Ryan Adams
- 2020 Poster: Learning Composable Energy Surrogates for PDE Order Reduction »
  Alex Beatson · Jordan Ash · Geoffrey Roeder · Tianju Xue · Ryan Adams
- 2020 Poster: Re-Examining Linear Embeddings for High-Dimensional Bayesian Optimization »
  Ben Letham · Roberto Calandra · Akshara Rai · Eytan Bakshy
- 2020 Oral: Learning Composable Energy Surrogates for PDE Order Reduction »
  Alex Beatson · Jordan Ash · Geoffrey Roeder · Tianju Xue · Ryan Adams
- 2020 Poster: 3D Shape Reconstruction from Vision and Touch »
  Edward Smith · Roberto Calandra · Adriana Romero · Georgia Gkioxari · David Meger · Jitendra Malik · Michal Drozdzal
- 2019 Workshop: Robot Learning: Control and Interaction in the Real World »
  Roberto Calandra · Markus Wulfmeier · Kate Rakelly · Sanket Kamthe · Danica Kragic · Stefan Schaal
- 2019 Workshop: Meta-Learning »
  Roberto Calandra · Ignasi Clavera Gilaberte · Frank Hutter · Joaquin Vanschoren · Jane Wang
- 2019 Poster: SpArSe: Sparse Architecture Search for CNNs on Resource-Constrained Microcontrollers »
  Igor Fedorov · Ryan Adams · Matthew Mattina · Paul Whatmough
- 2019 Poster: Discrete Object Generation with Reversible Inductive Construction »
  Ari Seff · Wenda Zhou · Farhan Damani · Abigail Doyle · Ryan Adams
- 2019 Poster: Meta-Surrogate Benchmarking for Hyperparameter Optimization »
  Aaron Klein · Zhenwen Dai · Frank Hutter · Neil Lawrence · Javier González
- 2018 Workshop: NIPS 2018 Workshop on Meta-Learning »
  Joaquin Vanschoren · Frank Hutter · Sachin Ravi · Jane Wang · Erin Grant
- 2018 Poster: Deep Reinforcement Learning in a Handful of Trials using Probabilistic Dynamics Models »
  Kurtland Chua · Roberto Calandra · Rowan McAllister · Sergey Levine
- 2018 Spotlight: Deep Reinforcement Learning in a Handful of Trials using Probabilistic Dynamics Models »
  Kurtland Chua · Roberto Calandra · Rowan McAllister · Sergey Levine
- 2018 Poster: A Bayesian Nonparametric View on Count-Min Sketch »
  Diana Cai · Michael Mitzenmacher · Ryan Adams
- 2018 Poster: Maximizing acquisition functions for Bayesian optimization »
  James Wilson · Frank Hutter · Marc Deisenroth
- 2018 Tutorial: Automatic Machine Learning »
  Frank Hutter · Joaquin Vanschoren
- 2017 Workshop: Workshop on Meta-Learning »
  Roberto Calandra · Frank Hutter · Hugo Larochelle · Sergey Levine
- 2017 Workshop: Bayesian optimization for science and engineering »
  Ruben Martinez-Cantin · José Miguel Hernández-Lobato · Javier Gonzalez
- 2017 Poster: PASS-GLM: polynomial approximate sufficient statistics for scalable Bayesian GLM inference »
  Jonathan Huggins · Ryan Adams · Tamara Broderick
- 2017 Spotlight: PASS-GLM: polynomial approximate sufficient statistics for scalable Bayesian GLM inference »
  Jonathan Huggins · Ryan Adams · Tamara Broderick
- 2017 Poster: Reducing Reparameterization Gradient Variance »
  Andrew Miller · Nick Foti · Alexander D'Amour · Ryan Adams
- 2016 Poster: Bayesian latent structure discovery from multi-neuron recordings »
  Scott Linderman · Ryan Adams · Jonathan Pillow
- 2016 Poster: Bayesian Optimization with Robust Bayesian Neural Networks »
  Jost Tobias Springenberg · Aaron Klein · Stefan Falkner · Frank Hutter
- 2016 Oral: Bayesian Optimization with Robust Bayesian Neural Networks »
  Jost Tobias Springenberg · Aaron Klein · Stefan Falkner · Frank Hutter
- 2016 Poster: Composing graphical models with neural networks for structured representations and fast inference »
  Matthew Johnson · David Duvenaud · Alex Wiltschko · Ryan Adams · Sandeep R Datta
- 2015 Workshop: Bayesian Optimization: Scalability and Flexibility »
  Bobak Shahriari · Ryan Adams · Nando de Freitas · Amar Shah · Roberto Calandra
- 2015 Workshop: Statistical Methods for Understanding Neural Systems »
  Alyson Fletcher · Jakob H Macke · Ryan Adams · Jascha Sohl-Dickstein
- 2015 Poster: Convolutional Networks on Graphs for Learning Molecular Fingerprints »
  David Duvenaud · Dougal Maclaurin · Jorge Iparraguirre · Rafael Bombarell · Timothy Hirzel · Alan Aspuru-Guzik · Ryan Adams
- 2015 Poster: A Gaussian Process Model of Quasar Spectral Energy Distributions »
  Andrew Miller · Albert Wu · Jeffrey Regier · Jon McAuliffe · Dustin Lang · Mr. Prabhat · David Schlegel · Ryan Adams
- 2015 Poster: Efficient and Robust Automated Machine Learning »
  Matthias Feurer · Aaron Klein · Katharina Eggensperger · Jost Springenberg · Manuel Blum · Frank Hutter
- 2015 Poster: Spectral Representations for Convolutional Neural Networks »
  Oren Rippel · Jasper Snoek · Ryan Adams
- 2015 Poster: Dependent Multinomial Models Made Easy: Stick-Breaking with the Polya-gamma Augmentation »
  Scott Linderman · Matthew Johnson · Ryan Adams
- 2014 Workshop: Bayesian Optimization in Academia and Industry »
  Zoubin Ghahramani · Ryan Adams · Matthew Hoffman · Kevin Swersky · Jasper Snoek
- 2014 Poster: A framework for studying synaptic plasticity with neural spike train data »
  Scott Linderman · Christopher H Stock · Ryan Adams
- 2013 Workshop: Bayesian Optimization in Theory and Practice »
  Matthew Hoffman · Jasper Snoek · Nando de Freitas · Michael A Osborne · Ryan Adams · Sebastien Bubeck · Philipp Hennig · Remi Munos · Andreas Krause
- 2013 Poster: Multi-Task Bayesian Optimization »
  Kevin Swersky · Jasper Snoek · Ryan Adams
- 2013 Poster: Message Passing Inference with Chemical Reaction Networks »
  Nils E Napp · Ryan Adams
- 2013 Oral: Message Passing Inference with Chemical Reaction Networks »
  Nils E Napp · Ryan Adams
- 2013 Poster: A Determinantal Point Process Latent Variable Model for Inhibition in Neural Spiking Data »
  Jasper Snoek · Richard Zemel · Ryan Adams
- 2013 Poster: Contrastive Learning Using Spectral Methods »
  James Y Zou · Daniel Hsu · David Parkes · Ryan Adams
- 2012 Poster: Bayesian n-Choose-k Models for Classification and Ranking »
  Kevin Swersky · Daniel Tarlow · Richard Zemel · Ryan Adams · Brendan J Frey
- 2012 Poster: Priors for Diversity in Generative Latent Variable Models »
  James Y Zou · Ryan Adams
- 2012 Poster: Cardinality Restricted Boltzmann Machines »
  Kevin Swersky · Daniel Tarlow · Ilya Sutskever · Richard Zemel · Russ Salakhutdinov · Ryan Adams
- 2012 Poster: Practical Bayesian Optimization of Machine Learning Algorithms »
  Jasper Snoek · Hugo Larochelle · Ryan Adams
- 2011 Workshop: Bayesian Nonparametric Methods: Hope or Hype? »
  Emily Fox · Ryan Adams
- 2010 Workshop: Transfer Learning Via Rich Generative Models »
  Russ Salakhutdinov · Ryan Adams · Josh Tenenbaum · Zoubin Ghahramani · Tom Griffiths
- 2010 Workshop: Monte Carlo Methods for Bayesian Inference in Modern Day Applications »
  Ryan Adams · Mark A Girolami · Iain Murray
- 2010 Oral: Tree-Structured Stick Breaking for Hierarchical Data »
  Ryan Adams · Zoubin Ghahramani · Michael Jordan
- 2010 Oral: Slice sampling covariance hyperparameters of latent Gaussian models »
  Iain Murray · Ryan Adams
- 2010 Poster: Tree-Structured Stick Breaking for Hierarchical Data »
  Ryan Adams · Zoubin Ghahramani · Michael Jordan
- 2010 Poster: Slice sampling covariance hyperparameters of latent Gaussian models »
  Iain Murray · Ryan Adams
- 2008 Poster: The Gaussian Process Density Sampler »
  Ryan Adams · Iain Murray · David MacKay
- 2008 Spotlight: The Gaussian Process Density Sampler »
  Ryan Adams · Iain Murray · David MacKay