Poster

Learning search spaces for Bayesian optimization: Another view of hyperparameter transfer learning

Valerio Perrone · Huibin Shen · Matthias Seeger · Cedric Archambeau · Rodolphe Jenatton

East Exhibition Hall B + C #30

Keywords: [ AutoML ] [ Multitask and Transfer Learning ] [ Algorithms ]


Abstract:

Bayesian optimization (BO) is a successful methodology to optimize black-box functions that are expensive to evaluate. While traditional methods optimize each black-box function in isolation, there has been recent interest in speeding up BO by transferring knowledge across multiple related black-box functions. In this work, we introduce a method to automatically design the BO search space by relying on evaluations of previous black-box functions. We depart from the common practice of defining a set of arbitrary search ranges a priori by considering search space geometries that are learnt from historical data. This simple, yet effective strategy can be used to endow many existing BO methods with transfer learning properties. Despite its simplicity, we show that our approach considerably boosts BO by reducing the size of the search space, thus accelerating the optimization of a variety of black-box optimization problems. In particular, the proposed approach combined with random search results in a parameter-free, easy-to-implement, robust hyperparameter optimization strategy. We hope it will constitute a natural baseline for further research attempting to warm-start BO.
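The core idea — learn a compact search space from the optima of previously solved, related black-box problems, then search only inside it — can be sketched with a simplified variant. The paper learns search-space geometries from historical data; the snippet below uses the simplest such geometry, an axis-aligned bounding box around past optima, combined with the parameter-free random-search strategy the abstract mentions. All names, data, and the `margin` heuristic are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Hypothetical historical data: for each related task solved previously,
# the best hyperparameter configurations found (rows = configs,
# columns = hyperparameters, e.g. learning rate and batch size).
history = [
    np.array([[0.01, 32.0], [0.02, 64.0]]),  # task 1 top configs
    np.array([[0.05, 16.0], [0.03, 48.0]]),  # task 2 top configs
]

def learn_box(history, margin=0.1):
    """Learn an axis-aligned box containing all historical optima,
    expanded by a relative margin. This is a simplified stand-in for
    the learned search-space geometries described in the paper."""
    pts = np.vstack(history)
    lo, hi = pts.min(axis=0), pts.max(axis=0)
    pad = margin * (hi - lo)
    return lo - pad, hi + pad

def random_search(objective, lo, hi, n_iter=50, seed=0):
    """Parameter-free random search restricted to the learned box."""
    rng = np.random.default_rng(seed)
    best_x, best_y = None, np.inf
    for _ in range(n_iter):
        x = rng.uniform(lo, hi)
        y = objective(x)
        if y < best_y:
            best_x, best_y = x, y
    return best_x, best_y

# Toy objective standing in for an expensive black-box function.
objective = lambda x: (x[0] - 0.02) ** 2 + (x[1] - 40.0) ** 2

lo, hi = learn_box(history)
best_x, best_y = random_search(objective, lo, hi)
```

Because the learned box is far smaller than an arbitrary a-priori range, the same random-search budget covers it much more densely, which is the source of the speed-up the abstract claims. The same box could equally be handed to any existing BO method as its search space.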
