Poster
JAHS-Bench-201: A Foundation For Research On Joint Architecture And Hyperparameter Search
Archit Bansal · Danny Stoll · Maciej Janowski · Arber Zela · Frank Hutter

Wed Nov 30 09:00 AM -- 11:00 AM (PST) @ Hall J #1030

The past few years have seen the development of many benchmarks for Neural Architecture Search (NAS), fueling rapid progress in NAS research. However, recent work, which shows that good hyperparameter settings can be more important than using the best architecture, calls for a shift in focus towards Joint Architecture and Hyperparameter Search (JAHS). Therefore, we present JAHS-Bench-201, the first collection of surrogate benchmarks for JAHS, built to also facilitate research on multi-objective, cost-aware and (multi) multi-fidelity optimization algorithms. To the best of our knowledge, JAHS-Bench-201 is based on the most extensive dataset of neural network performance data in the public domain. It is composed of approximately 161 million data points and 20 performance metrics for three deep learning tasks, while featuring a 14-dimensional search and fidelity space that extends the popular NAS-Bench-201 space. With JAHS-Bench-201, we hope to democratize research on JAHS and lower the barrier to entry of an extremely compute-intensive field, e.g., by reducing the compute time to run a JAHS algorithm from 5 days to only a few seconds.
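The speedup mentioned in the abstract comes from querying a trained surrogate model for predicted performance instead of actually training each sampled network. The following minimal sketch illustrates that workflow, assuming the publicly released jahs_bench Python package; the exact class, argument, and metric names may differ from the released API and are used here only for illustration.

# Minimal sketch: evaluating a configuration via the JAHS-Bench-201 surrogate
# instead of training the network from scratch. Assumes the jahs_bench package
# is installed; class/argument names below are based on the public repository
# and may differ in detail.
import jahs_bench

# Load the surrogate for one of the three tasks (here CIFAR-10),
# downloading the pretrained surrogate models on first use.
benchmark = jahs_bench.Benchmark(task="cifar10", download=True)

# Draw a random point from the 14-dimensional joint architecture
# and hyperparameter search space.
config = benchmark.sample_config()

# Query the predicted performance metrics at a chosen fidelity
# (number of training epochs); this returns in seconds rather than
# the days a real training run would take.
results = benchmark(config, nepochs=200)
print(config)
print(results)

A JAHS algorithm can wrap this query in its inner loop, repeatedly proposing configurations and fidelities and reading off the surrogate's predicted metrics, which is what reduces a full benchmark run from days of GPU time to seconds.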

Author Information

Archit Bansal (Albert-Ludwigs University of Freiburg)
Danny Stoll (University of Freiburg)

First doctoral candidate at the engineering faculty of the University of Freiburg admitted directly after the B.Sc. degree. Works on AutoML, deep learning, neural architecture search, hyperparameter optimization, joint architecture and hyperparameter search, and meta-learning.

Maciej Janowski (Albert-Ludwigs-Universität Freiburg)
Arber Zela (University of Freiburg)
Frank Hutter (University of Freiburg & Bosch)

Frank Hutter is a Full Professor for Machine Learning at the Computer Science Department of the University of Freiburg (Germany), where he previously was an assistant professor from 2013 to 2017. Before that, he was at the University of British Columbia (UBC) for eight years, for his PhD and postdoc. Frank's main research interests lie in machine learning, artificial intelligence, and automated algorithm design. For his 2009 PhD thesis on algorithm configuration, he received the CAIAC doctoral dissertation award for the best thesis in AI in Canada that year, and with his coauthors, he received several best paper awards and prizes in international competitions on machine learning, SAT solving, and AI planning. Since 2016, he has held an ERC Starting Grant for a project on automating deep learning based on Bayesian optimization, Bayesian neural networks, and deep reinforcement learning.
