

Poster

JAHS-Bench-201: A Foundation For Research On Joint Architecture And Hyperparameter Search

Archit Bansal · Danny Stoll · Maciej Janowski · Arber Zela · Frank Hutter

Hall J (level 1) #1030

Keywords: [ Surrogate Benchmark ] [ Joint Architecture and Hyperparameter Search ] [ Multi-objective ] [ Cost-aware ] [ Neural Architecture Search ] [ Multi-fidelity ] [ Hyperparameter Optimization ]


Abstract:

The past few years have seen the development of many benchmarks for Neural Architecture Search (NAS), fueling rapid progress in NAS research. However, recent work showing that good hyperparameter settings can be more important than using the best architecture calls for a shift in focus towards Joint Architecture and Hyperparameter Search (JAHS). Therefore, we present JAHS-Bench-201, the first collection of surrogate benchmarks for JAHS, built to also facilitate research on multi-objective, cost-aware and (multi) multi-fidelity optimization algorithms. To the best of our knowledge, JAHS-Bench-201 is based on the most extensive dataset of neural network performance data in the public domain. It is composed of approximately 161 million data points and 20 performance metrics for three deep learning tasks, while featuring a 14-dimensional search and fidelity space that extends the popular NAS-Bench-201 space. With JAHS-Bench-201, we hope to democratize research on JAHS and lower the barrier to entry of an extremely compute-intensive field, e.g., by reducing the compute time to run a JAHS algorithm from 5 days to only a few seconds.
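As a rough illustration of how a surrogate benchmark like this is queried instead of training networks from scratch, the minimal sketch below follows the interface of the publicly released jahs_bench Python package; the task name, download flag, and epoch argument are assumptions and may differ from the actual release.

    import jahs_bench

    # Load the surrogate benchmark for one of the three tasks (assumed task name "cifar10").
    # download=True is assumed to fetch the pre-trained surrogate models on first use.
    benchmark = jahs_bench.Benchmark(task="cifar10", download=True)

    # Draw a random configuration from the 14-dimensional joint
    # architecture and hyperparameter search space.
    config = benchmark.sample_config()

    # Query the surrogate at a chosen training-epoch fidelity; this returns predicted
    # performance metrics in seconds rather than the days needed to train the network.
    results = benchmark(config, nepochs=200)
    print(config)
    print(results)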
