
Meta-learning neural architectures, initial weights, hyperparameters, and algorithm components
Frank Hutter

Fri Dec 11 03:11 AM -- 03:36 AM (PST)

Meta-learning is a powerful set of approaches that promises to replace many components of the deep learning toolbox with learned alternatives, such as learned architectures, optimizers, hyperparameters, and weight initializations. While typical approaches focus on only one of these components at a time, in this talk I will discuss various efficient approaches for tackling two of them simultaneously. I will also highlight the advantages of not learning complete algorithms from scratch, but rather of exploiting the inductive bias of existing algorithms by learning to improve them. Finally, I will briefly discuss the connection between meta-learning and benchmarks.
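To make the idea of meta-learning two components simultaneously concrete, here is a minimal, hypothetical sketch (not from the talk) that jointly meta-learns a weight initialization and an inner-loop learning rate on a family of toy one-parameter tasks, in the spirit of MAML-style bi-level optimization. All function names and the quadratic task family are illustrative assumptions; the meta-gradients are computed analytically for this simple loss.

```python
import random

def inner_step(w, theta, alpha):
    # One gradient step on the task loss L(w) = (w - theta)^2.
    return w - alpha * 2.0 * (w - theta)

def meta_train(tasks, w0=0.0, alpha=0.05, meta_lr=0.01, steps=500):
    # Jointly meta-learn the initialization w0 AND the inner
    # learning rate alpha, by descending the post-adaptation loss
    # sum_t (w1_t - theta_t)^2 where w1_t = inner_step(w0, theta_t, alpha).
    for _ in range(steps):
        gw, ga = 0.0, 0.0
        for theta in tasks:
            w1 = inner_step(w0, theta, alpha)
            d = 2.0 * (w1 - theta)           # dL_meta / dw1
            gw += d * (1.0 - 2.0 * alpha)    # chain rule: dw1/dw0
            ga += d * (-2.0 * (w0 - theta))  # chain rule: dw1/dalpha
        w0 -= meta_lr * gw / len(tasks)
        alpha -= meta_lr * ga / len(tasks)
    return w0, alpha

tasks = [0.8, 1.0, 1.2]  # each task's optimum theta_t
w0, alpha = meta_train(tasks)
```

After meta-training, a single inner-loop step from the learned `w0` with the learned `alpha` solves each task nearly exactly; the point is that the initialization and the hyperparameter are optimized together, rather than one at a time.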

Author Information

Frank Hutter (University of Freiburg & Bosch)

Frank Hutter is a Full Professor for Machine Learning at the Computer Science Department of the University of Freiburg (Germany), where he previously was an assistant professor from 2013 to 2017. Before that, he was at the University of British Columbia (UBC) for eight years, for his PhD and postdoc. Frank's main research interests lie in machine learning, artificial intelligence, and automated algorithm design. For his 2009 PhD thesis on algorithm configuration, he received the CAIAC doctoral dissertation award for the best thesis in AI in Canada that year, and with his coauthors, he received several best paper awards and prizes in international competitions on machine learning, SAT solving, and AI planning. Since 2016, he has held an ERC Starting Grant for a project on automating deep learning based on Bayesian optimization, Bayesian neural networks, and deep reinforcement learning.
