Bootstrapped Meta-Learning
Sebastian Flennerhag · Yannick Schroecker · Tom Zahavy · Hado van Hasselt · David Silver · Satinder Singh

Mon Dec 13 08:00 AM -- 08:20 AM (PST)
Event URL: https://openreview.net/forum?id=l0p8mc_xSRN

We propose an algorithm for meta-optimization that lets the meta-learner teach itself. The algorithm first bootstraps a target from the meta-learner, then optimises the meta-learner by minimising the distance to that target under some loss. Focusing on meta-learning with gradients, we establish conditions that guarantee performance improvements and show that the improvement is related to the target distance. Thus, by controlling curvature, the distance measure can be used to ease meta-optimization. Further, the bootstrapping mechanism can extend the effective meta-learning horizon without requiring backpropagation through all updates. The algorithm is versatile and easy to implement. We achieve a new state of the art for model-free agents on the Atari ALE benchmark, improve upon MAML in few-shot learning, and demonstrate how our approach opens up new possibilities by meta-learning efficient exploration in an epsilon-greedy Q-learning agent.
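The core idea in the abstract, bootstrapping a target by continuing optimisation for a few extra steps and then pulling the meta-learner toward that target, can be illustrated with a hypothetical toy sketch. This is not the paper's code: the objective, the meta-learned quantity (a scalar learning rate), the squared-distance matching loss, and the finite-difference meta-gradient (standing in for backpropagation through the update steps) are all illustrative assumptions.

```python
# Toy sketch of bootstrapped meta-learning (illustrative, not the paper's algorithm).
# The meta-learner here is just a scalar learning rate `lr`. We run K inner
# updates, continue L further steps to form a bootstrapped target, and update
# `lr` to shrink the squared distance between the K-step parameters and the
# (held-fixed) target.

def grad(w):
    # Gradient of the toy quadratic objective 0.5 * (w - 3)^2.
    return w - 3.0

def inner_update(w, lr, steps):
    # Plain gradient descent with the meta-learned learning rate.
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

def bootstrapped_meta_step(w0, lr, K=1, L=5, meta_lr=1e-3, eps=1e-4):
    w_K = inner_update(w0, lr, K)          # parameters after K inner updates
    target = inner_update(w_K, lr, L)      # bootstrap: L extra steps, then freeze

    def meta_loss(lr_):
        # Squared distance to the frozen target under candidate meta-parameters.
        return (inner_update(w0, lr_, K) - target) ** 2

    # Finite-difference meta-gradient; in practice one would backpropagate
    # through only the K matched updates, not through the target.
    g = (meta_loss(lr + eps) - meta_loss(lr - eps)) / (2.0 * eps)
    return lr - meta_lr * g

lr = 0.05
w0 = 10.0  # held fixed across meta-steps for clarity
for _ in range(50):
    lr = bootstrapped_meta_step(w0, lr)
# On this quadratic, lr climbs toward 1.0, the step size that reaches the
# minimum in a single update, without ever backpropagating through the L
# target-generating steps.
```

Note the asymmetry that makes the horizon extension cheap: the target is computed with L extra updates but treated as a constant, so the meta-gradient only flows through the K matched steps.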

Author Information

Sebastian Flennerhag (DeepMind)

Ph.D. candidate in deep learning, focusing on network adaptation in transfer learning, meta-learning, and sequence learning.

Yannick Schroecker (Georgia Institute of Technology)
Tom Zahavy (DeepMind)
Hado van Hasselt (DeepMind)
David Silver (DeepMind)
Satinder Singh (DeepMind)
