We propose meta-curvature (MC), a framework for learning curvature information for better generalization and fast model adaptation. MC builds on the model-agnostic meta-learner (MAML) by learning to transform the gradients in the inner-loop optimization so that the transformed gradients yield better generalization on a new task. To scale to large neural networks, we decompose the curvature matrix into smaller matrices in a novel scheme that captures dependencies among the model's parameters through a series of tensor products. We demonstrate the effects of the proposed method on several few-shot learning tasks and datasets. Without any task-specific techniques or architectures, the method improves substantially upon previous MAML variants and outperforms recent state-of-the-art methods. Furthermore, we observe faster convergence of the meta-training process. Finally, we present an analysis that explains the better generalization performance achieved with the meta-trained curvature.
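The abstract does not give the exact update rule, but a minimal sketch of the idea could look like the following, assuming a per-layer, mode-wise (Kronecker-structured) transform of the inner-loop gradient inside a MAML-style adaptation step. The names `mode_product` and `mc_inner_step`, and the toy usage at the end, are illustrative assumptions, not the authors' implementation.

```python
# Sketch (not the authors' code): transform each parameter's gradient with
# learned per-mode matrices, then take a MAML-style inner-loop step.
import torch

def mode_product(grad, mats):
    """Multiply each mode of `grad` by the corresponding square matrix
    (a tensor-product / Kronecker-structured linear transform)."""
    out = grad
    for mode, m in enumerate(mats):
        out = torch.tensordot(m, out.movedim(mode, 0), dims=([1], [0])).movedim(0, mode)
    return out

def mc_inner_step(params, curvatures, loss_fn, lr=0.01):
    """One inner-loop update: transform each gradient, then step."""
    grads = torch.autograd.grad(loss_fn(params), params, create_graph=True)
    return [p - lr * mode_product(g, mats)
            for p, g, mats in zip(params, grads, curvatures)]

# Hypothetical usage: a single weight matrix of shape (out_dim, in_dim),
# with one learnable matrix per tensor mode (identity-initialized here).
w = torch.randn(4, 3, requires_grad=True)
curv = [[torch.eye(4, requires_grad=True), torch.eye(3, requires_grad=True)]]
task_loss = lambda ps: (ps[0] ** 2).sum()   # stand-in for a support-set loss
adapted = mc_inner_step([w], curv, task_loss)
# In meta-training, the query-set loss on `adapted` would be backpropagated
# into both the initialization `w` and the curvature matrices in `curv`.
```

Factoring the transform into small per-mode matrices, rather than one dense matrix over all parameters, is what keeps the approach tractable for large networks while still coupling parameters within a layer.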
Author Information
Eunbyung Park (UNC Chapel Hill / Nuro)
Junier Oliva (UNC - Chapel Hill)
More from the Same Authors
- 2020 Poster: Exchangeable Neural ODE for Set Modeling »
  Yang Li · Haidong Yi · Christopher Bender · Siyuan Shan · Junier Oliva
- 2020 Poster: Meta-Neighborhoods »
  Siyuan Shan · Yang Li · Junier Oliva
- 2019 Workshop: Sets and Partitions »
  Nicholas Monath · Manzil Zaheer · Andrew McCallum · Ari Kobren · Junier Oliva · Barnabas Poczos · Ruslan Salakhutdinov