Poster

Meta-Curvature

Eunbyung Park · Junier Oliva

East Exhibition Hall B + C #45

Keywords: [ Optimization for Deep Networks ] [ Algorithms -> AutoML; Algorithms -> Few-Shot Learning; Deep Learning ] [ Meta-Learning ] [ Algorithms ]


Abstract:

We propose meta-curvature (MC), a framework to learn curvature information for better generalization and fast model adaptation. MC expands on the model-agnostic meta-learner (MAML) by learning to transform the gradients in the inner optimization such that the transformed gradients generalize better to new tasks. To train large-scale neural networks, we decompose the curvature matrix into smaller matrices in a novel scheme that captures the dependencies among the model's parameters with a series of tensor products. We demonstrate the effects of our proposed method on several few-shot learning tasks and datasets. Without any task-specific techniques or architectures, the proposed method achieves substantial improvements over previous MAML variants and outperforms recent state-of-the-art methods. Furthermore, we observe faster convergence of the meta-training process. Finally, we present an analysis that explains the better generalization performance achieved with the meta-trained curvature.
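To make the idea concrete, here is a minimal, hedged sketch of a meta-curvature-style inner update for a single 2-D weight matrix. The names (mc_transform, M_out, M_in, inner_lr) and the toy regression task are illustrative assumptions, not the authors' code; the paper's full method also handles convolutional filters with an additional factor and meta-learns these matrices jointly with the initialization.

```python
import torch

def mc_transform(grad, M_out, M_in):
    # Transform a (d_out, d_in) gradient with two small learned factors.
    # This acts like multiplying vec(grad) by a Kronecker-structured
    # curvature matrix without ever forming the full matrix.
    return M_out @ grad @ M_in

# Toy inner-loop adaptation step on a single task (illustrative only).
d_out, d_in, inner_lr = 5, 10, 0.01
W = torch.randn(d_out, d_in, requires_grad=True)   # meta-learned initialization
M_out = torch.eye(d_out, requires_grad=True)       # meta-learned curvature factors
M_in = torch.eye(d_in, requires_grad=True)

x = torch.randn(32, d_in)                          # task support data
y = torch.randn(32, d_out)
loss = ((x @ W.t() - y) ** 2).mean()

(grad_W,) = torch.autograd.grad(loss, W, create_graph=True)
W_adapted = W - inner_lr * mc_transform(grad_W, M_out, M_in)
# The outer (meta) loss on the task's query set would then be
# backpropagated through W_adapted into W, M_out, and M_in.
```

The point of the factored form is scalability: for a weight with d_out * d_in parameters, the two small factors replace a full (d_out * d_in) x (d_out * d_in) curvature matrix while still capturing dependencies across rows and columns of the gradient.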
