Spotlight
Bayesian Meta-Learning for the Few-Shot Setting via Deep Kernels
Massimiliano Patacchiola · Jack Turner · Elliot Crowley · Michael O'Boyle · Amos Storkey

Wed Dec 09 07:20 AM -- 07:30 AM (PST) @ Orals & Spotlights: Continual/Meta/Misc Learning

Recently, different machine learning methods have been introduced to tackle the challenging few-shot learning scenario, that is, learning from a small labeled dataset related to a specific task. Common approaches have taken the form of meta-learning: learning to learn on the new problem given the old. Following the recognition that meta-learning is implementing learning in a multi-level model, we present a Bayesian treatment for the meta-learning inner loop through the use of deep kernels. As a result we can learn a kernel that transfers to new tasks; we call this Deep Kernel Transfer (DKT). This approach has many advantages: it is straightforward to implement as a single optimizer, provides uncertainty quantification, and does not require estimation of task-specific parameters. We empirically demonstrate that DKT outperforms several state-of-the-art algorithms in few-shot classification, and is the state of the art for cross-domain adaptation and regression. We conclude that complex meta-learning routines can be replaced by a simpler Bayesian model without loss of accuracy.
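To make the idea of a deep kernel concrete, here is a minimal NumPy sketch, not the authors' implementation: an RBF kernel is applied to the output of a neural feature extractor (here a tiny random-weight MLP standing in for the learned network), and a standard Gaussian-process posterior mean is computed on a toy few-shot regression task. All function names and hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the learned deep feature extractor:
# a tiny fixed random-weight MLP mapping 1-D inputs to 4-D features.
W1, b1 = rng.normal(size=(1, 16)), rng.normal(size=16)
W2, b2 = rng.normal(size=(16, 4)), rng.normal(size=4)

def features(x):
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

def deep_kernel(xa, xb, lengthscale=1.0):
    """Deep kernel: RBF applied to network features,
    k(x, x') = exp(-||f(x) - f(x')||^2 / (2 * lengthscale^2))."""
    fa, fb = features(xa), features(xb)
    d2 = ((fa[:, None, :] - fb[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior_mean(x_support, y_support, x_query, noise=1e-2):
    """Exact GP regression mean: K_qs (K_ss + noise * I)^-1 y."""
    K_ss = deep_kernel(x_support, x_support) + noise * np.eye(len(x_support))
    K_qs = deep_kernel(x_query, x_support)
    return K_qs @ np.linalg.solve(K_ss, y_support)

# Toy few-shot task: 5 support points, predictions at 7 query points.
x_s = rng.uniform(-1, 1, size=(5, 1))
y_s = np.sin(3 * x_s).ravel()
x_q = np.linspace(-1, 1, 7).reshape(-1, 1)
mu = gp_posterior_mean(x_s, y_s, x_q)
print(mu.shape)  # (7,)
```

In the full DKT setup the feature extractor's weights and the kernel hyperparameters would be trained jointly by maximizing the GP marginal likelihood across tasks with a single optimizer, rather than being fixed as here.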

Author Information

Massimiliano Patacchiola (University of Edinburgh)

Massimiliano is a postdoctoral researcher in the Machine Learning Group at the University of Cambridge. He is interested in efficient learning (few-shot, self-supervised, meta-learning), Bayesian methods (Gaussian processes), and reinforcement learning. Previously he was a postdoctoral researcher at the University of Edinburgh and an intern in the Camera Platform team at Snapchat.

Jack Turner (University of Edinburgh)
Elliot Crowley (University of Edinburgh)
Michael O'Boyle (University of Edinburgh)
Amos Storkey (University of Edinburgh)
