

Poster in Workshop: Meta-Learning

Meta-Learning Initializations for Image Segmentation

Sean Hendryx


Abstract:

We evaluate first-order model-agnostic meta-learning algorithms (including FOMAML and Reptile) on few-shot image segmentation, present a novel neural network architecture built for fast learning that we call EfficientLab, and leverage a formal definition of the test error of meta-learning algorithms to decrease error on out-of-distribution tasks. We show state-of-the-art results on the FSS-1000 dataset by meta-training EfficientLab with FOMAML and using Bayesian optimization to infer the optimal test-time adaptation routine hyperparameters. We also construct a benchmark dataset, binary PASCAL, for the empirical study of how image segmentation meta-learning systems improve as a function of the number of labeled examples. On the binary PASCAL dataset, we show that when generalizing out of the meta-distribution, meta-learned initializations provide only a small improvement in accuracy over joint training but require significantly fewer gradient updates. Our code and meta-learned model are available at https://drive.google.com/drive/folders/1VhTJtYQ_byC9woS1fBaRi-hdWksfm5qq?usp=sharing.
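
The abstract refers to meta-learning an initialization with first-order algorithms (FOMAML and Reptile). As a rough illustration of that idea only, the sketch below shows a generic Reptile-style outer update in PyTorch; the names model, sample_task_batch, and loss_fn are hypothetical placeholders and the snippet is not drawn from the paper's released code or the EfficientLab architecture.

    # Illustrative sketch only: a generic Reptile-style first-order meta-update.
    # `model`, `sample_task_batch`, and `loss_fn` are hypothetical placeholders,
    # not taken from the paper's released code.
    import copy
    import torch

    def first_order_meta_step(model, sample_task_batch, loss_fn,
                              inner_steps=5, inner_lr=1e-3, meta_lr=0.1):
        # Clone the current initialization and adapt the clone to one sampled task.
        task_model = copy.deepcopy(model)
        opt = torch.optim.SGD(task_model.parameters(), lr=inner_lr)
        for _ in range(inner_steps):
            images, masks = sample_task_batch()   # one few-shot support batch
            loss = loss_fn(task_model(images), masks)
            opt.zero_grad()
            loss.backward()
            opt.step()

        # First-order outer update: move the shared initialization toward the
        # task-adapted weights, theta <- theta + meta_lr * (phi - theta).
        with torch.no_grad():
            for p, p_adapted in zip(model.parameters(), task_model.parameters()):
                p.add_(meta_lr * (p_adapted - p))

FOMAML differs mainly in the outer step: instead of moving toward the adapted parameters, it applies the gradient of the task loss evaluated at the adapted parameters back to the shared initialization.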
