

Poster in Workshop: NeurIPS 2022 Workshop on Meta-Learning

Efficient Queries Transformer Neural Processes

Leo Feng · Hossein Hajimirsadeghi · Yoshua Bengio · Mohamed Osama Ahmed


Abstract:

Neural Processes (NPs) are popular meta-learning methods that estimate predictive uncertainty on target datapoints by conditioning on a context dataset. The previous state-of-the-art method, Transformer Neural Processes (TNPs), achieves strong performance but requires computation quadratic in the number of context datapoints per query, limiting its applications. Conversely, existing sub-quadratic NP variants perform significantly worse than TNPs. To tackle this issue, we propose Efficient Queries Transformer Neural Processes (EQTNPs), a more computationally efficient NP variant. The model encodes the context dataset into a set of vectors whose size is linear in the number of context datapoints. When making predictions, the model retrieves higher-order information from the context dataset via multiple cross-attention mechanisms over the context vectors. We empirically show that EQTNPs achieve results competitive with the state-of-the-art.
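The mechanism the abstract describes (encode the context once into a set of vectors, then answer queries by cross-attending to those vectors) can be sketched in a few lines of PyTorch. This is a minimal illustration, not the authors' implementation: the class name EQTNPSketch, the layer sizes, the single self-attention encoder layer, the residual connections, and the Gaussian output head are all assumptions made for demonstration.

# A minimal, illustrative sketch of the query mechanism described in the
# abstract, NOT the authors' implementation. All architectural details
# (names, sizes, layer counts) are assumptions for demonstration only.
import torch
import torch.nn as nn


class EQTNPSketch(nn.Module):
    """Encodes the context set into one vector per context point, then
    answers target queries with stacked cross-attention over those vectors."""

    def __init__(self, x_dim: int, y_dim: int, d_model: int = 64,
                 n_heads: int = 4, n_cross_layers: int = 2):
        super().__init__()
        # Embed (x, y) context pairs and x-only targets into a shared space.
        self.context_embed = nn.Linear(x_dim + y_dim, d_model)
        self.target_embed = nn.Linear(x_dim, d_model)
        # Self-attention over the context produces the context vectors;
        # the number of vectors is linear in the number of context points,
        # and this encoding is computed once, independent of the queries.
        self.context_encoder = nn.MultiheadAttention(d_model, n_heads,
                                                     batch_first=True)
        # Multiple cross-attention layers let each query retrieve
        # higher-order information from the encoded context.
        self.cross_layers = nn.ModuleList(
            nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            for _ in range(n_cross_layers)
        )
        # Predict a Gaussian (mean and log-variance) per target point.
        self.head = nn.Linear(d_model, 2 * y_dim)

    def forward(self, x_context, y_context, x_target):
        ctx = self.context_embed(torch.cat([x_context, y_context], dim=-1))
        ctx, _ = self.context_encoder(ctx, ctx, ctx)  # context vectors
        q = self.target_embed(x_target)
        for layer in self.cross_layers:
            attended, _ = layer(q, ctx, ctx)  # queries attend to context
            q = q + attended                  # residual connection
        mean, log_var = self.head(q).chunk(2, dim=-1)
        return mean, log_var


# Usage: batch of 2 tasks, 16 context points, 5 target points, 1-D data.
model = EQTNPSketch(x_dim=1, y_dim=1)
xc, yc, xt = torch.randn(2, 16, 1), torch.randn(2, 16, 1), torch.randn(2, 5, 1)
mean, log_var = model(xc, yc, xt)
print(mean.shape, log_var.shape)  # torch.Size([2, 5, 1]) for each

Note that in this sketch each query only cross-attends to the fixed set of context vectors, so the per-query cost grows linearly with the number of context datapoints, in contrast to TNPs, which process queries jointly with the full context at quadratic cost per query.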
