
Neural Relational Inference with Fast Modular Meta-learning
Ferran Alet · Erica Weng · Tomás Lozano-Pérez · Leslie Kaelbling

Wed Dec 11 05:00 PM -- 07:00 PM (PST) @ East Exhibition Hall B + C #32

Graph neural networks (GNNs) are effective models for many dynamical systems consisting of entities and relations. Although most GNN applications assume a single type of entity and relation, many situations involve multiple types of interactions. Relational inference is the problem of inferring these interactions and learning the dynamics from observational data. We frame relational inference as a modular meta-learning problem, where neural modules are trained to be composed in different ways to solve many tasks. This meta-learning framework allows us to implicitly encode time invariance and infer relations in the context of one another rather than independently, which increases inference capacity. Framing inference as the inner-loop optimization of meta-learning leads to a model-based approach that is more data-efficient and capable of estimating the state of entities that we do not observe directly, but whose existence can be inferred from their effect on observed entities. To address the large search space of graph neural network compositions, we meta-learn a proposal function that speeds up the inner-loop simulated annealing search within the modular meta-learning algorithm, providing a two-orders-of-magnitude increase in the size of problems that can be addressed.
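The inner-loop search the abstract describes can be illustrated with a toy sketch (not the authors' implementation): each directed edge between entities is assigned one discrete module type, and simulated annealing searches over assignments to minimize a task loss. Here the entity count, the two edge types, and the mismatch-counting `loss` (a stand-in for the prediction error of the composed graph neural network) are all illustrative assumptions.

```python
import math
import random

random.seed(0)

# Hypothetical setup: 4 entities; every directed edge gets one of two
# module types (0 or 1). A hidden "true" assignment plays the role of
# the unknown relational structure.
N = 4
EDGES = [(i, j) for i in range(N) for j in range(N) if i != j]
TRUE = {e: random.randint(0, 1) for e in EDGES}  # hidden ground truth

def loss(assignment):
    """Stand-in for the dynamics-prediction error of the composed modules."""
    return sum(assignment[e] != TRUE[e] for e in EDGES)

def anneal(steps=2000, t0=2.0, t1=0.01):
    """Simulated annealing over discrete edge-type assignments."""
    current = {e: random.randint(0, 1) for e in EDGES}
    cur_loss = loss(current)
    best, best_loss = dict(current), cur_loss
    for k in range(steps):
        t = t0 * (t1 / t0) ** (k / steps)  # geometric cooling schedule
        e = random.choice(EDGES)           # propose flipping one edge's type
        cand = dict(current)
        cand[e] = 1 - cand[e]
        cand_loss = loss(cand)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if cand_loss <= cur_loss or random.random() < math.exp((cur_loss - cand_loss) / t):
            current, cur_loss = cand, cand_loss
            if cur_loss < best_loss:
                best, best_loss = dict(current), cur_loss
    return best, best_loss

best, best_loss = anneal()
print(best_loss)
```

In the paper's setting, the uniform `random.choice` proposal is the part that a meta-learned proposal function would replace, steering the search toward promising edge relabelings and making much larger composition spaces tractable.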

Author Information

Ferran Alet (MIT)
Erica Weng (MIT)
Tomás Lozano-Pérez (MIT)
Leslie Kaelbling (MIT)
