
ReFactor GNNs: Revisiting Factorisation-based Models from a Message-Passing Perspective
Yihong Chen · Pushkar Mishra · Luca Franceschi · Pasquale Minervini · Pontus Lars Erik Saito Stenetorp · Sebastian Riedel

Wed Nov 30 02:00 PM -- 04:00 PM (PST) @ Hall J #914
Factorisation-based Models (FMs), such as DistMult, have enjoyed enduring success for Knowledge Graph Completion (KGC) tasks, often outperforming Graph Neural Networks (GNNs). However, unlike GNNs, FMs struggle to incorporate node features and generalise to unseen nodes in inductive settings. Our work bridges the gap between FMs and GNNs by proposing ReFactor GNNs. This new architecture draws upon both modelling paradigms, which previously were largely thought of as disjoint. Concretely, using a message-passing formalism, we show how FMs can be cast as GNNs by reformulating the gradient descent procedure as message-passing operations, which forms the basis of our ReFactor GNNs. Across a multitude of well-established KGC benchmarks, our ReFactor GNNs achieve comparable transductive performance to FMs, and state-of-the-art inductive performance while using an order of magnitude fewer parameters.
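The abstract's central observation can be illustrated with a minimal sketch (an assumption-laden toy, not the authors' implementation): for DistMult, the score of a triple (h, r, t) is the trilinear product of the head, relation, and tail embeddings, so the gradient of that score with respect to the head embedding is simply the element-wise product of the relation and tail embeddings. A gradient step on the node embeddings therefore looks exactly like a GNN layer that aggregates one "message" per incident edge. All variable names below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_rels, dim = 4, 2, 8
E = rng.normal(size=(n_nodes, dim))        # node embeddings
W = rng.normal(size=(n_rels, dim))         # DistMult (diagonal) relation embeddings
edges = [(0, 0, 1), (1, 1, 2), (2, 0, 3)]  # observed (head, relation, tail) triples
lr = 0.1

def distmult_score(h, r, t):
    # s(h, r, t) = sum_i E[h,i] * W[r,i] * E[t,i]
    return float(np.sum(E[h] * W[r] * E[t]))

# One gradient-ascent step on the scores of observed triples, written as
# message passing: d s / d E[h] = W[r] * E[t], and symmetrically for the tail.
messages = np.zeros_like(E)
for h, r, t in edges:
    messages[h] += W[r] * E[t]   # message to the head node from the tail
    messages[t] += W[r] * E[h]   # message to the tail node from the head
E_new = E + lr * messages        # aggregate-and-update, GNN style
```

This only sketches the direction of the correspondence; the paper's full construction also accounts for the negative-sampling/regularisation terms of the FM training objective within the message-passing formalism.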

Author Information

Yihong Chen (University College London, Meta AI)

I connect things using relational learning, language models and imagination.

Pushkar Mishra (Facebook AI)
Luca Franceschi (Amazon Development Center Germany)
Pasquale Minervini (University College London)
Pontus Lars Erik Saito Stenetorp (University of Tokyo)
Sebastian Riedel (DeepMind / UCL)
