
Inductive Logical Query Answering in Knowledge Graphs
Michael Galkin · Zhaocheng Zhu · Hongyu Ren · Jian Tang

Thu Dec 01 02:00 PM -- 04:00 PM (PST) @ Hall J #232

Formulating and answering logical queries is a standard communication interface for knowledge graphs (KGs). Alleviating the notorious incompleteness of real-world KGs, neural methods have achieved impressive results in link prediction and complex query answering tasks by learning representations of entities, relations, and queries. Still, most existing query answering methods rely on transductive entity embeddings and cannot generalize to KGs containing new entities without retraining those embeddings. In this work, we study the inductive query answering task, where inference is performed on a graph containing new entities, with queries over both seen and unseen entities. To this end, we devise two mechanisms leveraging inductive node and relational structure representations powered by graph neural networks (GNNs). Experimentally, we show that inductive models are able to perform logical reasoning at inference time over unseen nodes, generalizing to graphs up to 500% larger than the training ones. Exploring the efficiency–effectiveness trade-off, we find that the inductive relational structure representation method generally achieves higher performance, while the inductive node representation method is able to answer complex queries in the inference-only regime without any training on queries, and scales to graphs of millions of nodes. Code is available at https://github.com/DeepGraphLearning/InductiveQE.
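To illustrate the key idea behind the inductive node representation mechanism, the following is a minimal sketch (hypothetical, not the paper's actual models): a transductive approach stores one embedding per training entity and breaks on new entities, whereas an encoder that computes node states from the relational structure alone, with only the relation vectors learned, can run unchanged on an inference graph with unseen entities. All names and the message function below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, NUM_RELATIONS = 8, 3
# Learned relation vectors are shared across graphs, so they transfer
# to inference-time graphs with entirely new entities (illustrative).
relation_emb = rng.normal(size=(NUM_RELATIONS, DIM))

def inductive_encode(num_nodes, edges, num_layers=2):
    """Compute node states purely from relational structure (no entity table).

    edges: list of (head, relation, tail) triples.
    """
    h = np.ones((num_nodes, DIM))          # structure-only initialization
    for _ in range(num_layers):
        msg = np.zeros_like(h)
        deg = np.ones(num_nodes)           # +1 smoothing avoids division by zero
        for head, rel, tail in edges:      # message: neighbor state * relation vector
            msg[tail] += h[head] * relation_emb[rel]
            deg[tail] += 1
        h = np.tanh(msg / deg[:, None])    # mean-aggregate, then nonlinearity
    return h

# Training-time graph with 4 entities; inference-time graph with 6 new entities.
# The same encoder and relation vectors handle the larger, unseen graph.
train_h = inductive_encode(4, [(0, 0, 1), (1, 1, 2), (2, 2, 3)])
test_h = inductive_encode(6, [(0, 0, 1), (1, 1, 2), (3, 2, 4), (4, 0, 5)])
print(train_h.shape, test_h.shape)
```

A transductive model, by contrast, would index a fixed `(num_train_entities, DIM)` table, so entity 5 above would have no representation at all without retraining.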

Author Information

Michael Galkin (Mila, McGill University)
Zhaocheng Zhu (Mila - Quebec AI Institute)
Hongyu Ren (Stanford University)
Jian Tang (Mila)
