Poster

Inductive Logical Query Answering in Knowledge Graphs

Michael Galkin · Zhaocheng Zhu · Hongyu Ren · Jian Tang

Hall J (level 1) #232

Keywords: [ inductive representation learning ] [ inductive graph reasoning ] [ logical queries ] [ complex query answering ] [ graph neural networks ] [ knowledge graphs ]


Abstract:

Formulating and answering logical queries is a standard communication interface for knowledge graphs (KGs). Alleviating the notorious incompleteness of real-world KGs, neural methods have achieved impressive results in link prediction and complex query answering tasks by learning representations of entities, relations, and queries. Still, most existing query answering methods rely on transductive entity embeddings and cannot generalize to KGs containing new entities without retraining those embeddings. In this work, we study the inductive query answering task, where inference is performed on a graph containing new entities and queries range over both seen and unseen entities. To this end, we devise two mechanisms leveraging inductive node and relational structure representations powered by graph neural networks (GNNs). Experimentally, we show that inductive models are able to perform logical reasoning at inference time over unseen nodes, generalizing to graphs up to 500% larger than the training ones. Exploring the efficiency-effectiveness trade-off, we find that the inductive relational structure representation method generally achieves higher performance, while the inductive node representation method is able to answer complex queries in the inference-only regime without any training on queries and to scale to graphs of millions of nodes. Code is available at https://github.com/DeepGraphLearning/InductiveQE
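For intuition on why relational structure representations transfer to unseen entities, below is a minimal sketch in the spirit of labeling-trick relational GNNs (e.g., NBFNet-style initialization). It is an illustrative assumption, not the paper's actual architecture: the class name, layer count, and scoring head are invented for this example. The key property it demonstrates is that only relations carry trainable embeddings, so the trained weights apply unchanged to an inference graph with entities never seen during training.

```python
import torch
import torch.nn as nn


class InductiveRelEncoder(nn.Module):
    """Entity-free relational message passing (illustrative sketch): only
    relations have trainable embeddings, so any entity -- including one
    unseen at training time -- is represented from graph structure alone."""

    def __init__(self, num_relations: int, dim: int, num_layers: int = 2):
        super().__init__()
        self.rel_emb = nn.Embedding(num_relations, dim)  # relation vocabulary only
        self.layers = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_layers))
        self.readout = nn.Linear(dim, 1)                 # per-node answer score

    def forward(self, edge_index, edge_type, num_nodes, source, query_rel):
        # Labeling-trick initialization: the query's source node starts from
        # the query relation embedding; every other node starts at zero.
        h = torch.zeros(num_nodes, self.rel_emb.embedding_dim)
        h[source] = self.rel_emb(query_rel)
        heads, tails = edge_index
        for layer in self.layers:
            # Compose each head state with its edge's relation embedding,
            # then sum-aggregate incoming messages at the tail nodes.
            msg = h[heads] + self.rel_emb(edge_type)
            agg = torch.zeros_like(h).index_add(0, tails, msg)
            h = torch.relu(layer(agg))
        return self.readout(h).squeeze(-1)               # (num_nodes,) answer scores


# The same trained weights run on an inference graph of arbitrary size:
enc = InductiveRelEncoder(num_relations=5, dim=32)
edge_index = torch.randint(0, 100, (2, 400))             # 100 nodes, 400 edges
edge_type = torch.randint(0, 5, (400,))
scores = enc(edge_index, edge_type, num_nodes=100,
             source=torch.tensor(0), query_rel=torch.tensor(2))
print(scores.shape)  # torch.Size([100])
```

Because no entity embedding table exists, nothing in the model is tied to the training graph's node identities; this is the property that lets such encoders answer queries over graphs much larger than the ones they were trained on.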
