

Poster in Workshop: New Frontiers in Graph Learning (GLFrontiers)

Subgraphormer: Subgraph GNNs meet Graph Transformers

Guy Bar Shalom · Beatrice Bevilacqua · Haggai Maron

Keywords: [ transformers ] [ graph neural networks ] [ graph transformers ] [ subgraphs ]


Abstract:

In the realm of Graph Neural Networks (GNNs), two intriguing research directions have recently emerged: Subgraph GNNs and Graph Transformers. These approaches have distinct origins: Subgraph GNNs aim to address the limitations of message passing, while Graph Transformers seek to build on the success of sequential transformers in language and vision tasks. In this paper, we propose Subgraphormer, a model that integrates both approaches: it combines the message-passing and global aggregation schemes of Subgraph GNNs with attention mechanisms and positional and structural encodings, arguably the most important components of Graph Transformers. Our preliminary experimental results demonstrate significant performance improvements over both Subgraph GNNs and Graph Transformers.
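To make the combination concrete, here is a minimal sketch of what a Subgraphormer-style layer could look like, assuming the node-based subgraph policy common in Subgraph GNNs (each node induces one subgraph, so a graph with n nodes is represented as an n x n grid of subgraph-node features). All module and tensor names (`SubgraphormerLayerSketch`, `local_mp`, `global_agg`) are hypothetical illustrations of the three components named in the abstract, not the authors' implementation; positional and structural encodings are assumed to be added to the input features beforehand.

```python
# Hypothetical sketch of a Subgraphormer-style layer; not the paper's code.
# X[s, u] holds the features of node u inside subgraph s (shape n x n x dim),
# e.g. node features plus a node-marking and positional/structural encodings.
import torch
import torch.nn as nn


class SubgraphormerLayerSketch(nn.Module):
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.local_mp = nn.Linear(dim, dim)    # message passing within a subgraph
        self.global_agg = nn.Linear(dim, dim)  # global aggregation across subgraphs
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.out = nn.Linear(3 * dim, dim)

    def forward(self, X: torch.Tensor, A: torch.Tensor) -> torch.Tensor:
        # X: (n, n, dim) subgraph-node grid; A: (n, n) adjacency matrix.
        n, _, _ = X.shape
        # 1) Subgraph-GNN component: message passing inside each subgraph,
        #    aggregating each node's neighbors (sum over v of A[u, v] * W x[s, v]).
        local = torch.einsum("uv,svd->sud", A, self.local_mp(X))
        # 2) Subgraph-GNN component: global aggregation, pooling each node's
        #    representation across all subgraphs s.
        pooled = self.global_agg(X.mean(dim=0, keepdim=True)).expand(n, -1, -1)
        # 3) Graph-Transformer component: attention, here letting each node
        #    attend over its copies in all subgraphs (node axis as batch).
        Xt = X.transpose(0, 1)                 # (n nodes, n subgraphs, dim)
        attn_out, _ = self.attn(Xt, Xt, Xt)
        attn_out = attn_out.transpose(0, 1)    # back to (subgraph, node, dim)
        return self.out(torch.cat([local, pooled, attn_out], dim=-1))


if __name__ == "__main__":
    n, d = 6, 32
    A = (torch.rand(n, n) < 0.3).float()
    A = ((A + A.T) > 0).float()                # symmetrize: undirected graph
    X = torch.randn(n, n, d)                   # stand-in for encoded features
    layer = SubgraphormerLayerSketch(d)
    print(layer(X, A).shape)                   # torch.Size([6, 6, 32])
```

In this sketch the three branches are simply concatenated and mixed by a linear layer; how the actual model composes message passing, global aggregation, and attention is specified in the paper itself.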
