Poster
Extending the Design Space of Graph Neural Networks by Rethinking Folklore Weisfeiler-Lehman
Jiarui Feng · Lecheng Kong · Hao Liu · Dacheng Tao · Fuhai Li · Muhan Zhang · Yixin Chen
Great Hall & Hall B1+B2 (level 1) #631
Abstract:
Message passing neural networks (MPNNs) have emerged as the most popular framework of graph neural networks (GNNs) in recent years. However, their expressive power is limited by the 1-dimensional Weisfeiler-Lehman (1-WL) test. Some works are inspired by k-WL/FWL (Folklore WL) and design the corresponding neural versions. Despite the high expressive power, there are serious limitations in this line of research. In particular, (1) k-WL/FWL requires at least O(n^k) space complexity, which is impractical for large graphs even when k=3; (2) the design space of k-WL/FWL is rigid, with the only adjustable hyper-parameter being k. To tackle the first limitation, we propose an extension, (k,t)-FWL. We theoretically prove that even if we fix the space complexity to O(n^k) (for any k ≥ 2) in (k,t)-FWL, we can construct an expressiveness hierarchy up to solving the graph isomorphism problem. To tackle the second problem, we propose k-FWL+, which considers any equivariant set as neighbors instead of all nodes, thereby greatly expanding the design space of k-FWL. Combining these two modifications results in a flexible and powerful framework, (k,t)-FWL+. We demonstrate that (k,t)-FWL+ can implement most existing models with matching expressiveness. We then introduce an instance of (k,t)-FWL+ called Neighborhood^2-FWL (N^2-FWL), which is practically and theoretically sound. We prove that N^2-FWL is no less powerful than 3-WL and can encode many substructures while only requiring O(n^2) space. Finally, we design its neural version, named **N^2-GNN**, and evaluate its performance on various tasks. N^2-GNN achieves record-breaking results on ZINC-Subset (**0.059**), outperforming previous SOTA results by 10.6%. Moreover, N^2-GNN achieves new SOTA results on the BREC dataset (**71.8%**) among all existing high-expressive GNN methods.
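To make the 1-WL test that bounds MPNN expressiveness concrete, here is a minimal self-contained sketch of 1-WL color refinement. This is the classical algorithm only, not the paper's (k,t)-FWL+ framework; the adjacency-dict representation and function name are our own illustrative choices:

```python
from collections import Counter

def wl_1(adj, num_iters=None):
    """1-dimensional Weisfeiler-Lehman (1-WL) color refinement.

    adj: dict mapping each node to a list of its neighbors.
    Returns the multiset (Counter) of stable node colors; two graphs
    with different color histograms are certainly non-isomorphic.
    """
    # All nodes start with the same color.
    colors = {v: 0 for v in adj}
    for _ in range(num_iters or len(adj)):
        # A node's new color combines its color with the multiset
        # of its neighbors' colors.
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        # Re-index signatures to small integers for the next round.
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        new_colors = {v: palette[signatures[v]] for v in adj}
        if new_colors == colors:  # stable partition reached
            break
        colors = new_colors
    return Counter(colors.values())

# 1-WL cannot distinguish a 6-cycle from two disjoint triangles
# (both are 2-regular on 6 nodes), illustrating its limits:
c6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
two_c3 = {0: [1, 2], 1: [0, 2], 2: [0, 1],
          3: [4, 5], 4: [3, 5], 5: [3, 4]}
assert wl_1(c6) == wl_1(two_c3)  # identical histograms: 1-WL fails here
```

The k-WL/FWL hierarchy refines colors over k-tuples of nodes instead of single nodes, which is what drives its O(n^k) space cost noted in the abstract.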