Complete the Missing Half: Augmenting Aggregation Filtering with Diversification for Graph Convolutional Networks
Sitao Luan · Mingde Zhao · Chenqing Hua · Xiao-Wen Chang · Doina Precup
Event URL: https://openreview.net/forum?id=uX8toL3-Qqh
The core operation of current Graph Neural Networks (GNNs) is aggregation, enabled by the graph Laplacian or message passing, which filters neighborhood node information. Though effective for various tasks, in this paper we show that this operation is a potentially problematic factor underlying all GNN methods when learning on certain datasets, as it forces node representations to become similar, so that nodes gradually lose their identity and become indistinguishable. Hence, we augment the aggregation operations with their dual, i.e., diversification operators that make nodes more distinct and preserve their identity. This augmentation replaces aggregation with a two-channel filtering process that, in theory, is beneficial for enriching the node representations. In practice, the proposed two-channel filters can be easily patched onto existing GNN methods with diverse training strategies, including spectral and spatial (message passing) methods. In the experiments, we observe the desired characteristics of the models and significant performance boosts over the baselines on $9$ node classification tasks.
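The sketch below is a minimal, illustrative PyTorch layer showing the general idea of pairing an aggregation (low-pass) channel with a diversification (high-pass) channel; it is not the authors' released implementation, and names such as `TwoChannelGCNLayer`, the choice of `I - A_hat` as the diversification operator, and the learnable mixing coefficient `alpha` are assumptions made for illustration.

```python
# Minimal sketch of a two-channel GCN layer: one channel aggregates with the
# normalized adjacency (low-pass), the other diversifies with its complement
# I - A_hat (high-pass). The mixing scheme and names are illustrative only.

import torch
import torch.nn as nn


class TwoChannelGCNLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.w_agg = nn.Linear(in_dim, out_dim)        # weights for the aggregation channel
        self.w_div = nn.Linear(in_dim, out_dim)        # weights for the diversification channel
        self.alpha = nn.Parameter(torch.tensor(0.5))   # learnable mixing coefficient (assumption)

    def forward(self, x, adj_norm):
        # adj_norm: symmetrically normalized adjacency with self-loops, shape (N, N)
        agg = adj_norm @ x            # low-pass: smooth node features over neighbors
        div = x - adj_norm @ x        # high-pass: emphasize differences from neighbors
        out = self.alpha * self.w_agg(agg) + (1 - self.alpha) * self.w_div(div)
        return torch.relu(out)


# Usage on a toy 4-node graph
N, F_in, F_out = 4, 3, 8
x = torch.randn(N, F_in)
adj = torch.tensor([[0, 1, 0, 1],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [1, 0, 1, 0]], dtype=torch.float)
adj = adj + torch.eye(N)                               # add self-loops
deg_inv_sqrt = adj.sum(1).pow(-0.5)
adj_norm = deg_inv_sqrt[:, None] * adj * deg_inv_sqrt[None, :]

layer = TwoChannelGCNLayer(F_in, F_out)
print(layer(x, adj_norm).shape)                        # torch.Size([4, 8])
```

In this sketch the aggregation channel smooths each node toward its neighbors, while the diversification channel retains the component of a node's features that differs from its neighborhood, so combining the two channels can keep node representations from collapsing into indistinguishability.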

Author Information

Sitao Luan (McGill University, Mila)

I'm a second-year Ph.D. student working with Professor Doina Precup and Professor Xiao-Wen Chang at the intersection of reinforcement learning and matrix computations. I'm interested in approximate dynamic programming and Krylov subspace methods, and I'm currently working on constructing basis functions for value function approximation in model-based reinforcement learning.

Mingde Zhao (McGill University)
Chenqing Hua (McGill University)
Xiao-Wen Chang (McGill University)
Doina Precup (McGill University / Mila / DeepMind Montreal)
