Poster
OPEN: Orthogonal Propagation with Ego-Network Modeling
Liang Yang · Lina Kang · Qiuliang Zhang · Mengzhe Li · Bingxin Niu · Dongxiao He · Zhen Wang · Chuan Wang · Xiaochun Cao · Yuanfang Guo


To alleviate the adverse effect of noisy topology in Graph Neural Networks (GNNs), some efforts refine the local topology through pairwise propagation weight learning and multi-channel extensions. Unfortunately, most of them suffer from a common and critical drawback: the propagations to a single node, and the propagations across multiple channels, are treated as mutually irrelevant. These two kinds of irrelevance leave the multi-channel propagation weights free to be determined solely by the labeled data, and thus expose the GNNs to overfitting. To tackle this issue, a novel Orthogonal Propagation with Ego-Network modeling (OPEN) scheme is proposed, which explicitly models the relevance between propagations. Specifically, the relevance between propagations to a single node is captured by modeling the whole ego-network, while the relevance between propagations across channels is captured via a diversity requirement. By interpreting the propagations to a node from the perspective of dimensionality reduction, the propagation weights are inferred from the principal components of the ego-network, which are orthogonal to each other. Theoretical analysis and experimental evaluations reveal four attractive characteristics of OPEN: it models high-order relationships beyond pairwise ones, prevents overfitting, and is robust and efficient.
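The abstract describes the mechanism only in prose. The following is a minimal, hypothetical sketch of the idea it outlines: deriving mutually orthogonal, per-channel propagation weights from the principal components of each node's ego-network features. The function name, the use of projection scores as weights, and the normalization are assumptions made for illustration and are not taken from the authors' implementation.

import numpy as np

def open_propagation_sketch(X, adj, num_channels=4):
    # X:   (n, d) node feature matrix
    # adj: (n, n) symmetric binary adjacency matrix (no self-loops required)
    # Returns a (num_channels, n, d) array of channel-wise aggregated features.
    n, d = X.shape
    out = np.zeros((num_channels, n, d))
    for v in range(n):
        ego = np.append(np.flatnonzero(adj[v]), v)   # neighbours of v plus v itself
        E = X[ego]                                    # ego-network feature matrix
        # Principal components of the centred ego-network features; the right
        # singular vectors are mutually orthogonal, one direction per channel.
        _, _, Vt = np.linalg.svd(E - E.mean(axis=0, keepdims=True), full_matrices=False)
        for c in range(min(num_channels, Vt.shape[0])):
            w = E @ Vt[c]                             # projection of each ego member onto PC c
            # Use the projection scores as propagation weights for channel c
            # (an illustrative choice, not the paper's exact rule).
            out[c, v] = (w[:, None] * E).sum(axis=0) / (np.abs(w).sum() + 1e-8)
    return out

# Example usage on a small random graph (checks shapes only, not a real experiment).
X = np.random.randn(6, 8)
A = np.zeros((6, 6))
A[[0, 1, 2, 3, 4], [1, 2, 3, 4, 5]] = 1
A = A + A.T
H = open_propagation_sketch(X, A, num_channels=2)
print(H.shape)   # (2, 6, 8)

Because the principal components are orthogonal by construction, the per-channel weights cannot collapse onto one another, which is the property the abstract attributes to OPEN's multi-channel propagation.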

Author Information

Liang Yang (Hebei University of Technology)
Lina Kang (Hebei University of Technology)
Qiuliang Zhang (Hebei University of Technology)
Mengzhe Li (Hebei University of Technology)
Bingxin Niu (Hebei University of Technology)
Dongxiao He (Jilin University, China)
Zhen Wang (Northwestern Polytechnical University)
Chuan Wang (Institute of Information Engineering)
Xiaochun Cao (Sun Yat-sen University)
Yuanfang Guo (Beihang University)