Poster
Learning Conjoint Attentions for Graph Neural Nets
Tiantian He · Yew Soon Ong · L Bai

Wed Dec 08 12:30 AM -- 02:00 AM (PST) @ Virtual

In this paper, we present Conjoint Attentions (CAs), a class of novel learning-to-attend strategies for graph neural networks (GNNs). Besides the layer-wise node features propagated within the GNN, CAs can additionally incorporate structural interventions, such as node cluster embeddings and higher-order structural correlations learned outside the GNN, when computing attention scores. Node features regarded as significant by the conjoint criteria are therefore more likely to be propagated in the GNN. Building on these Conjoint Attention strategies, we then propose graph conjoint attention networks (CATs), which learn representations embedded with the latent features deemed significant by the Conjoint Attentions. We also theoretically validate the discriminative capacity of CATs. CATs utilizing the proposed Conjoint Attention strategies have been extensively tested on well-established benchmark datasets and comprehensively compared with state-of-the-art baselines. The notable performance obtained demonstrates the effectiveness of the proposed Conjoint Attentions.
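A minimal sketch of the general idea is given below, assuming a PyTorch setting: a GAT-style pairwise score computed from projected node features is combined with a second score derived from structural embeddings learned outside the GNN (e.g., node cluster memberships), and the mixed score is normalised over each node's neighbours. The class name ConjointAttentionLayer, the sigmoid-gated mixing weight, and all tensor shapes are illustrative assumptions, not the paper's exact formulation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ConjointAttentionLayer(nn.Module):
    # Illustrative sketch (not the authors' exact model): mixes a feature-based
    # attention score with a structure-based score before normalising over
    # each destination node's incoming edges.
    def __init__(self, in_dim, out_dim, struct_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)    # projects layer-wise node features
        self.a = nn.Linear(2 * out_dim, 1, bias=False)     # GAT-style pairwise feature score
        self.b = nn.Linear(2 * struct_dim, 1, bias=False)  # score from external structural embeddings
        self.mix = nn.Parameter(torch.tensor(0.5))         # learnable trade-off between the two criteria

    def forward(self, x, s, edge_index):
        # x: [N, in_dim] node features; s: [N, struct_dim] structural embeddings
        # learned outside the GNN (e.g. cluster memberships);
        # edge_index: [2, E] directed edges (source row, destination row).
        h = self.W(x)
        src, dst = edge_index
        e_feat = F.leaky_relu(self.a(torch.cat([h[src], h[dst]], dim=-1))).squeeze(-1)
        e_struct = F.leaky_relu(self.b(torch.cat([s[src], s[dst]], dim=-1))).squeeze(-1)
        lam = torch.sigmoid(self.mix)
        e = lam * e_feat + (1.0 - lam) * e_struct          # conjoint attention score per edge

        # softmax of the conjoint scores over each node's incoming edges
        alpha = torch.exp(e - e.max())
        denom = torch.zeros(x.size(0), device=x.device).index_add_(0, dst, alpha)
        alpha = alpha / (denom[dst] + 1e-16)

        # attention-weighted aggregation of neighbour messages
        out = torch.zeros_like(h).index_add_(0, dst, alpha.unsqueeze(-1) * h[src])
        return out

# Toy usage: 4 nodes, 3 directed edges.
x = torch.randn(4, 8)                                      # node features
s = torch.randn(4, 3)                                      # external structural embeddings
edge_index = torch.tensor([[0, 1, 2], [1, 2, 3]])
layer = ConjointAttentionLayer(8, 16, 3)
print(layer(x, s, edge_index).shape)                       # torch.Size([4, 16])

The paper's actual CATs may combine and normalise the two criteria differently; the sketch only illustrates letting structure-derived scores intervene in the attention computation alongside the propagated node features.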

Author Information

Tiantian He (Agency for Science, Technology and Research (A*STAR))

Tiantian He is currently a Research Scientist at IHPC, Agency for Science, Technology and Research (A*STAR). His research interests include AI, Computational Intelligence, Data-Centric Transfer Optimization, and Data Mining.

Yew Soon Ong (Nanyang Technological University)
L Bai (Nanyang Technological University)
