Human brains are commonly modeled as networks of Regions of Interest (ROIs) and their connections for understanding brain functions and mental disorders. Recently, Transformer-based models have been studied over many types of data, including graphs, and have been shown to bring broad performance gains. In this work, we study Transformer-based models for brain network analysis. Driven by the unique properties of the data, we model brain networks as graphs with nodes of fixed size and order, which allows us to (1) use connection profiles as node features, providing natural and low-cost positional information, and (2) learn pair-wise connection strengths among ROIs with efficient attention weights that are shared across individuals and predictive for downstream analysis tasks. Moreover, we propose an Orthonormal Clustering Readout operation based on self-supervised soft clustering and orthonormal projection. This design accounts for the underlying functional modules that determine similar behaviors among groups of ROIs, leading to distinguishable cluster-aware node embeddings and informative graph embeddings. Finally, we re-standardize the evaluation pipeline on ABIDE, the only publicly available large-scale brain network dataset, to enable meaningful comparisons across models. Experimental results show clear improvements of our proposed Brain Network Transformer on both the public ABIDE dataset and our restricted ABCD dataset. The implementation is available at https://github.com/Wayfear/BrainNetworkTransformer.
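The readout idea described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation (see the linked repository for that); it only shows the general pattern of soft-clustering node embeddings against orthonormal cluster centers and pooling per cluster. The function name, the QR-based initialization of the centers, and the mean-style pooling are all assumptions for illustration.

```python
import numpy as np

def orthonormal_clustering_readout(node_emb, num_clusters, seed=0):
    """Illustrative sketch (not the paper's implementation):
    soft-cluster ROI embeddings against orthonormal cluster
    centers, then pool node embeddings per cluster.

    node_emb: (N, D) array of ROI (node) embeddings, with D >= num_clusters.
    Returns a flattened (num_clusters * D,) graph embedding.
    """
    rng = np.random.default_rng(seed)
    # Orthonormal cluster centers via QR decomposition of a random matrix;
    # rows of `centers` are mutually orthogonal unit vectors in R^D.
    q, _ = np.linalg.qr(rng.standard_normal((node_emb.shape[1], num_clusters)))
    centers = q.T  # (K, D)

    # Soft assignment of each node to each cluster: softmax over
    # similarities between node embeddings and cluster centers.
    logits = node_emb @ centers.T                    # (N, K)
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    assign = np.exp(logits)
    assign /= assign.sum(axis=1, keepdims=True)      # rows sum to 1

    # Cluster-aware pooling: weighted average of node embeddings
    # per cluster, then flatten into a single graph embedding.
    pooled = assign.T @ node_emb / assign.sum(axis=0)[:, None]  # (K, D)
    return pooled.reshape(-1)
```

In the actual model the cluster centers and assignments would be learned end-to-end with a self-supervised clustering objective; the fixed random orthonormal basis here only demonstrates the shape of the computation.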
Author Information
Xuan Kan
Wei Dai (Stanford University)
Hejie Cui (Emory University)
Zilong Zhang (University of International Business and Economics)
Ying Guo
Carl Yang (Emory University)
Related Events (a corresponding poster, oral, or spotlight)
- 2022 Poster: Brain Network Transformer
  Tue, Nov 29th, 05:00 -- 07:00 PM, Room Hall J #904
More from the Same Authors
- 2021 Spotlight: Subgraph Federated Learning with Missing Neighbor Generation
  Ke ZHANG · Carl Yang · Xiaoxiao Li · Lichao Sun · Siu Ming Yiu
- 2022: Shift-Robust Node Classification via Graph Clustering Co-training
  Qi Zhu · Chao Zhang · Chanyoung Park · Carl Yang · Jiawei Han
- 2023 Poster: Better with Less: A Data-Centric Perspective on Pre-Training Graph Neural Networks
  Jiarong Xu · Renhong Huang · XIN JIANG · Yuxuan Cao · Carl Yang · Chunping Wang · YANG YANG
- 2023 Poster: Open Visual Knowledge Extraction via Relation-Oriented Multimodality Model Prompting
  Hejie Cui · Xinyu Fang · Zihan Zhang · Ran Xu · Xuan Kan · Xin Liu · Manling Li · Yangqiu Song · Carl Yang
- 2023 Poster: WalkLM: A Uniform Language Model Fine-tuning Framework for Attributed Graph Embedding
  Yanchao Tan · Zihao Zhou · Hang Lv · Weiming Liu · Carl Yang
- 2022 Spotlight: Lightning Talks 2A-3
  David Buterez · Chengan He · Xuan Kan · Yutong Lin · Konstantin Schürholt · Yu Yang · Louis Annabi · Wei Dai · Xiaotian Cheng · Alexandre Pitti · Ze Liu · Jon Paul Janet · Jun Saito · Boris Knyazev · Mathias Quoy · Zheng Zhang · James Zachary · Steven J Kiddle · Xavier Giro-i-Nieto · Chang Liu · Hejie Cui · Zilong Zhang · Hakan Bilen · Damian Borth · Dino Oglic · Holly Rushmeier · Han Hu · Xiangyang Ji · Yi Zhou · Nanning Zheng · Ying Guo · Pietro Liò · Stephen Lin · Carl Yang · Yue Cao
- 2021 Poster: Subgraph Federated Learning with Missing Neighbor Generation
  Ke ZHANG · Carl Yang · Xiaoxiao Li · Lichao Sun · Siu Ming Yiu
- 2021 Poster: Federated Graph Classification over Non-IID Graphs
  Han Xie · Jing Ma · Li Xiong · Carl Yang
- 2021 Poster: Exploiting Data Sparsity in Secure Cross-Platform Social Recommendation
  Jinming Cui · Chaochao Chen · Lingjuan Lyu · Carl Yang · Wang Li
- 2021 Poster: Transfer Learning of Graph Neural Networks with Ego-graph Information Maximization
  Qi Zhu · Carl Yang · Yidan Xu · Haonan Wang · Chao Zhang · Jiawei Han