Graph Contrastive Learning (GCL) establishes a new paradigm for learning graph representations without human annotations. Although remarkable progress has been made recently, the reasons behind the success of GCL remain somewhat mysterious. In this work, we first identify several critical design considerations within a general GCL paradigm, including augmentation functions, contrasting modes, contrastive objectives, and negative mining strategies. Then, to understand the interplay of different GCL components, we conduct comprehensive, controlled experiments over benchmark tasks on datasets across various domains. Our empirical studies suggest a set of general recipes for effective GCL, e.g., simple topology augmentations that produce sparse graph views bring promising performance improvements, and contrasting modes should be aligned with the granularities of end tasks. In addition, to foster future research and ease the implementation of GCL algorithms, we develop an easy-to-use library, PyGCL, featuring modularized CL components, standardized evaluation, and experiment management. We envision this work providing useful empirical evidence of effective GCL algorithms and offering several insights for future research.
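To make the "simple topology augmentations that produce sparse graph views" recipe concrete, below is a minimal, self-contained sketch of random edge dropping, one of the simplest such augmentations. The function name and edge-list representation here are illustrative only and are not the PyGCL API; two differently seeded calls yield the two contrasting views.

```python
import random

def drop_edges(edges, drop_prob=0.2, seed=0):
    """Randomly remove each edge with probability `drop_prob`,
    producing a sparser view of the input graph.

    `edges` is a list of (u, v) pairs; this is an illustrative
    sketch, not the PyGCL augmentor interface."""
    rng = random.Random(seed)
    return [e for e in edges if rng.random() >= drop_prob]

# Two stochastic views of the same small graph for contrastive learning.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
view1 = drop_edges(edges, drop_prob=0.2, seed=1)
view2 = drop_edges(edges, drop_prob=0.2, seed=2)
```

In a full GCL pipeline, a graph encoder would embed both views and a contrastive objective would pull matched node (or graph) embeddings together while pushing mismatched ones apart.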
Yanqiao Zhu (Institute of Automation, Chinese Academy of Sciences)
Yichen Xu (Beijing University of Posts and Telecommunications)
Qiang Liu (Institute of Automation, Chinese Academy of Sciences)
Shu Wu (Institute of Automation, Chinese Academy of Sciences)