

Poster in Workshop: AI for Science: Progress and Promises

Chemistry Guided Molecular Graph Transformer

Peisong Niu · Tian Zhou · Qingsong Wen · Liang Sun · Tao Yao

Keywords: [ multiscale ] [ distance-guided embedding ] [ molecular graph transformer ] [ chemistry ]


Abstract:

Classical methods for calculating molecular properties do not scale to large amounts of data. The Transformer architecture has achieved competitive performance on graph-level prediction by introducing general graph embeddings. However, the direct spatial encoding strategy ignores important inductive biases of molecular graphs, such as aromaticity and interatomic forces. In this paper, inspired by the intrinsic properties of chemical molecules, we propose a chemistry-guided molecular graph Transformer. Specifically, we propose a motif-based spatial embedding and a distance-guided multi-scale self-attention for the graph Transformer to predict molecular properties effectively. To evaluate the proposed methods, we conduct experiments on two large molecular property prediction datasets, ZINC and PCQM4M-LSC. The results show that our methods achieve superior performance compared to various state-of-the-art methods. Code is available at https://github.com/PSacfc/chemistry-graph-transformer .
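To make the idea of distance-guided multi-scale self-attention concrete, the sketch below shows one plausible way such a bias could be built: pairwise interatomic distances are expanded with Gaussian kernels of several widths (the "scales") and projected to a per-head additive attention bias. This is only an illustrative sketch; the class name, scale values, and kernel choice are assumptions and not taken from the authors' released code.

```python
# Hypothetical sketch: distance-guided multi-scale attention bias (not the authors' implementation).
import torch
import torch.nn as nn


class DistanceGuidedBias(nn.Module):
    """Maps pairwise interatomic distances to per-head attention biases
    using Gaussian kernels of several widths (multi-scale)."""

    def __init__(self, num_heads: int, scales=(0.5, 1.0, 2.0, 4.0)):
        super().__init__()
        # Assumed scale values; in practice these could be learned or tuned.
        self.register_buffer("scales", torch.tensor(scales))
        # Linear map from multi-scale kernel features to one bias value per attention head.
        self.proj = nn.Linear(len(scales), num_heads)

    def forward(self, dist: torch.Tensor) -> torch.Tensor:
        # dist: (batch, n_atoms, n_atoms) pairwise distance matrix
        feats = torch.exp(-(dist.unsqueeze(-1) ** 2) / (2 * self.scales ** 2))
        bias = self.proj(feats)               # (batch, n_atoms, n_atoms, num_heads)
        return bias.permute(0, 3, 1, 2)       # (batch, num_heads, n_atoms, n_atoms)


# Usage: the returned bias would be added to the scaled dot-product attention
# logits before the softmax, so nearby atoms can attend to each other more strongly.
bias_module = DistanceGuidedBias(num_heads=8)
dist = torch.rand(2, 16, 16)                  # toy distances for 16-atom molecules
attn_bias = bias_module(dist)                 # shape: (2, 8, 16, 16)
```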
