Supervised learning with deep models has tremendous potential for applications in materials science. Recently, graph neural networks have been used in this context, drawing direct inspiration from models for molecules. However, materials are typically much more structured than molecules, and these models do not leverage that structure. In this work, we introduce a class of models that are equivariant with respect to crystalline symmetry groups. We do this by defining a generalization of the message passing operations that can be used with more general permutation groups, or that can alternatively be seen as defining an expressive convolution operation on the crystal graph. Empirically, these models achieve results competitive with the state of the art on the Materials Project dataset.
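The abstract's idea of message passing that respects a symmetry group can be illustrated with a minimal sketch. This is not the paper's construction; it shows the generic symmetrization recipe, f_eq(x) = (1/|G|) Σ_g g⁻¹·f(g·x), applied to a toy message-passing layer, with the cyclic group C4 standing in for a crystal symmetry group. All function names and shapes here are illustrative assumptions.

```python
import numpy as np

def message_passing(features, adjacency, weight):
    # Plain message passing: aggregate neighbor features, then a linear map.
    return (adjacency @ features) @ weight

def equivariant_layer(features, adjacency, weight, group):
    # Symmetrize over the group (a Reynolds-operator average, not the
    # paper's method): permute the inputs by g, run the layer, undo the
    # permutation on the output, and average over all g in the group.
    out = np.zeros((features.shape[0], weight.shape[1]))
    for perm in group:
        inv = np.argsort(perm)                  # inverse permutation
        f_g = features[perm]                    # g acting on node features
        a_g = adjacency[np.ix_(perm, perm)]     # g acting on the graph
        out += message_passing(f_g, a_g, weight)[inv]
    return out / len(group)

# Example: the cyclic group C4 acting on a 4-node toy graph.
rng = np.random.default_rng(0)
group = [np.roll(np.arange(4), k) for k in range(4)]
x = rng.normal(size=(4, 3))
adj = rng.normal(size=(4, 4))
w = rng.normal(size=(3, 2))
g = group[1]

# Equivariance check: transforming the input permutes the output the same way.
lhs = equivariant_layer(x[g], adj[np.ix_(g, g)], w, group)
rhs = equivariant_layer(x, adj, w, group)[g]
```

Group averaging guarantees equivariance for any permutation group at |G| times the cost of one layer pass; the appeal of purpose-built equivariant operations, as described in the abstract, is to get the symmetry constraint without that overhead.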
Author Information
Oumar Kaba (Mila, McGill University)
Siamak Ravanbakhsh (McGill / MILA)
More from the Same Authors
-
2022 : Equivariance with Learned Canonical Mappings »
Oumar Kaba · Arnab Mondal · Yan Zhang · Yoshua Bengio · Siamak Ravanbakhsh -
2022 Poster: Structuring Representations Using Group Invariants »
Mehran Shakerinava · Arnab Kumar Mondal · Siamak Ravanbakhsh -
2021 Poster: Gradient Starvation: A Learning Proclivity in Neural Networks »
Mohammad Pezeshki · Oumar Kaba · Yoshua Bengio · Aaron Courville · Doina Precup · Guillaume Lajoie -
2015 Poster: Embedding Inference for Structured Multilabel Prediction »
Farzaneh Mirzazadeh · Siamak Ravanbakhsh · Nan Ding · Dale Schuurmans