Recent work has shown the potential of graph neural networks to efficiently predict material properties, enabling high-throughput screening of materials. Training these models, however, often requires large quantities of labelled data, obtained via costly methods such as ab initio calculations or experimental evaluation. By leveraging a series of material-specific transformations, we introduce CrystalCLR, a framework for contrastive learning of representations with crystal graph neural networks. With the addition of a novel loss function, our framework is able to learn representations competitive with engineered fingerprinting methods. We also demonstrate that, via model finetuning, contrastive pretraining can improve the performance of graph neural networks for prediction of material properties and significantly outperform traditional ML models that use engineered fingerprints. Lastly, we observe that CrystalCLR produces material representations that form clusters by compound class.
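The abstract does not detail the material-specific transformations or the novel loss function, but CrystalCLR follows the SimCLR family of contrastive methods. As a rough illustrative sketch only (not the paper's actual loss), a standard NT-Xent contrastive objective over paired graph embeddings could look like the following; the function name, temperature value, and batch conventions are assumptions for illustration.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1: torch.Tensor, z2: torch.Tensor,
                 temperature: float = 0.1) -> torch.Tensor:
    """NT-Xent (normalized temperature-scaled cross-entropy) loss.

    z1, z2: (batch, dim) embeddings of two augmented views of the same
    crystal graphs. Positive pairs are (z1[i], z2[i]); every other
    embedding in the batch serves as a negative.
    """
    batch_size = z1.shape[0]
    # L2-normalize and stack both views: rows 0..B-1 are view 1, B..2B-1 view 2.
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)      # (2B, dim)
    sim = z @ z.t() / temperature                           # cosine similarities
    # Exclude self-similarity from each row's softmax.
    sim.fill_diagonal_(float("-inf"))
    # Row i < B has its positive at column i + B, and vice versa.
    targets = torch.cat([torch.arange(batch_size) + batch_size,
                         torch.arange(batch_size)]).to(z.device)
    return F.cross_entropy(sim, targets)
```

In this sketch the two views would come from the paper's material-specific augmentations applied to the same crystal before encoding with the graph neural network.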
Author Information
Teddy Koker (MIT Lincoln Laboratory)
Keegan Quigley
Will Spaeth
Nathan Frey (Prescient Design • Genentech)
Lin Li (MIT Lincoln Laboratory)
More from the Same Authors
- 2021: Scalable Geometric Deep Learning on Molecular Graphs
  Nathan Frey · Siddharth Samsi · Lin Li · Connor Coley
- 2022: A Pareto-optimal compositional energy-based model for sampling and optimization of protein sequences
  Nataša Tagasovska · Nathan Frey · Andreas Loukas · Isidro Hotzel · Julien Lafrance-Vanasse · Ryan Kelly · Yan Wu · Arvind Rajpal · Richard Bonneau · Kyunghyun Cho · Stephen Ra · Vladimir Gligorijevic
- 2023 Poster: Encoding Time-Series Explanations through Self-Supervised Model Behavior Consistency
  Owen Queen · Thomas Hartvigsen · Teddy Koker · Huan He · Theodoros Tsiligkaridis · Marinka Zitnik
- 2023 Poster: Protein Design with Guided Discrete Diffusion
  Nate Gruver · Samuel Stanton · Nathan Frey · Tim G. J. Rudner · Isidro Hotzel · Julien Lafrance-Vanasse · Arvind Rajpal · Kyunghyun Cho · Andrew Wilson