Spotlight in Workshop: Generative AI and Biology (GenBio@NeurIPS2023)
Contributed talk | Binding Oracle: Fine-Tuning From Stability to Binding Free Energy
Chengyue Gong · Adam Klivans · Jordan Wells · James Loy · Qiang Liu · Alex Dimakis · Daniel Diaz
Keywords: [ binding free energy; sparse finetuning ]
[ Project Page ]
Presentation:
Generative AI and Biology (GenBio@NeurIPS2023)
Sat 16 Dec 6:15 a.m. PST — 3:30 p.m. PST
[ OpenReview ]
Talk slot: Sat 16 Dec 1:30 p.m. PST — 1:40 p.m. PST
Abstract:
The ability to predict changes in binding free energy ($\Delta\Delta G_{bind}$) for mutations at protein-protein interfaces (PPIs) is critical for understanding genetic diseases and for engineering novel protein-based therapeutics. Here, we present Binding Oracle: a structure-based graph transformer for predicting $\Delta\Delta G_{bind}$ at PPIs. Binding Oracle fine-tunes Stability Oracle with Selective LoRA: a technique that combines layer selection via gradient norms with LoRA. Selective LoRA identifies and fine-tunes only the layers most critical for the downstream task, thereby regularizing against overfitting. Additionally, we present new training-test splits of mutational data from the SKEMPI2.0, Ab-Bind, and NABE databases that use a strict 30% sequence similarity threshold to avoid data leakage during model evaluation. Binding Oracle, when trained with the Thermodynamic Permutations data augmentation technique, achieves SOTA on S487 without using any evolutionary auxiliary features. Our results empirically demonstrate how sparse fine-tuning techniques, such as Selective LoRA, can enable rapid domain adaptation in protein machine learning frameworks.
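To make the Selective LoRA idea concrete, here is a minimal NumPy sketch of the two steps the abstract describes: ranking layers by the norm of their task gradients, then attaching low-rank (LoRA) adapters only to the top-ranked layers. All names (`select_layers_by_grad_norm`, `make_lora`, the toy layer/gradient tensors) are illustrative assumptions, not the paper's implementation; the actual method operates on a graph transformer with real backpropagated gradients.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a model: each "layer" is just a weight matrix.
layers = {f"layer_{i}": rng.normal(size=(8, 8)) for i in range(4)}

# Hypothetical gradients from a probe batch on the downstream task.
# Larger scale for later layers simulates layers that matter more here.
grads = {name: rng.normal(scale=(i + 1), size=W.shape)
         for i, (name, W) in enumerate(layers.items())}

def select_layers_by_grad_norm(grads, k):
    """Rank layers by the Frobenius norm of their gradients; keep top-k."""
    norms = {name: np.linalg.norm(g) for name, g in grads.items()}
    return sorted(norms, key=norms.get, reverse=True)[:k]

def make_lora(W, rank=2):
    """Low-rank adapter W + B @ A; B starts at zero so the adapted
    weight initially equals the frozen pretrained weight."""
    A = rng.normal(scale=0.01, size=(rank, W.shape[1]))
    B = np.zeros((W.shape[0], rank))
    return A, B

selected = select_layers_by_grad_norm(grads, k=2)
adapters = {name: make_lora(layers[name]) for name in selected}

def effective_weight(name):
    """Frozen base weight, plus the LoRA update for selected layers only."""
    W = layers[name]
    if name in adapters:
        A, B = adapters[name]
        return W + B @ A
    return W
```

During fine-tuning only the `A`/`B` factors of the selected layers would receive gradient updates, which is what restricts the trainable parameter count and provides the regularization the abstract refers to.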