Poster
in
Workshop: AI for Science: from Theory to Practice
MoleCLUEs: Molecular Conformers Maximally In-Distribution for Predictive Models
Michael Maser · Nataša Tagasovska · Jae Hyeon Lee · Andrew Watkins
Structure-based molecular ML (SBML) models can be highly sensitive to input geometries and give predictions with large variance. We present an approach to mitigate the challenge of selecting conformations for such models by generating conformers that explicitly minimize predictive uncertainty. To achieve this, we compute estimates of aleatoric and epistemic uncertainties that are differentiable w.r.t. latent posteriors. We then iteratively sample new latents in the direction of lower uncertainty by gradient descent. As we train our predictive models jointly with a conformer decoder, the new latent embeddings can be mapped to their corresponding inputs, which we call MoleCLUEs, or (molecular) counterfactual latent uncertainty explanations (Antorán et al., 2020). We assess our algorithm for the task of predicting drug-target binding from 3D structure with maximum confidence. We additionally analyze the structure trajectories obtained from conformer optimizations, which provide insight into the sources of uncertainty in SBML.
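To illustrate the latent-space optimization described in the abstract, the following is a minimal sketch, not the authors' implementation: it assumes hypothetical PyTorch modules `encoder`, `decoder`, and `predictor`, where `predictor(z)` returns a prediction together with a differentiable uncertainty estimate, and performs gradient descent on the latent toward lower uncertainty before decoding it back to a conformer.

```python
import torch

def molecule_clue_sketch(encoder, decoder, predictor, conformer, steps=100, lr=1e-2):
    """Gradient descent in latent space toward lower predictive uncertainty.

    `encoder`, `decoder`, and `predictor` are assumed interfaces for
    illustration only; the returned structure stands in for a MoleCLUE.
    """
    # Start from the latent embedding of the input conformer.
    z = encoder(conformer).detach().requires_grad_(True)
    opt = torch.optim.Adam([z], lr=lr)

    for _ in range(steps):
        opt.zero_grad()
        _, uncertainty = predictor(z)   # differentiable (aleatoric + epistemic) estimate
        uncertainty.backward()          # gradient of uncertainty w.r.t. the latent
        opt.step()                      # move the latent toward lower uncertainty

    # Decode the optimized latent into a conformer on which the
    # predictive model is (by construction) more confident.
    return decoder(z)
```

Because the decoder is trained jointly with the predictor, the intermediate latents along this descent can also be decoded, yielding the structure trajectories the abstract analyzes for sources of uncertainty.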