
Workshop: AI for Science: from Theory to Practice

Gradient Estimation For Exactly-$k$ Constraints

Ruoyan Li · Dipti Ranjan Sahu · Guy Van den Broeck · Zhe Zeng

Abstract: The exactly-$k$ constraint is ubiquitous in machine learning and scientific applications, for example in ensuring that the electric charges in a neutral atom sum to zero. However, enforcing such constraints in machine learning models while keeping learning differentiable is challenging. In this work, we aim to provide a "cookbook" for seamlessly incorporating exactly-$k$ constraints into machine learning models by extending a recent gradient estimator from Bernoulli variables to Gaussian and Poisson variables, utilizing constraint probabilities. We show the effectiveness of our proposed gradient estimators in synthetic experiments, and further demonstrate the practical utility of our approach by training neural networks to predict partial charges for metal-organic frameworks, aiding virtual screening in chemistry. Our proposed method not only enhances the capability of learning models but also expands their applicability to a wider range of scientific domains where constraint satisfaction is crucial.
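The abstract builds on constraint probabilities for exactly-$k$ constraints. For independent Bernoulli variables $X_1, \dots, X_n$ with success probabilities $p_i$, the probability $\Pr(\sum_i X_i = k)$ is the Poisson-binomial mass at $k$, computable by a standard dynamic program. A minimal sketch of that computation (this illustrates only the constraint probability, not the authors' gradient estimator; the function name is illustrative):

```python
def exactly_k_prob(probs, k):
    """Pr(sum of independent Bernoulli(p_i) variables equals k),
    via the standard Poisson-binomial dynamic program in O(n * k)."""
    # dp[j] = probability that the variables seen so far sum to exactly j
    dp = [1.0] + [0.0] * k
    for p in probs:
        # Update in reverse so each dp[j-1] still refers to the previous step.
        for j in range(k, 0, -1):
            dp[j] = dp[j] * (1.0 - p) + dp[j - 1] * p
        dp[0] *= (1.0 - p)
    return dp[k]

# Two fair coins: Pr(sum = 1) = 0.5
print(exactly_k_prob([0.5, 0.5], 1))
```

Because the recurrence consists only of sums and products of the $p_i$, the result is differentiable in the $p_i$, which is what makes such constraint probabilities usable inside gradient-based training.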
