Spotlight Poster
Molecule Design by Latent Prompt Transformer
Deqian Kong · Yuhao Huang · Jianwen Xie · Edouardo Honig · Ming Xu · Shuanghong Xue · Pei Lin · Sanping Zhou · Sheng Zhong · Nanning Zheng · Ying Nian Wu
East Exhibit Hall A-C #2909
Recent advancements in conditional generative modeling have enabled the generation of high-quality language and images from prompts. These advances extend to the challenging problem of molecule design when it is framed as a conditional generative modeling task, with target biological properties or desired chemical constraints as the conditioning variables. We propose the Latent Prompt Transformer (LPT), a novel generative model comprising three components: (1) a latent vector whose learnable prior distribution is modeled by a neural transformation of Gaussian white noise; (2) a molecule generation model, a causal Transformer that takes the latent vector as its prompt; and (3) a property prediction model that predicts a molecule's target property value from the latent prompt. LPT is learned by maximum likelihood estimation on molecule-property pairs. During property optimization, the latent prompt is inferred from the desired property value and constraints via posterior sampling, and then guides autoregressive molecule generation. After initial training on existing molecules and their properties, we progressively shift the model distribution toward regions that support the desired target properties. Experiments show that LPT effectively discovers useful molecules in single-objective, multi-objective, and structure-constrained optimization tasks.
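The three components described above can be illustrated with a minimal, self-contained sketch. Everything here is a hypothetical stand-in: the dimensions, the linear maps, and the random-search optimization loop are placeholders chosen for illustration, not the paper's architecture (which uses a learned neural prior, a causal Transformer decoder, and posterior sampling).

```python
import math
import random

random.seed(0)

# Hypothetical toy sizes for illustration only (not the paper's dimensions).
LATENT_DIM, VOCAB_SIZE, MAX_LEN = 8, 5, 6

def rand_matrix(rows, cols):
    return [[random.gauss(0, 1) for _ in range(cols)] for _ in range(rows)]

def matvec(m, v):
    return [sum(w * x for w, x in zip(row, v)) for row in m]

# (1) Learnable prior: latent prompt z = tanh(W_prior @ eps), eps ~ N(0, I),
#     i.e., a neural transformation of Gaussian white noise.
W_prior = rand_matrix(LATENT_DIM, LATENT_DIM)

def sample_latent_prompt():
    eps = [random.gauss(0, 1) for _ in range(LATENT_DIM)]  # white noise
    return [math.tanh(u) for u in matvec(W_prior, eps)]

# (2) Generation model: a stand-in for the causal Transformer that decodes
#     molecule tokens autoregressively, conditioned on the latent prompt.
W_gen = rand_matrix(VOCAB_SIZE, LATENT_DIM)

def generate_tokens(z):
    tokens = []
    for _ in range(MAX_LEN):
        logits = matvec(W_gen, z)
        m = max(logits)
        probs = [math.exp(l - m) for l in logits]
        total = sum(probs)
        probs = [p / total for p in probs]
        tokens.append(random.choices(range(VOCAB_SIZE), weights=probs)[0])
    return tokens

# (3) Property predictor: maps the latent prompt to a scalar property value.
w_prop = [random.gauss(0, 1) for _ in range(LATENT_DIM)]

def predict_property(z):
    return sum(w * u for w, u in zip(w_prop, z))

# Optimization stand-in: instead of posterior sampling, pick the latent
# prompt with the best predicted property from a small candidate pool,
# then let it guide generation.
candidates = [sample_latent_prompt() for _ in range(64)]
best_z = max(candidates, key=predict_property)
best_mol = generate_tokens(best_z)
```

The key design point the sketch mirrors is that the property predictor reads only the latent prompt, so searching over latent prompts for a desired property value, and decoding molecules from the selected prompt, decouples property optimization from token-level generation.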