Poster
Multi-objects Generation with Amortized Structural Regularization
Kun Xu · Chongxuan LI · Jun Zhu · Bo Zhang
East Exhibition Hall B, C #115
Keywords: [ Generative Models ] [ Deep Learning ] [ Probabilistic Methods ] [ Variational Inference ]
Deep generative models (DGMs) have shown promise in image generation. However, most existing methods learn a model by simply optimizing a divergence between the marginal distributions of the model and the data, and often fail to capture rich structures in an image, such as the attributes of objects and their relationships. Human knowledge is crucial for DGMs to infer these structures, especially in unsupervised learning. In this paper, we propose amortized structural regularization (ASR), which adopts posterior regularization (PR) to embed human knowledge into DGMs via a set of structural constraints. We derive a lower bound of the regularized log-likelihood in PR and adopt the amortized inference technique to jointly and efficiently optimize the generative model and an auxiliary recognition model used for inference. Empirical results show that ASR outperforms DGM baselines in terms of both inference performance and sample quality.
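As a rough illustration of the kind of objective the abstract describes (a sketch under assumptions, not the paper's exact derivation), a posterior-regularization-style lower bound with an amortized recognition model q_φ(z|x) could take the following form; the structural feature φ(x, z), the bound b, and the penalty weight λ are hypothetical placeholders for whatever constraints encode the human knowledge.

```latex
% Sketch only (assumption, not taken from the paper): a PR-style objective,
% amortized with a recognition model q_\phi(z|x). The feature \phi(x,z),
% bound b, and weight \lambda are hypothetical placeholders.
\[
\max_{\theta,\phi}\;
  \underbrace{\mathbb{E}_{q_\phi(z\mid x)}\!\left[\log p_\theta(x\mid z)\right]
  - \mathrm{KL}\!\left(q_\phi(z\mid x)\,\middle\|\,p(z)\right)}_{\text{standard ELBO}}
  \;-\;\lambda\,\max\!\left(0,\;\mathbb{E}_{q_\phi(z\mid x)}\!\left[\phi(x,z)\right]-b\right)
\]
```

The first two terms are the usual evidence lower bound of a VAE-style DGM; the last term penalizes expected violations of a structural constraint of the form E_q[φ(x, z)] ≤ b, which is one common way a PR constraint set can be relaxed into a differentiable objective.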