Poster

Scalable Deep Generative Relational Model with High-Order Node Dependence

Xuhui Fan · Bin Li · Caoyuan Li · Scott Sisson · Ling Chen

East Exhibition Hall B + C #149

Keywords: [ Latent Variable Models ] [ Probabilistic Methods ] [ Deep Learning ] [ Generative Models ]


Abstract:

In this work, we propose a probabilistic framework for relational data modelling and latent structure exploration. Given feature information for the nodes in a network, our model builds a deep architecture that approximates the possibly nonlinear mappings between the nodes' feature information and their latent representations. For each node, we incorporate the high-order structure information of all its neighbourhoods to generate its latent representation, so that these latent representations are ``smooth'' over the network. Since the latent representations are generated from Dirichlet distributions, we further develop a data augmentation trick that enables efficient Gibbs sampling for the Ber-Poisson likelihood with Dirichlet random variables. Our model is readily applicable to large sparse networks, as its computational cost scales with the number of positive links in the network. The superior performance of our model is demonstrated through improved link prediction on a range of real-world datasets.
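A minimal sketch of the Bernoulli-Poisson (Ber-Poisson) link construction the abstract refers to, under illustrative assumptions: each node gets a Dirichlet-distributed latent representation, a nonnegative rate is formed from pairs of representations, and a link occurs with probability 1 - exp(-rate). The dimensions, the gamma-distributed interaction matrix `Lambda`, and the bilinear rate form are hypothetical choices for illustration, not the paper's exact specification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: N nodes, K latent dimensions.
N, K = 5, 3

# Dirichlet-distributed latent representations (each row sums to 1),
# matching the abstract's Dirichlet-generated node representations.
pi = rng.dirichlet(np.ones(K), size=N)

# Illustrative nonnegative interaction rates between latent dimensions.
Lambda = rng.gamma(shape=1.0, scale=1.0, size=(K, K))

# Ber-Poisson link: rate_ij = pi_i^T Lambda pi_j >= 0, and
# P(link ij = 1) = 1 - exp(-rate_ij), a valid probability in [0, 1].
rates = pi @ Lambda @ pi.T
link_prob = 1.0 - np.exp(-rates)
```

Because the Ber-Poisson likelihood assigns zero links a simple exp(-rate) factor, inference can be organised so that per-iteration cost scales with the number of observed positive links, which is what makes the model suitable for large sparse networks.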
