

Poster

Graph Normalizing Flows

Jenny Liu · Aviral Kumar · Jimmy Ba · Jamie Kiros · Kevin Swersky

East Exhibition Hall B + C #142

Keywords: [ Relational Learning ] [ Algorithms -> Density Estimation ] [ Deep Learning ] [ Generative Models ]


Abstract:

We introduce graph normalizing flows: a new, reversible graph neural network model for prediction and generation. On supervised tasks, graph normalizing flows perform similarly to message passing neural networks, but at a significantly reduced memory footprint, allowing them to scale to larger graphs. In the unsupervised case, we combine graph normalizing flows with a novel graph auto-encoder to create a generative model of graph structures. Our model is permutation-invariant, generates entire graphs with a single feed-forward pass, and achieves results competitive with state-of-the-art auto-regressive models, while being better suited to parallel computing architectures.
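To make the "reversible graph neural network" idea concrete, below is a minimal sketch of an affine coupling layer whose scale and shift are produced by message passing over the graph, so the transform can be inverted exactly without storing intermediate activations. This is an illustrative assumption, not the authors' exact architecture: the class names (MessagePassing, GraphCouplingLayer), the use of a dense adjacency matrix, the mean-aggregation update, and the choice of PyTorch are all hypothetical.

```python
# Sketch only: a reversible coupling layer over node features, in the spirit of
# graph normalizing flows. Not the paper's exact model.
import torch
import torch.nn as nn


class MessagePassing(nn.Module):
    """One round of mean-aggregation message passing over a dense adjacency matrix."""

    def __init__(self, dim):
        super().__init__()
        self.msg = nn.Linear(dim, dim)
        self.update = nn.Linear(2 * dim, dim)

    def forward(self, h, adj):
        # h: (num_nodes, dim); adj: (num_nodes, num_nodes), symmetric, zero diagonal
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        agg = adj @ self.msg(h) / deg  # mean over neighbours
        return torch.tanh(self.update(torch.cat([h, agg], dim=-1)))


class GraphCouplingLayer(nn.Module):
    """Affine coupling: update half of the node features conditioned on the other
    half, so the layer is invertible by construction (assumes an even feature dim)."""

    def __init__(self, dim):
        super().__init__()
        self.half = dim // 2
        self.scale_net = MessagePassing(self.half)
        self.shift_net = MessagePassing(self.half)

    def forward(self, h, adj):
        h1, h2 = h[:, :self.half], h[:, self.half:]
        s = self.scale_net(h1, adj)
        t = self.shift_net(h1, adj)
        h2 = h2 * torch.exp(s) + t  # affine update of the second half
        return torch.cat([h1, h2], dim=-1)

    def inverse(self, z, adj):
        z1, z2 = z[:, :self.half], z[:, self.half:]
        s = self.scale_net(z1, adj)
        t = self.shift_net(z1, adj)
        z2 = (z2 - t) * torch.exp(-s)  # exact inverse; no activations need to be cached
        return torch.cat([z1, z2], dim=-1)


if __name__ == "__main__":
    num_nodes, dim = 6, 8
    adj = (torch.rand(num_nodes, num_nodes) > 0.5).float()
    adj = ((adj + adj.t()) > 0).float().fill_diagonal_(0)  # symmetric, no self-loops
    h = torch.randn(num_nodes, dim)

    layer = GraphCouplingLayer(dim)
    z = layer(h, adj)
    h_rec = layer.inverse(z, adj)
    print(torch.allclose(h, h_rec, atol=1e-5))  # True: the layer is exactly reversible
```

In practice one would stack several such layers and alternate which half of the features is updated; because each layer can be inverted in closed form, activations do not need to be stored for backpropagation, which is the source of the reduced memory footprint described in the abstract.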
