Poster
Efficient Graph Generation with Graph Recurrent Attention Networks
Renjie Liao · Yujia Li · Yang Song · Shenlong Wang · Will Hamilton · David Duvenaud · Raquel Urtasun · Richard Zemel

Tue Dec 10 10:45 AM -- 12:45 PM (PST) @ East Exhibition Hall B + C #124

We propose a new family of efficient and expressive deep generative models of graphs, called Graph Recurrent Attention Networks (GRANs). Our model generates graphs one block of nodes and associated edges at a time. The block size and sampling stride allow us to trade off sample quality for efficiency. Compared to previous RNN-based graph generative models, our framework better captures the auto-regressive conditioning between the already-generated and to-be-generated parts of the graph using Graph Neural Networks (GNNs) with attention. This not only reduces the dependency on node ordering but also bypasses the long-term bottleneck caused by the sequential nature of RNNs. Moreover, we parameterize the output distribution per block using a mixture of Bernoulli, which captures the correlations among generated edges within the block. Finally, we propose to handle node orderings in generation by marginalizing over a family of canonical orderings. On standard benchmarks, we achieve state-of-the-art time efficiency and sample quality compared to previous models. Additionally, we show our model is capable of generating large graphs of up to 5K nodes with good quality. Our code is released at: https://github.com/lrjconan/GRAN.
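To illustrate the mixture-of-Bernoulli output distribution mentioned in the abstract, the sketch below samples the edge indicators for one block: a mixture component is drawn first, then each candidate edge is sampled independently under that component, so marginalizing over components induces correlations among the edges. This is a minimal NumPy sketch under assumed shapes and names (`alpha`, `theta`, `sample_block_edges` are illustrative), not the authors' released implementation.

```python
import numpy as np

def sample_block_edges(alpha, theta, rng=None):
    """Sample 0/1 edge indicators for one block from a mixture of Bernoullis.

    alpha: (K,) mixture weights over K components (must sum to 1).
    theta: (K, E) per-component Bernoulli probabilities for E candidate edges.

    Edges are independent *within* a component; averaging over components
    makes them correlated, which a single factorized Bernoulli cannot capture.
    """
    rng = rng or np.random.default_rng()
    k = rng.choice(len(alpha), p=alpha)              # pick one mixture component
    return (rng.random(theta.shape[1]) < theta[k]).astype(int)

# Toy example: two components over three candidate edges, one preferring a
# dense block and one a sparse block (probabilities are illustrative).
alpha = np.array([0.5, 0.5])
theta = np.array([[0.9, 0.9, 0.9],    # component 0: dense block
                  [0.1, 0.1, 0.1]])   # component 1: sparse block
edges = sample_block_edges(alpha, theta)
```

Because a single component tends to turn edges on or off together, samples concentrate on mostly-dense or mostly-sparse blocks rather than an uncorrelated mix.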

Author Information

Renjie Liao (University of Toronto)
Yujia Li (DeepMind)
Yang Song (Stanford University)
Shenlong Wang (University of Toronto)
Will Hamilton (McGill)
David Duvenaud (University of Toronto)
Raquel Urtasun (Uber ATG)
Richard Zemel (Vector Institute/University of Toronto)