We aim to better understand attention over nodes in graph neural networks (GNNs) and identify factors influencing its effectiveness. We particularly focus on the ability of attention GNNs to generalize to larger, more complex, or noisy graphs. Motivated by insights from work on Graph Isomorphism Networks, we design simple graph reasoning tasks that allow us to study attention in a controlled environment. We find that under typical conditions the effect of attention is negligible or even harmful, but under certain conditions it provides an exceptional gain in performance of more than 60% in some of our classification tasks. Satisfying these conditions in practice is challenging and often requires optimal initialization or supervised training of attention. We propose an alternative recipe and train attention in a weakly supervised fashion that approaches the performance of supervised models and, compared to unsupervised models, improves results on several synthetic and real datasets. Source code and datasets are available at https://github.com/bknyaz/graphattentionpool.
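For context, the sketch below illustrates the general idea of attention over nodes that the abstract refers to: each node receives a learned score, and node features are pooled into a graph-level embedding weighted by those scores. This is a minimal, hypothetical example assuming dense PyTorch node features; it is not the authors' implementation (see the linked repository for that).

```python
# Minimal sketch of soft attention pooling over graph nodes (illustrative only,
# not the authors' code). Assumes node features for a single graph as a dense
# tensor x of shape [num_nodes, in_dim], e.g. the output of any GNN layer.
import torch
import torch.nn as nn

class AttentionPool(nn.Module):
    """Scores each node with a linear layer and pools node features
    using softmax-normalized attention weights."""
    def __init__(self, in_dim):
        super().__init__()
        self.score = nn.Linear(in_dim, 1)  # per-node attention logits

    def forward(self, x):
        # x: [num_nodes, in_dim]
        alpha = torch.softmax(self.score(x), dim=0)  # [num_nodes, 1] attention weights
        return (alpha * x).sum(dim=0)                # graph-level embedding, [in_dim]

# Toy usage: pool 5 nodes with 16-dimensional features into one graph embedding.
x = torch.randn(5, 16)
pool = AttentionPool(16)
graph_embedding = pool(x)  # shape [16]
```

In the weakly supervised setting discussed in the abstract, the attention weights would additionally be guided by coarse labels rather than exact per-node supervision; the pooling mechanism itself stays the same.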
Author Information
Boris Knyazev (University of Guelph / Mila)
Graham Taylor (University of Guelph)
Mohamed Amer (RobustAI)
More from the Same Authors
- 2020: Building LEGO using Deep Generative Models of Graphs
  Rylee Thompson · Graham Taylor · Terrance DeVries · Elahe Ghalebi
- 2021: An Empirical Study of Neural Kernel Bandits
  Michal Lisicki · Arash Afkanpour · Graham Taylor
- 2023 Poster: A Step Towards Worldwide Biodiversity Assessment: The BIOSCAN-1M Insect Dataset
  Zahra Gharaee · ZeMing Gong · Nicholas Pellegrino · Iuliia Zarubiieva · Joakim Haurum · Scott Lowe · Jaclyn McKeown · Chris Ho · Joschka McLeod · Yi-Yun Wei · Jireh Agda · Sujeevan Ratnasingham · Dirk Steinke · Angel Chang · Graham Taylor · Paul Fieguth
- 2021: DeepRNG: Towards Deep Reinforcement Learning-Assisted Generative Testing of Software
  Chuan-Yung Tsai · Graham Taylor
- 2021: Neural Structure Mapping For Learning Abstract Visual Analogies
  Shashank Shekhar · Graham Taylor
- 2021 Poster: Brick-by-Brick: Combinatorial Construction with Deep Reinforcement Learning
  Hyunsoo Chung · Jungtaek Kim · Boris Knyazev · Jinhwi Lee · Graham Taylor · Jaesik Park · Minsu Cho
- 2021 Poster: Parameter Prediction for Unseen Deep Architectures
  Boris Knyazev · Michal Drozdzal · Graham Taylor · Adriana Romero Soriano
- 2020 Poster: Instance Selection for GANs
  Terrance DeVries · Michal Drozdzal · Graham Taylor
- 2020 Session: Orals & Spotlights Track 08: Deep Learning
  Graham Taylor · Mario Lucic
- 2017: Poster spotlights
  Hiroshi Kuwajima · Masayuki Tanaka · Qingkai Liang · Matthieu Komorowski · Fanyu Que · Thalita F Drumond · Aniruddh Raghu · Leo Anthony Celi · Christina Göpfert · Andrew Ross · Sarah Tan · Rich Caruana · Yin Lou · Devinder Kumar · Graham Taylor · Forough Poursabzi-Sangdeh · Jennifer Wortman Vaughan · Hanna Wallach
- 2015: Learning Multi-scale Temporal Dynamics with Recurrent Neural Networks
  Graham Taylor
- 2011 Workshop: Big Learning: Algorithms, Systems, and Tools for Learning at Scale
  Joseph E Gonzalez · Sameer Singh · Graham Taylor · James Bergstra · Alice Zheng · Misha Bilenko · Yucheng Low · Yoshua Bengio · Michael Franklin · Carlos Guestrin · Andrew McCallum · Alexander Smola · Michael Jordan · Sugato Basu
- 2011 Poster: Facial Expression Transfer with Input-Output Temporal Restricted Boltzmann Machines
  Matthew D Zeiler · Graham Taylor · Leonid Sigal · Iain Matthews · Rob Fergus
- 2010 Poster: Pose-Sensitive Embedding by Nonlinear NCA Regression
  Graham Taylor · Rob Fergus · George Williams · Ian Spiro · Christoph Bregler
- 2008 Poster: The Recurrent Temporal Restricted Boltzmann Machine
  Ilya Sutskever · Geoffrey E Hinton · Graham Taylor
- 2006 Poster: Modeling Human Motion Using Binary Latent Variables
  Graham Taylor · Geoffrey E Hinton · Sam T Roweis
- 2006 Spotlight: Modeling Human Motion Using Binary Latent Variables
  Graham Taylor · Geoffrey E Hinton · Sam T Roweis