In Convolutional Neural Networks (CNNs), information flows across a small neighbourhood of each pixel of an image, preventing long-range integration of features before reaching deep layers in the network. Inspired by the neurons of the human visual cortex responding to similar but distant visual features, we propose a novel architecture that allows efficient information flow between features $z$ and locations $(x,y)$ across the entire image with a small number of layers. This architecture uses a cycle of three orthogonal convolutions, not only in $(x,y)$ coordinates, but also in $(x,z)$ and $(y,z)$ coordinates. We stack a sequence of such cycles to obtain our deep network, named CycleNet. When compared to CNNs of similar size, our model obtains competitive results at image classification on the CIFAR-10 and ImageNet datasets. We hypothesise that long-range integration favours recognition of objects by shape rather than texture, and we show that CycleNet transfers better than CNNs to stylised images. On the Pathfinder challenge, where integration of distant features is crucial, CycleNet outperforms CNNs by a large margin. Code has been made available at: https://github.com/netX21/Submission
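To make the cycle of orthogonal convolutions concrete, the sketch below shows a minimal PyTorch-style implementation of a single cycle. It is not the authors' code (that is available at the repository above); it assumes, for simplicity, a cubic feature tensor whose channel (z), height (y) and width (x) dimensions are all equal, so that each axis can take its turn as the channel dimension of a standard 2D convolution. The class name CycleBlock and all parameter choices are illustrative.

```python
# Minimal sketch (not the authors' implementation): one CycleNet-style cycle,
# assuming a cubic feature tensor with equal z, y and x dimensions.
import torch
import torch.nn as nn

class CycleBlock(nn.Module):
    """One cycle of three orthogonal convolutions over (x, y), (x, z) and (y, z)."""

    def __init__(self, d, kernel_size=3):
        super().__init__()
        pad = kernel_size // 2
        # Each convolution maps d channels to d channels over a 2D slice.
        self.conv_xy = nn.Conv2d(d, d, kernel_size, padding=pad)  # slides over (y, x)
        self.conv_xz = nn.Conv2d(d, d, kernel_size, padding=pad)  # slides over (z, x)
        self.conv_yz = nn.Conv2d(d, d, kernel_size, padding=pad)  # slides over (z, y)
        self.act = nn.ReLU()

    def forward(self, t):
        # t: (batch, z, y, x) with z = y = x = d
        t = self.act(self.conv_xy(t))                                   # channels = z
        t = self.act(self.conv_xz(t.permute(0, 2, 1, 3).contiguous()))  # channels = y
        t = self.act(self.conv_yz(t.permute(0, 3, 2, 1).contiguous()))  # channels = x
        return t.permute(0, 2, 3, 1).contiguous()                       # back to (batch, z, y, x)

# Example: stack cycles into a small network on a 32x32x32 feature tensor.
net = nn.Sequential(*[CycleBlock(32) for _ in range(4)])
out = net(torch.randn(2, 32, 32, 32))  # shape is preserved: (2, 32, 32, 32)
```

Because each cycle convolves over every pair of axes, information from any feature at any location can reach any other within a single cycle, rather than only after many stacked spatial convolutions.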
Author Information
Federica Freddi (MediaTek Research)
Federica is currently working as a Deep Learning Researcher at MediaTek Research, having graduated in Information Engineering from the University of Cambridge in June 2019. Her master's thesis focused on computer vision, for which she introduced a new framework to improve the scalability of current state-of-the-art semantic localisation algorithms by introducing a novel semi-automatic labelling pipeline. During her time at university, Federica was involved in several activities and competitions, the most notable of which was winning the Open Data Championship UK in 2017 and going on to represent the UK in the world championship. She has also gained experience in Data Engineering and iOS Software Engineering through internships. Federica started coding at a young age and, while still in high school, she took part in an international summer school developing a web platform to identify microbiota dysbiosis in childhood using machine learning. She was recognised as a potential future leader in engineering by the Royal Academy of Engineering in 2017, receiving an Engineering Leadership Scholarship. Federica was the overall winner of the UK DevelopHer Awards in 2018.
Jezabel Garcia (MediaTek Research)
Michael Bromberg (MediaTek Research)
Sepehr Jalali (MediaTek Research)
Da-shan Shiu (University of California Berkeley)
Alvin Chua (MediaTek Research)
Alberto Bernacchia (MediaTek Research)
More from the Same Authors
- 2021: How to distribute data across tasks for meta-learning? »
  Alexandru Cioba · Michael Bromberg · Qian Wang · Ritwik Niyogi · Georgios Batzolis · Jezabel Garcia · Da-shan Shiu · Alberto Bernacchia
- 2022: Gradient Descent: Robustness to Adversarial Corruption »
  Fu-Chieh Chang · Farhang Nabiei · Pei-Yuan Wu · Alexandru Cioba · Sattar Vakili · Alberto Bernacchia
- 2021 Poster: Natural continual learning: success is a journey, not (just) a destination »
  Ta-Chu Kao · Kristopher Jensen · Gido van de Ven · Alberto Bernacchia · Guillaume Hennequin
- 2021 Poster: Optimal Order Simple Regret for Gaussian Process Bandits »
  Sattar Vakili · Nacime Bouziani · Sepehr Jalali · Alberto Bernacchia · Da-shan Shiu
- 2020 Poster: Non-reversible Gaussian processes for identifying latent dynamical structure in neural data »
  Virginia Rutten · Alberto Bernacchia · Maneesh Sahani · Guillaume Hennequin
- 2020 Oral: Non-reversible Gaussian processes for identifying latent dynamical structure in neural data »
  Virginia Rutten · Alberto Bernacchia · Maneesh Sahani · Guillaume Hennequin
- 2018 Poster: Exact natural gradient in deep linear networks and its application to the nonlinear case »
  Alberto Bernacchia · Mate Lengyel · Guillaume Hennequin