Poster
Universality of Group Convolutional Neural Networks Based on Ridgelet Analysis on Groups
Sho Sonoda · Isao Ishikawa · Masahiro Ikeda
We show the universality of depth-2 group convolutional neural networks (GCNNs) in a unified and constructive manner based on ridgelet theory. Despite their widespread use in applications, the approximation properties of (G)CNNs have not been well investigated. Although the universality of (G)CNNs has been established since the late 2010s, our understanding of how (G)CNNs represent functions remains incomplete: past universality theorems were proven in a case-by-case manner, by manually and carefully assigning network parameters for each variety of convolution layer, and in an indirect manner, by converting or modifying the (G)CNNs into other universal approximators such as invariant polynomials and fully-connected networks. In this study, we formulate a versatile depth-2 continuous GCNN $S[\gamma]$ as a nonlinear mapping between group representations, and directly obtain an analysis operator, called the ridgelet transform, that maps a given function $f$ to the network parameter $\gamma$ so that $S[\gamma]=f$. The proposed GCNN covers typical GCNNs such as cyclic convolution on multi-channel images, networks on permutation-invariant inputs (Deep Sets), and $\mathrm{E}(n)$-equivariant networks. The closed-form expression of the ridgelet transform describes how the network parameters are organized to represent a given function. While the ridgelet transform was previously known only for fully-connected networks, this study is the first to obtain it for GCNNs. By discretizing the closed-form expression, we can systematically generate constructive proofs of the $cc$-universality of finite GCNNs. In other words, our universality proofs are more unified and constructive than previous ones.
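As a point of reference, here is a minimal sketch of the fully-connected case, which the abstract cites as the previously known setting; the activation $\sigma$, the dual function $\rho$, and the admissibility constant $(\!(\sigma,\rho)\!)$ follow the classical ridgelet literature and are not defined in this abstract. In that setting, the continuous network $S[\gamma]$ and the ridgelet transform $R$ take the form
$$S[\gamma](x) = \int_{\mathbb{R}^m\times\mathbb{R}} \gamma(a,b)\,\sigma(\langle a,x\rangle - b)\,\mathrm{d}a\,\mathrm{d}b, \qquad R[f](a,b) = \int_{\mathbb{R}^m} f(x)\,\overline{\rho(\langle a,x\rangle - b)}\,\mathrm{d}x,$$
and the reconstruction identity $S[R[f]] = (\!(\sigma,\rho)\!)\,f$ holds for admissible pairs $(\sigma,\rho)$. Discretizing the integral defining $S[R[f]]$, e.g. by sampling finitely many parameters $(a_i,b_i)$, yields a finite depth-2 network whose weights are read off from $R[f]$, which is the sense in which such universality proofs are constructive.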
Author Information
Sho Sonoda (RIKEN AIP)
Isao Ishikawa (Ehime University)
Masahiro Ikeda (RIKEN AIP)
More from the Same Authors
- 2021 Poster: Differentiable Multiple Shooting Layers
  Stefano Massaroli · Michael Poli · Sho Sonoda · Taiji Suzuki · Jinkyoo Park · Atsushi Yamashita · Hajime Asama
- 2020: Universal Approximation Property of Neural Ordinary Differential Equations
  Takeshi Teshima · Koichi Tojo · Masahiro Ikeda · Isao Ishikawa · Kenta Oono
- 2020 Poster: Learning with Optimized Random Features: Exponential Speedup by Quantum Machine Learning without Sparsity and Low-Rank Assumptions
  Hayata Yamasaki · Sathyawageeswar Subramanian · Sho Sonoda · Masato Koashi
- 2020 Poster: Coupling-based Invertible Neural Networks Are Universal Diffeomorphism Approximators
  Takeshi Teshima · Isao Ishikawa · Koichi Tojo · Kenta Oono · Masahiro Ikeda · Masashi Sugiyama
- 2020 Oral: Coupling-based Invertible Neural Networks Are Universal Diffeomorphism Approximators
  Takeshi Teshima · Isao Ishikawa · Koichi Tojo · Kenta Oono · Masahiro Ikeda · Masashi Sugiyama
- 2018 Poster: Metric on Nonlinear Dynamical Systems with Perron-Frobenius Operators
  Isao Ishikawa · Keisuke Fujii · Masahiro Ikeda · Yuka Hashimoto · Yoshinobu Kawahara