Random projections~(RPs) have recently emerged as popular techniques in the machine learning community for their ability to reduce the dimension of very high-dimensional tensors. Following the work in \cite{rakhshan2020tensorized}, we consider a tensorized random projection relying on the Tensor Train (TT) decomposition, where each element of the core tensors is drawn from a Rademacher distribution. Our theoretical results reveal that the Gaussian low-rank tensor, represented in compressed form in the TT format in \cite{rakhshan2020tensorized}, can be replaced by a TT tensor whose core elements are drawn from a Rademacher distribution, with the same embedding size. Experiments on synthetic data demonstrate that the tensorized Rademacher RP can outperform the tensorized Gaussian RP studied in \cite{rakhshan2020tensorized}. In addition, we show, both theoretically and experimentally, that the tensorized RP in the Matrix Product Operator~(MPO) format proposed in \cite{batselier2018computing} for performing SVD on large matrices is not a Johnson-Lindenstrauss transform~(JLT) and is therefore not a well-suited random projection map.
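To make the construction concrete, the following is a minimal sketch of a tensorized Rademacher random projection of the kind described above. It is an illustrative implementation, not the authors' code: the function names (`tt_to_full`, `tt_rademacher_projection`) and the scaling factor $1/\sqrt{k\,R^{N-1}}$ (chosen so that the squared norm is preserved in expectation for unit-variance Rademacher cores) are assumptions made for this sketch. Each of the $k$ output coordinates is the inner product of the input tensor with an independent rank-$R$ TT tensor whose core entries are i.i.d. $\pm 1$.

```python
import numpy as np

def tt_to_full(cores):
    """Contract a list of TT cores into the full tensor (fine for small examples)."""
    full = cores[0]  # shape (1, d_1, r_1)
    for core in cores[1:]:
        # merge the trailing rank index of `full` with the leading one of `core`
        full = np.tensordot(full, core, axes=([-1], [0]))
    # drop the two boundary rank indices of size 1
    return full.squeeze(axis=(0, -1))

def tt_rademacher_projection(x, dims, k, rank, seed=0):
    """Project a vector x of size prod(dims) down to R^k using k independent
    rank-`rank` TT tensors with i.i.d. Rademacher (+/-1) core entries.

    The scaling 1/sqrt(k * rank**(N-1)) is an assumption of this sketch; it
    makes E[||f(x)||^2] = ||x||^2 for unit-variance core entries.
    """
    rng = np.random.default_rng(seed)
    X = x.reshape(dims)
    n = len(dims)
    ranks = [1] + [rank] * (n - 1) + [1]
    scale = 1.0 / np.sqrt(k * rank ** (n - 1))
    y = np.empty(k)
    for i in range(k):
        # one independent Rademacher TT tensor per output coordinate
        cores = [rng.choice([-1.0, 1.0],
                            size=(ranks[j], dims[j], ranks[j + 1]))
                 for j in range(n)]
        y[i] = np.sum(tt_to_full(cores) * X)  # inner product <A_i, X>
    return scale * y
```

For small embedding dimensions this materializes each TT tensor explicitly; an efficient implementation would instead contract the cores with the (reshaped) input directly, mode by mode, without ever forming the full tensor.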
Author Information
Beheshteh Rakhshan (MILA)
Guillaume Rabusseau (Mila - Université de Montréal)
Related Events (a corresponding poster, oral, or spotlight)
-
2021 : Rademacher Random Projections with Tensor Networks »
More from the Same Authors
-
2021 Spotlight: Lower and Upper Bounds on the Pseudo-Dimension of Tensor Network Models »
Behnoush Khavari · Guillaume Rabusseau -
2021 : Few Shot Image Generation via Implicit Autoencoding of Support Sets »
Shenyang Huang · Kuan-Chieh Wang · Guillaume Rabusseau · Alireza Makhzani -
2021 : Towards a Trace-Preserving Tensor Network Representation of Quantum Channels »
Siddarth Srinivasan · Sandesh Adhikary · Jacob Miller · Guillaume Rabusseau · Byron Boots -
2021 : ContracTN: A Tensor Network Library Designed for Machine Learning »
Jacob Miller · Guillaume Rabusseau -
2022 Poster: High-Order Pooling for Graph Neural Networks with Tensor Decomposition »
Chenqing Hua · Guillaume Rabusseau · Jian Tang -
2021 : Discussion Panel »
Xiao-Yang Liu · Qibin Zhao · Chao Li · Guillaume Rabusseau -
2021 Workshop: Second Workshop on Quantum Tensor Networks in Machine Learning »
Xiao-Yang Liu · Qibin Zhao · Ivan Oseledets · Yufei Ding · Guillaume Rabusseau · Jean Kossaifi · Khadijeh Najafi · Anwar Walid · Andrzej Cichocki · Masashi Sugiyama -
2021 Poster: Lower and Upper Bounds on the Pseudo-Dimension of Tensor Network Models »
Behnoush Khavari · Guillaume Rabusseau -
2020 : Invited Talk 9 Q&A by Guillaume »
Guillaume Rabusseau -
2020 : Invited Talk 9: Tensor Network Models for Structured Data »
Guillaume Rabusseau -
2020 : Panel Discussion 1: Theoretical, Algorithmic and Physical »
Jacob Biamonte · Ivan Oseledets · Jens Eisert · Nadav Cohen · Guillaume Rabusseau · Xiao-Yang Liu -
2017 Poster: Hierarchical Methods of Moments »
Matteo Ruffini · Guillaume Rabusseau · Borja Balle -
2017 Poster: Multitask Spectral Learning of Weighted Automata »
Guillaume Rabusseau · Borja Balle · Joelle Pineau -
2016 Poster: Low-Rank Regression with Tensor Responses »
Guillaume Rabusseau · Hachem Kadri