Rademacher Random Projections with Tensor Networks
Beheshteh Rakhshan · Guillaume Rabusseau

Tue Dec 14 12:10 PM -- 12:15 PM (PST)

Random projections~(RP) have recently emerged as popular techniques in the machine learning community for their ability to reduce the dimension of very high-dimensional tensors. Following the work in \cite{rakhshan2020tensorized}, we consider a tensorized random projection relying on the Tensor Train~(TT) decomposition, where each element of the core tensors is drawn from a Rademacher distribution. Our theoretical results reveal that the Gaussian low-rank tensor represented in compressed TT form in \cite{rakhshan2020tensorized} can be replaced by a TT tensor whose core elements are drawn from a Rademacher distribution, with the same embedding size. Experiments on synthetic data demonstrate that the tensorized Rademacher RP can outperform the tensorized Gaussian RP studied in \cite{rakhshan2020tensorized}. In addition, we show, both theoretically and experimentally, that the tensorized RP in the Matrix Product Operator~(MPO) format proposed in \cite{batselier2018computing} for performing SVD on large matrices is not a Johnson-Lindenstrauss transform~(JLT) and is therefore not a well-suited random projection map.
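The abstract describes the construction only at a high level. Below is a minimal NumPy sketch of a tensorized Rademacher RP of the kind described: the projection of a dense tensor x to R^k is built from k independent TT tensors whose core entries are i.i.d. Rademacher (+/-1). The helper names (tt_rademacher_cores, tt_inner_product, tt_rademacher_rp) are hypothetical, and the 1/sqrt(k * R^(N-1)) normalization is an assumption chosen so the map preserves squared norms in expectation; the paper's exact scaling may differ.

```python
import numpy as np

def tt_rademacher_cores(dims, rank, rng):
    """Cores of a random TT tensor with i.i.d. Rademacher (+/-1) entries."""
    ranks = [1] + [rank] * (len(dims) - 1) + [1]
    return [rng.choice([-1.0, 1.0], size=(ranks[n], d, ranks[n + 1]))
            for n, d in enumerate(dims)]

def tt_inner_product(cores, x):
    """Inner product <T, x> of a TT tensor (given by its cores) with a dense tensor x."""
    v = x[None, ...]  # shape (1, d_1, ..., d_N): dummy leading bond dimension
    for core in cores:  # core shape: (r_{n-1}, d_n, r_n)
        # Contract the current bond and physical mode with v's first two axes.
        v = np.tensordot(core, v, axes=([0, 1], [0, 1]))  # -> (r_n, d_{n+1}, ..., d_N)
    return v.item()  # final shape (1,): the scalar <T, x>

def tt_rademacher_rp(x, k, rank, rng=None):
    """Project the dense tensor x to R^k using k independent Rademacher TT tensors.

    ASSUMPTION: the 1/sqrt(k * rank**(N-1)) scaling makes E[||f(x)||^2] = ||x||^2
    (each TT entry is a sum of rank**(N-1) independent +/-1 products); the exact
    normalization used in the paper may differ.
    """
    rng = rng or np.random.default_rng()
    scale = 1.0 / np.sqrt(k * rank ** (x.ndim - 1))
    return scale * np.array(
        [tt_inner_product(tt_rademacher_cores(x.shape, rank, rng), x)
         for _ in range(k)])

# Usage: the embedding should approximately preserve Euclidean norms.
rng = np.random.default_rng(0)
x = rng.standard_normal((5, 5, 5, 5))
y = tt_rademacher_rp(x, k=200, rank=2, rng=rng)
print(np.linalg.norm(x), np.linalg.norm(y))  # close for large enough k
```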

Author Information

Beheshteh Rakhshan (Mila)
Guillaume Rabusseau (Mila - Université de Montréal)
