Spotlight
Fast and Guaranteed Tensor Decomposition via Sketching
Yining Wang · Hsiao-Yu Tung · Alexander Smola · Anima Anandkumar

Thu Dec 10th 10:10 -- 10:40 AM @ Room 210 A

Tensor CANDECOMP/PARAFAC (CP) decomposition has wide applications in statistical learning of latent variable models and in data mining. In this paper, we propose fast and randomized tensor CP decomposition algorithms based on sketching. We build on the idea of count sketches, but introduce many novel ideas which are unique to tensors. We develop novel methods for randomized computation of tensor contractions via FFTs, without explicitly forming the tensors. Such tensor contractions are encountered in decomposition methods such as tensor power iterations and alternating least squares. We also design novel colliding hashes for symmetric tensors to further save time in computing the sketches. We then combine these sketching ideas with existing whitening and tensor power iterative techniques to obtain the fastest algorithm on both sparse and dense tensors. The quality of approximation under our method does not depend on properties such as sparsity, uniformity of elements, etc. We apply the method for topic modeling and obtain competitive results.
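The FFT-based contraction the abstract refers to rests on a standard count-sketch identity: the count sketch of a tensor product of vectors equals the circular convolution of the vectors' individual count sketches, which can be computed in the frequency domain. Below is a minimal NumPy sketch of that identity for the order-2 case (an outer product); all variable names and the specific hash construction are illustrative, not taken from the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d, b = 16, 32  # input dimension and sketch length (illustrative sizes)

# Independent hash and sign functions for each mode.
h1, h2 = rng.integers(0, b, d), rng.integers(0, b, d)
s1, s2 = rng.choice([-1, 1], d), rng.choice([-1, 1], d)

def count_sketch(x, h, s, b):
    """Count sketch of vector x: bucket by h, flip signs by s."""
    out = np.zeros(b)
    np.add.at(out, h, s * x)
    return out

u, v = rng.standard_normal(d), rng.standard_normal(d)

# Naive route: sketch the outer product u v^T entry by entry,
# using the combined hash (h1(i)+h2(j)) mod b and sign s1(i)*s2(j).
direct = np.zeros(b)
for i in range(d):
    for j in range(d):
        direct[(h1[i] + h2[j]) % b] += s1[i] * s2[j] * u[i] * v[j]

# FFT route: the same sketch is the circular convolution of the
# two 1-D sketches, computed without forming the d x d tensor.
fast = np.fft.ifft(np.fft.fft(count_sketch(u, h1, s1, b)) *
                   np.fft.fft(count_sketch(v, h2, s2, b))).real

assert np.allclose(direct, fast)
```

The same identity extends to order-3 tensors (multiply three FFTs), which is what makes the power-iteration and ALS contractions cheap: the sketch of each rank-1 update costs O(d + b log b) instead of O(d^3).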

Author Information

Yining Wang (Carnegie Mellon University)
Hsiao-Yu Tung (Carnegie Mellon University)
Alex Smola (Carnegie Mellon University)


Anima Anandkumar (UC Irvine)
