Although first developed for the needs of quantum many-body physics and quantum computing, tensor networks (TNs) are increasingly being deployed to solve a wide range of problems in machine learning, optimization, and applied mathematics. Inspired by the distinct implementation challenges of TN methods in these new settings, we present ContracTN, a lightweight Python library for general-purpose TN calculations. Beyond the use of the dense tensor cores supported in standard TN libraries, ContracTN also supports the use of copy tensors, parameter-free objects which allow diverse concepts like batch computation, elementwise multiplication, and summation to be expressed entirely in the language of TN diagrams. The contraction engine of ContracTN also implements a novel form of stabilization, which largely mitigates the issue of numerical overflow arising from the use of low-precision machine learning libraries for TN contraction. Overall, we wish to popularize a collection of methods which have proven invaluable in implementing efficient and robust TN models, in the hope that this can help catalyze the wider adoption of TN methods for problems in machine learning.
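The role of copy tensors described in the abstract can be illustrated independently of ContracTN itself. The following is a minimal NumPy sketch, not ContracTN's actual API (the function name copy_tensor and all shapes are illustrative assumptions): it builds the order-3 copy (delta) tensor, whose contraction with two vectors reproduces their elementwise product, and whose order-2 version recovers a plain summation, so both operations can be drawn as ordinary tensor-network diagrams.

import numpy as np

def copy_tensor(dim, order=3):
    # Copy (delta) tensor: 1 on the hyper-diagonal, 0 elsewhere.
    delta = np.zeros((dim,) * order)
    idx = np.arange(dim)
    delta[(idx,) * order] = 1.0
    return delta

d = 4
x, y = np.random.randn(d), np.random.randn(d)

# Contracting x and y into two legs of an order-3 copy tensor yields their
# Hadamard (elementwise) product as an ordinary tensor-network contraction.
hadamard = np.einsum('ijk,j,k->i', copy_tensor(d, order=3), x, y)
assert np.allclose(hadamard, x * y)

# Contracting both legs of an order-2 copy tensor (the identity) with x and a
# vector of ones expresses summation of x's entries diagrammatically.
total = np.einsum('ij,i,j->', copy_tensor(d, order=2), x, np.ones(d))
assert np.allclose(total, x.sum())

Batch computation admits the same treatment, with a copy tensor on a batch index broadcasting that index across all cores that share it; ContracTN's own interface for declaring copy tensors may differ from this sketch.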
Author Information
Jacob Miller (Mila - Université de Montréal)
Guillaume Rabusseau (Mila - Université de Montréal)
Related Events (a corresponding poster, oral, or spotlight)
- 2021: ContracTN: A Tensor Network Library Designed for Machine Learning
  Tue. Dec 14th, 07:45 -- 07:50 PM
More from the Same Authors
- 2021 Spotlight: Lower and Upper Bounds on the Pseudo-Dimension of Tensor Network Models
  Behnoush Khavari · Guillaume Rabusseau
- 2021: Few Shot Image Generation via Implicit Autoencoding of Support Sets
  Shenyang Huang · Kuan-Chieh Wang · Guillaume Rabusseau · Alireza Makhzani
- 2021: Rademacher Random Projections with Tensor Networks
  Beheshteh Rakhshan · Guillaume Rabusseau
- 2021: Towards a Trace-Preserving Tensor Network Representation of Quantum Channels
  Siddarth Srinivasan · Sandesh Adhikary · Jacob Miller · Guillaume Rabusseau · Byron Boots
- 2021: Probabilistic Graphical Models and Tensor Networks: A Hybrid Framework
  Jacob Miller · Geoffrey Roeder
- 2022 Poster: High-Order Pooling for Graph Neural Networks with Tensor Decomposition
  Chenqing Hua · Guillaume Rabusseau · Jian Tang
- 2021: Discussion Panel
  Xiao-Yang Liu · Qibin Zhao · Chao Li · Guillaume Rabusseau
- 2021 Workshop: Second Workshop on Quantum Tensor Networks in Machine Learning
  Xiao-Yang Liu · Qibin Zhao · Ivan Oseledets · Yufei Ding · Guillaume Rabusseau · Jean Kossaifi · Khadijeh Najafi · Anwar Walid · Andrzej Cichocki · Masashi Sugiyama
- 2021 Poster: Lower and Upper Bounds on the Pseudo-Dimension of Tensor Network Models
  Behnoush Khavari · Guillaume Rabusseau
- 2020: Invited Talk 9 Q&A by Guillaume
  Guillaume Rabusseau
- 2020: Invited Talk 9: Tensor Network Models for Structured Data
  Guillaume Rabusseau
- 2020: Panel Discussion 1: Theoretical, Algorithmic and Physical
  Jacob Biamonte · Ivan Oseledets · Jens Eisert · Nadav Cohen · Guillaume Rabusseau · Xiao-Yang Liu
- 2017 Poster: Hierarchical Methods of Moments
  Matteo Ruffini · Guillaume Rabusseau · Borja Balle
- 2017 Poster: Multitask Spectral Learning of Weighted Automata
  Guillaume Rabusseau · Borja Balle · Joelle Pineau
- 2016 Poster: Low-Rank Regression with Tensor Responses
  Guillaume Rabusseau · Hachem Kadri