Workshop
Fri Dec 11 06:00 AM -- 05:50 PM (PST)
First Workshop on Quantum Tensor Networks in Machine Learning
Xiao-Yang Liu · Qibin Zhao · Jacob Biamonte · Cesar F Caiafa · Paul Pu Liang · Nadav Cohen · Stefan Leichenauer


Quantum tensor networks in machine learning (QTNML) hold great potential to advance AI technologies. Quantum machine learning promises quantum advantages over classical machine learning (potentially exponential speedups in training, quadratic speedups in convergence, etc.), while tensor networks provide powerful simulations of quantum machine learning algorithms on classical computers. As a rapidly growing interdisciplinary area, QTNML may serve as an amplifier for computational intelligence, a transformer for machine learning innovations, and a propeller for AI industrialization.

Tensor networks, contracted networks of factor tensors, have arisen independently in several areas of science and engineering. Such networks appear in the description of physical processes, and an accompanying collection of numerical techniques has elevated quantum tensor networks into variational models of machine learning. Underlying these algorithms is the compression of the high-dimensional data needed to represent quantum states of matter. These compression techniques have recently proven ripe for application to many traditional problems in deep learning. Quantum tensor networks have shown significant power in compactly representing deep neural networks, and in enabling their efficient training and theoretical understanding. More potential QTNML technologies are rapidly emerging, such as the approximation of probability functions and probabilistic graphical models. However, the topic of QTNML is relatively young, and many open problems remain to be explored.
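The compression idea above can be made concrete with a small sketch: the tensor-train (matrix product state) decomposition factors a d-way array into a chain of small 3-way cores via sequential truncated SVDs. This is a minimal NumPy illustration of the idea, not a tuned library; the function names `tensor_train` and `tt_reconstruct` are our own.

```python
import numpy as np

def tensor_train(x, max_rank):
    """Compress a d-way array into tensor-train / MPS cores via
    sequential truncated SVDs (a minimal sketch, not a tuned library)."""
    dims = x.shape
    cores, rank = [], 1
    mat = x.reshape(dims[0], -1)
    for k in range(len(dims) - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(s))                      # truncate the bond rank
        cores.append(u[:, :r].reshape(rank, dims[k], r))
        rank = r
        mat = (s[:r, None] * vt[:r]).reshape(rank * dims[k + 1], -1)
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the chain of cores back into a dense array."""
    out = cores[0]
    for c in cores[1:]:
        out = np.tensordot(out, c, axes=([-1], [0]))   # join neighboring bonds
    return out.reshape([c.shape[1] for c in cores])
```

With `max_rank` large enough the decomposition is exact; lowering it trades accuracy for a parameter count that grows linearly, rather than exponentially, in the number of modes, which is the compression mechanism the paragraph refers to.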

Quantum algorithms are typically described by quantum circuits (quantum computational networks). These circuits are themselves a class of tensor networks, creating an evident interplay between classical tensor network contraction algorithms and the execution of tensor contractions on quantum processors. The modern field of quantum-enhanced machine learning has started to utilize several tools from tensor network theory to create new quantum models of machine learning and to better understand existing ones.
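As a toy illustration of this correspondence, a two-qubit circuit preparing a Bell state can be evaluated on a classical computer by contracting its gate tensors; the sketch below (our own, assuming NumPy's `einsum` as the contraction engine) treats each gate and input state as a node in a tensor network.

```python
import numpy as np

# Gate tensors: a single-qubit gate is a (2, 2) tensor; the CNOT unitary is
# reshaped into a (2, 2, 2, 2) tensor with indices (out1, out2, in1, in2).
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float).reshape(2, 2, 2, 2)
zero = np.array([1.0, 0.0])  # the |0> state of one qubit

# Contracting the network evaluates the circuit CNOT · (H ⊗ I) |00>:
# a, b label the input qubits, c the wire between H and CNOT, i, j the outputs.
psi = np.einsum('a,b,ca,ijcb->ij', zero, zero, H, CNOT)
# psi is the Bell state (|00> + |11>) / sqrt(2)
```

Classical simulators exploit exactly this view: the cost of evaluating a circuit is the cost of contracting the corresponding tensor network, which depends heavily on the contraction order.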

The interplay between tensor networks, machine learning and quantum algorithms is rich. Indeed, this interplay is based not just on numerical methods but on the equivalence of tensor networks to various quantum circuits, rapidly developing algorithms from the mathematics and physics communities for optimizing and transforming tensor networks, and connections to low-rank methods for learning. A merger of tensor network algorithms with state-of-the-art approaches in deep learning is now taking place. A new community is forming, which this workshop aims to foster.

Opening Remarks (Opening)
Invited Talk 1: Tensor Networks as a Data Structure in Probabilistic Modeling and for Learning Dynamical Laws from Data (Talk)
Invited Talk 1 Q&A by Jens (Q&A)
Invited Talk 2: Expressiveness in Deep Learning via Tensor Networks and Quantum Entanglement (Talk)
Invited Talk 2 Q&A by Cohen (Q&A)
Invited Talk 3: Tensor Networks and Counting Problems on the Lattice (Talk)
Invited Talk 3 Q&A by Frank (Q&A)
Invited Talk 4: Quantum in ML and ML in Quantum (Talk)
Invited Talk 4 Q&A by Ivan (Q&A)
Invited Talk 5: Live Presentation of TensorLy By Jean Kossaifi (Talk)
Invited Talk 6: A Century of the Tensor Network Formulation from the Ising Model (Talk)
Invited Talk 6 Q&A by Tomotoshi (Q&A)
Poster 1: Multi-Graph Tensor Networks by Yao Lei Xu (Poster Talk)
Poster 2: High Performance Single-Site Finite DMRG on GPUs by Hao Hong (Poster Talk)
Poster 3: Variational Quantum Circuit Model for Knowledge Graph Embeddings by Yunpu Ma (Poster Talk)
Poster 4: Hybrid quantum-classical classifier based on tensor network and variational quantum circuit by Samuel Yen-Chi Chen (Poster Talk)
Poster 5: A Neural Matching Model based on Quantum Interference and Quantum Many-body System (Poster Talk)
Contributed Talk 1: Paper 3: Tensor network approaches for data-driven identification of non-linear dynamical laws (Talk)
Contributed Talk 2: Paper 6: Anomaly Detection with Tensor Networks (Talk)
Contributed Talk 3: Paper 32: High-order Learning Model via Fractional Tensor Network Decomposition (Talk)
Panel Discussion 1: Theoretical, Algorithmic and Physical (Discussion Panel)
Break
Panel Discussion 2: Software and High Performance Implementation (Discussion Panel)
Break
Invited Talk 7: cuTensor: High-Performance CUDA Tensor Primitives (Talk)
Invited Talk 7 Q&A by Paul (Q&A)
Invited Talk 8: TensorNetwork: A Python Package for Tensor Network Computations (Talk)
Invited Talk 8 Q&A by Martin (Q&A)
Invited Talk 9: Tensor Network Models for Structured Data (Talk)
Invited Talk 9 Q&A by Guillaume (Q&A)
Invited Talk 10: Getting Started with Tensor Networks (Talk)
Invited Talk 10 Q&A by Evenbly (Q&A)
Contributed Talk 4: Paper 27: Limitations of gradient-based Born Machine over tensor networks on learning quantum nonlocality (Talk)
Contributed Talk 5: Paper 19: Deep convolutional tensor network (Talk)
Poster 6: Paper 16: Quantum Tensor Networks for Variational Reinforcement Learning (Poster Talk)
Poster 7: Paper 13: Quantum Tensor Networks, Stochastic Processes, and Weighted Automata (Poster Talk)
Poster 8: Paper 24: Modeling Natural Language via Quantum Many-body Wave Function and Tensor Network (Poster Talk)
Invited Talk 11: Tensor Methods for Efficient and Interpretable Spatiotemporal Learning (Talk)
Invited Talk 11 Q&A by Rose (Q&A)
Invited Talk 12: Learning Quantum Channels with Tensor Networks (Talk)
Invited Talk 12: Q&A (Q&A)
Closing Remarks (Talk)