

Talk in Workshop: First Workshop on Quantum Tensor Networks in Machine Learning

Invited Talk 9: Tensor Network Models for Structured Data

Guillaume Rabusseau


Abstract:

In this talk, I will present uniform tensor network models (also known as translation-invariant tensor networks), which are particularly suited for modelling structured data such as sequences and trees. Uniform tensor networks are tensor networks where the core tensors appearing in the decomposition of a given tensor are all equal, which can be seen as a weight-sharing mechanism in tensor networks. In the first part of the talk, I will show how uniform tensor networks naturally represent functions defined over sets of structured objects such as sequences and trees. I will then present how these models relate to classical computational models such as hidden Markov models, weighted automata, second-order recurrent neural networks and context-free grammars. In the second part of the talk, I will present a classical learning algorithm for weighted automata and show how it can be interpreted as a means of converting non-uniform tensor networks into uniform ones. Lastly, I will present ongoing work leveraging the tensor network formalism to design efficient and versatile probabilistic models for sequence data.
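For readers unfamiliar with the construction, the following is a minimal NumPy sketch (not part of the talk) of a uniform tensor network over sequences, written in the weighted-automaton form mentioned in the abstract: a single core tensor is contracted at every position of the sequence, which is exactly the weight-sharing mechanism described above. All names, dimensions, and values are illustrative.

```python
# Minimal sketch: a uniform tensor network over sequences, i.e. a weighted
# finite automaton f(x1...xT) = alpha^T A[x1] ... A[xT] omega, where the
# core tensor A is shared across all positions (weight sharing).
# All dimensions and random values are purely illustrative.
import numpy as np

rng = np.random.default_rng(0)

num_symbols, rank = 3, 4                        # alphabet size and bond dimension (illustrative)
alpha = rng.normal(size=rank)                   # initial weight vector
omega = rng.normal(size=rank)                   # final weight vector
A = rng.normal(size=(num_symbols, rank, rank))  # shared core tensor, one matrix per symbol

def f(sequence):
    """Evaluate the sequence by contracting the same core tensor at each position."""
    state = alpha
    for symbol in sequence:
        state = state @ A[symbol]               # one contraction per symbol, shared parameters
    return state @ omega

print(f([0, 2, 1, 1]))                          # real-valued score assigned to the sequence
```

With nonnegative parameters and suitable normalisation, the same contraction pattern recovers a hidden Markov model, and replacing the linear update by a bilinear map with a nonlinearity gives a second-order recurrent network, which is the correspondence the talk refers to.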