

Poster
in
Workshop: Second Workshop on Quantum Tensor Networks in Machine Learning

Bayesian Tensor Networks

Kriton Konstantinidis · Yao Lei Xu · Qibin Zhao · Danilo Mandic


Abstract:

Tensor network (TN) methods have proven their considerable potential in deterministic regression and classification paradigms, but remain underexplored in probabilistic settings. To this end, we introduce a variational inference TN framework for supervised learning, referred to as the Bayesian Tensor Network (BTN). This is achieved by exploiting the multi-linear nature of tensor networks to construct a structured variational model which scales linearly with data dimensionality. The low-rank structure thus imposed on the tensor mean, together with Kronecker separability of the local covariances, makes it possible to efficiently induce weight dependencies in the posterior distribution, enhancing model expressiveness at a drastically lower parameter complexity than the standard mean-field approach. A comprehensive validation of the proposed approach indicates the competitiveness of BTNs against modern structured Bayesian neural network approaches, while exhibiting enhanced interpretability and efficiency.
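To illustrate the parameter-complexity claim, the sketch below (hypothetical code, not from the paper) counts variational parameters for a tensor-train-style weight posterior with a low-rank mean and one Kronecker covariance factor per core mode, versus a dense mean-field Gaussian over the full weight tensor. The tensor-train format and uniform internal rank are assumptions made for illustration.

```python
def tt_ranks(n_modes, rank):
    # Boundary ranks are 1 in a tensor-train decomposition;
    # internal ranks are set uniformly for simplicity.
    return [1] + [rank] * (n_modes - 1) + [1]

def btn_param_count(dims, rank):
    """Parameter count for a structured TN posterior (illustrative):
    low-rank tensor-train means, plus a Kronecker-separable covariance
    with one dense factor per mode of each core."""
    ranks = tt_ranks(len(dims), rank)
    mean = sum(ranks[k] * dims[k] * ranks[k + 1] for k in range(len(dims)))
    cov = sum(ranks[k] ** 2 + dims[k] ** 2 + ranks[k + 1] ** 2
              for k in range(len(dims)))
    return mean, cov

def mean_field_param_count(dims):
    """Dense mean-field Gaussian: one mean and one variance per weight."""
    n = 1
    for d in dims:
        n *= d
    return 2 * n
```

For example, with ten modes of dimension 4 and rank 8, `btn_param_count([4] * 10, 8)` gives (2112, 1314), roughly 3.4K parameters in total, whereas `mean_field_param_count([4] * 10)` gives 2,097,152: the structured posterior scales linearly in the number of modes while the dense one grows exponentially, which is the scaling behaviour the abstract describes.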
