
Bayesian Tensor Networks
Kriton Konstantinidis · Yao Lei Xu · Qibin Zhao · Danilo Mandic

Tensor network (TN) methods have proven their considerable potential in deterministic regression and classification paradigms, but remain underexplored in probabilistic settings. To this end, we introduce a variational inference TN framework for supervised learning, referred to as the Bayesian Tensor Network (BTN). This is achieved by making use of the multi-linear nature of tensor networks to construct a structured variational model which scales linearly with data dimensionality. The so imposed low-rank structure on the tensor mean and Kronecker separability on the local covariances make it possible to efficiently induce weight dependencies in the posterior distribution, thus enhancing model expressiveness at a drastically lower parameter complexity compared to the standard mean-field approach. A comprehensive validation of the proposed approach indicates the competitiveness of BTNs against modern structured Bayesian neural network approaches, while exhibiting enhanced interpretability and efficiency.
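The parameter savings from Kronecker-separable covariances can be illustrated with a minimal NumPy sketch. This is a hypothetical example, not the authors' code: it only shows that a covariance of the form Sigma = A ⊗ B over m·n weights is parameterized by m² + n² entries rather than the (m·n)² of a full covariance, while remaining a valid (symmetric positive-definite) covariance matrix.

```python
import numpy as np

# Hypothetical illustration (not the BTN implementation): a Kronecker-separable
# covariance Sigma = A kron B over m*n weights.
m, n = 4, 5
rng = np.random.default_rng(0)

def random_spd(k):
    # Build a random symmetric positive-definite factor.
    M = rng.standard_normal((k, k))
    return M @ M.T + k * np.eye(k)

A, B = random_spd(m), random_spd(n)
Sigma = np.kron(A, B)  # full (mn x mn) covariance; never materialized in practice

full_params = (m * n) ** 2       # parameters of an unstructured covariance
separable_params = m**2 + n**2   # parameters of the Kronecker factorization
print(full_params, separable_params)  # 400 vs. 41

# The Kronecker product of SPD factors is itself SPD, so Sigma is a valid
# covariance that still encodes dependencies between all m*n weights.
assert np.all(np.linalg.eigvalsh(Sigma) > 0)
```

A mean-field posterior would instead keep only the diagonal of Sigma, discarding all cross-weight dependencies; the Kronecker structure retains them at near-diagonal parameter cost.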

Author Information

Kriton Konstantinidis (Imperial College London)
Yao Lei Xu (Imperial College London)
Qibin Zhao (RIKEN AIP)
Danilo Mandic (Imperial College London)
