Poster

Supervised Learning with Tensor Networks

Edwin M Stoudenmire · David Schwab

Area 5+6+7+8 #31

Keywords: [ Sparsity and Feature Selection ] [ Kernel Methods ]


Abstract:

Tensor networks are approximations of high-order tensors which are efficient to work with and have been very successful for physics and mathematics applications. We demonstrate how algorithms for optimizing tensor networks can be adapted to supervised learning tasks by using matrix product states (tensor trains) to parameterize non-linear kernel learning models. For the MNIST data set we obtain less than 1% test set classification error. We discuss an interpretation of the additional structure imparted by the tensor network to the learned model.
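To make the model class in the abstract concrete, below is a minimal NumPy sketch: each pixel is mapped to a small local feature vector, and a matrix product state (tensor train) carrying one extra label index is contracted against the product of those vectors to produce one score per class. The function names (`local_feature`, `init_mps`, `mps_scores`), the specific two-component feature map, and the random initialization are illustrative assumptions, and the sketch only evaluates the model; the paper's contribution is the sweeping optimization of the MPS tensors, which is not shown here.

```python
import numpy as np

def local_feature(x):
    """Map a pixel value x in [0, 1] to a 2-component local feature vector
    (one common choice in this setting; treated here as an assumption)."""
    return np.array([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)])

def init_mps(n_sites, bond_dim, phys_dim=2, n_labels=10, label_site=None, seed=0):
    """Random MPS (tensor train) with an extra label index on one tensor."""
    rng = np.random.default_rng(seed)
    if label_site is None:
        label_site = n_sites // 2
    tensors = []
    for j in range(n_sites):
        left = 1 if j == 0 else bond_dim
        right = 1 if j == n_sites - 1 else bond_dim
        if j == label_site:
            shape = (left, phys_dim, n_labels, right)   # extra class-label index
        else:
            shape = (left, phys_dim, right)
        tensors.append(rng.normal(scale=0.1, size=shape))
    return tensors, label_site

def mps_scores(tensors, label_site, pixels):
    """Contract the MPS with the product of per-pixel feature vectors,
    returning one score per class label."""
    env = np.ones((1,))          # left boundary vector
    has_label = False
    for j, (A, x) in enumerate(zip(tensors, pixels)):
        phi = local_feature(x)
        if j == label_site:
            # Keep the label index (c) open while absorbing the feature vector.
            M = np.einsum('s,lscr->lcr', phi, A)
            env = np.einsum('l,lcr->cr', env, M)
            has_label = True
        else:
            M = np.einsum('s,lsr->lr', phi, A)
            env = env @ M if not has_label else np.einsum('cl,lr->cr', env, M)
    return env[:, 0]             # shape (n_labels,): the class scores

# Toy usage: score a flattened 14x14 "image" of random pixel values.
n_pixels = 14 * 14
tensors, label_site = init_mps(n_pixels, bond_dim=10)
pixels = np.random.default_rng(1).uniform(size=n_pixels)
scores = mps_scores(tensors, label_site, pixels)
predicted_class = int(np.argmax(scores))
```

The key design point this illustrates is that the weight tensor of the kernel model is never formed explicitly; only the per-site MPS tensors are stored, so the cost of evaluating the model scales linearly in the number of pixels for a fixed bond dimension.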
