

Poster

Mistake Bounds for Binary Matrix Completion

Mark Herbster · Stephen Pasteris · Massimiliano Pontil

Area 5+6+7+8 #20

Keywords: [ Kernel Methods ] [ Learning Theory ] [ Online Learning ]


Abstract:

We study the problem of completing a binary matrix in an online learning setting. On each trial we predict a matrix entry and then receive the true entry. We propose a Matrix Exponentiated Gradient algorithm [1] to solve this problem. We provide a mistake bound for the algorithm, which scales with the margin complexity [2, 3] of the underlying matrix. The bound suggests an interpretation in which each row of the matrix is a prediction task over a finite set of objects, the columns. Using this interpretation, we show that the algorithm makes a number of mistakes comparable, up to a logarithmic factor, to the number of mistakes made by the Kernel Perceptron with an optimal kernel chosen in hindsight. We discuss applications of the algorithm to predicting as well as the best biclustering in hindsight, and to the problem of predicting the labeling of a graph without knowing the graph in advance.
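To make the online protocol concrete, the following is a minimal sketch of a trial loop driven by a Matrix Exponentiated Gradient style update. The rank-one embedding of an entry, the decision threshold, and the learning rate eta are illustrative assumptions for the sketch, not the exact construction or tuning analysed in the paper.

```python
# Minimal sketch: online binary matrix completion with a
# Matrix Exponentiated Gradient (MEG) style update [1].
# The entry embedding, threshold, and eta are illustrative assumptions.

import numpy as np
from scipy.linalg import expm, logm

def meg_matrix_completion(trials, m, n, eta=0.5):
    """Predict entries of an m x n binary matrix online.

    trials: iterable of (i, j, y) with y in {-1, +1}, revealed one per trial.
    Returns the number of mistakes made.
    """
    d = m + n
    W = np.eye(d) / d          # maximum-entropy start: trace-one PSD matrix
    mistakes = 0
    for i, j, y in trials:
        # Embed entry (i, j) as a rank-one PSD matrix over rows and columns
        # (an assumed, illustrative choice of embedding).
        x = np.zeros(d)
        x[i] = 1.0
        x[m + j] = 1.0
        X = np.outer(x, x) / 2.0

        # Predict from the correlation between W and the embedding,
        # relative to a fixed threshold (1/d here, an assumed value).
        score = np.trace(W @ X) - 1.0 / d
        y_hat = 1 if score >= 0 else -1

        if y_hat != y:
            mistakes += 1
            # MEG update: step in the matrix-logarithm domain, then
            # renormalise so W stays positive semidefinite with unit trace.
            W = expm(logm(W) + eta * y * X)
            W /= np.trace(W)
    return mistakes
```

A usage example would feed the sketch a stream of revealed entries, e.g. `meg_matrix_completion([(0, 1, +1), (2, 0, -1)], m=3, n=2)`, and compare the returned mistake count against the margin-complexity bound discussed in the abstract.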
