Spotlight
Online Learning in The Manifold of Low-Rank Matrices
Uri Shalit · Daphna Weinshall · Gal Chechik

Wed Dec 08 03:20 PM -- 03:25 PM (PST) @ Regency Ballroom

When learning models that are represented in matrix form, enforcing
a low-rank constraint can dramatically improve memory and run-time
complexity while providing a natural regularization of the model.
However, naive approaches to minimizing functions over the set of
low-rank matrices are either prohibitively time-consuming (repeated
singular value decomposition of the matrix) or numerically unstable
(optimizing a factored representation of the low-rank matrix). We
build on recent advances in optimization over manifolds and describe
an iterative online learning procedure consisting of a gradient step
followed by a second-order retraction back to the manifold. While the
ideal retraction is hard to compute, and so is the projection
operator that approximates it, we describe another second-order
retraction that can be computed efficiently, with run-time and memory
complexity of O((n+m)k) for a rank-k matrix of dimension m x n, given
rank-one gradients. We use this algorithm, LORETA, to learn a
matrix-form similarity measure over pairs of documents represented as
high-dimensional vectors. LORETA improves the mean average precision
over a passive-aggressive approach in a factorized model, and also
improves over a full model trained over pre-selected features with
the same memory requirements. LORETA also shows consistent
improvement over standard methods in a large (1600 classes)
multi-label image classification task.
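
To make the step structure concrete, below is a minimal sketch of the
gradient-step-plus-retraction scheme for a factored rank-k matrix
W = A B^T with a rank-one gradient. It uses a generic QR/SVD-based
rank-k truncation as the retraction, which costs O((n+m)k^2); the
paper's own second-order retraction is different (it is what brings
the cost down to O((n+m)k)), so treat this as an illustration of the
general scheme, not LORETA itself. All names are illustrative.

import numpy as np

def retracted_rank_one_step(A, B, u, v, eta):
    """One online step on the rank-k manifold: a gradient step with a
    rank-one gradient u v^T (step size eta), followed by a retraction
    back to rank k.

    W = A @ B.T is the current rank-k model (A: m x k, B: n x k).
    This sketch retracts via a truncated SVD computed in O((m+n)k^2)
    using thin QR of the extended factors; LORETA's second-order
    retraction differs, but the step structure is the same.
    """
    k = A.shape[1]
    # Represent the updated matrix W' = A B^T - eta u v^T in factored
    # form by appending the rank-one term as an extra column.
    A_ext = np.hstack([A, -eta * u.reshape(-1, 1)])   # m x (k+1)
    B_ext = np.hstack([B, v.reshape(-1, 1)])          # n x (k+1)

    # Thin QR keeps all subsequent work (k+1)-dimensional.
    Qa, Ra = np.linalg.qr(A_ext)
    Qb, Rb = np.linalg.qr(B_ext)

    # SVD of the small (k+1) x (k+1) core, truncated back to rank k,
    # gives the best rank-k approximation of W'.
    U, s, Vt = np.linalg.svd(Ra @ Rb.T)
    A_new = Qa @ (U[:, :k] * s[:k])
    B_new = Qb @ Vt[:k].T
    return A_new, B_new

In the similarity-learning setting described above, W scores a
document pair (p, q) via p^T W q, so the loss gradient with respect
to W is a rank-one outer product of the two document vectors; that is
what makes each online step this cheap.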

Author Information

Uri Shalit (Technion)
Daphna Weinshall (Hebrew University of Jerusalem)
Gal Chechik (NVIDIA, BIU)
