Algorithmic Regularization in Tensor Optimization: Towards a Lifted Approach in Matrix Sensing

Ziye Ma · Javad Lavaei · Somayeh Sojoudi

Great Hall & Hall B1+B2 (level 1) #1201
Thu 14 Dec 8:45 a.m. PST — 10:45 a.m. PST


Gradient descent (GD) is crucial for generalization in machine learning models, as it induces implicit regularization that promotes compact representations. In this work, we examine the role of GD in inducing implicit regularization for tensor optimization, particularly within the context of the lifted matrix sensing framework. This framework was recently proposed to address the non-convex matrix sensing problem by transforming spurious solutions into strict saddles when optimizing over symmetric, rank-1 tensors. We show that, with a sufficiently small initialization scale, GD applied to this lifted problem yields approximate rank-1 tensors and critical points with escape directions. Our findings underscore the significance of the tensor parametrization of matrix sensing, in combination with first-order methods, for achieving global optimality in such problems.
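The lifted tensor formulation itself is not reproduced here, but the underlying phenomenon the abstract describes, implicit regularization of GD under small initialization, can be illustrated on the standard (unlifted) factorized matrix sensing problem. The sketch below is an assumption-laden toy example, not the paper's method: it recovers a rank-1 ground-truth matrix from random Gaussian measurements via GD on an over-parametrized factorization X = UUᵀ, starting from a tiny random initialization, and then checks that the iterate is approximately rank-1. All dimensions, step sizes, and variable names are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, r_search = 5, 60, 5  # dimension, number of measurements, search rank

# Ground-truth rank-1 matrix X* = z z^T
z = rng.standard_normal(n)
X_star = np.outer(z, z)

# Symmetrized Gaussian sensing matrices and measurements y_i = <A_i, X*>
A = rng.standard_normal((m, n, n))
A = (A + A.transpose(0, 2, 1)) / 2
y = np.einsum('ijk,jk->i', A, X_star)

# Over-parametrized factorization X = U U^T with SMALL random initialization;
# the small scale is what drives the implicit bias toward low rank.
U = 1e-3 * rng.standard_normal((n, r_search))

eta = 0.002  # step size (illustrative)
for _ in range(20000):
    resid = np.einsum('ijk,jk->i', A, U @ U.T) - y       # measurement residuals
    grad = (4.0 / m) * np.einsum('i,ijk->jk', resid, A) @ U  # grad of (1/m)*sum resid^2
    U -= eta * grad

X_hat = U @ U.T
rel_err = np.linalg.norm(X_hat - X_star) / np.linalg.norm(X_star)
svals = np.linalg.svd(X_hat, compute_uv=False)
print(f"relative error: {rel_err:.2e}")
print(f"singular-value ratio s2/s1: {svals[1] / svals[0]:.2e}")
```

Although the search rank is full (r_search = n), the small initialization keeps all but the leading direction suppressed, so the iterates stay close to the rank-1 manifold, a matrix-level analogue of the approximate rank-1 tensors the abstract refers to in the lifted setting.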
