

Poster

Global Linear and Local Superlinear Convergence of IRLS for Non-Smooth Robust Regression

Liangzu Peng · Christian Kümmerle · Rene Vidal

Hall J (level 1) #310

Keywords: [ Convergence Rate Analysis ] [ Robust Regression ] [ Sparsity ] [ Iteratively Reweighted Least-Squares ] [ Non-smooth optimization ] [ Outliers ]


Abstract: We advance both the theory and practice of robust ℓ_p-quasinorm regression for p ∈ (0, 1] by using novel variants of iteratively reweighted least-squares (IRLS) to solve the underlying non-smooth problem. In the convex case, p = 1, we prove that this IRLS variant converges globally at a linear rate under a mild, deterministic condition on the feature matrix called the stable range space property. In the non-convex case, p ∈ (0, 1), we prove that under a similar condition, IRLS converges locally to the global minimizer at a superlinear rate of order 2 − p; the rate becomes quadratic as p → 0. We showcase the proposed methods in three applications: real phase retrieval, regression without correspondences, and robust face restoration. The results show that (1) IRLS can handle a larger number of outliers than other methods, (2) it is faster than competing methods at the same level of accuracy, and (3) it restores a sparsely corrupted face image with satisfactory visual quality.
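For readers unfamiliar with IRLS, the sketch below illustrates the general idea behind smoothed iteratively reweighted least-squares for min_x ||Ax − b||_p^p: at each iteration, residuals are used to form per-sample weights, and a weighted least-squares problem is solved in closed form. This is a generic, hypothetical illustration with a heuristic smoothing update, not the specific IRLS variant, smoothing rule, or initialization analyzed in the paper; the names `irls_lp_regression`, `eps0`, and `n_iter` are placeholders chosen for the example.

```python
import numpy as np

def irls_lp_regression(A, b, p=1.0, n_iter=50, eps0=1.0, tol=1e-12):
    """Generic smoothed-IRLS sketch for min_x ||Ax - b||_p^p with p in (0, 1].

    Illustrative only: the smoothing update below is a simple heuristic,
    not the rule analyzed in the paper.
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]  # plain least-squares initialization
    eps = eps0
    for _ in range(n_iter):
        r = A @ x - b
        # Heuristic: shrink the smoothing parameter as the residual shrinks.
        eps = min(eps, 0.5 * np.linalg.norm(r, np.inf) + tol)
        # Reweighting: w_i = (r_i^2 + eps^2)^{(p-2)/2}, the smoothed l_p weight.
        w = (r**2 + eps**2) ** ((p - 2) / 2)
        # Weighted least-squares step via the normal equations A^T W A x = A^T W b.
        Aw = A * w[:, None]
        x = np.linalg.solve(A.T @ Aw, Aw.T @ b)
    return x

if __name__ == "__main__":
    # Toy robust-regression instance: a few grossly corrupted responses (outliers).
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 10))
    x_true = rng.standard_normal(10)
    b = A @ x_true
    b[:20] += 50.0 * rng.standard_normal(20)  # sparse outliers
    x_hat = irls_lp_regression(A, b, p=1.0)
    print("recovery error:", np.linalg.norm(x_hat - x_true))
```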
