Poster
Model Selection for High-Dimensional Regression under the Generalized Irrepresentability Condition
Adel Javanmard · Andrea Montanari
Harrah's Special Events Center, 2nd Floor
Abstract:
In the high-dimensional regression model, a response variable is linearly related to p covariates, but the sample size n is smaller than p. We assume that only a small subset of covariates is `active' (i.e., the corresponding coefficients are non-zero), and consider the model-selection problem of identifying the active covariates. A popular approach is to estimate the regression coefficients through the Lasso (ℓ1-regularized least squares). This is known to correctly identify the active set only if the irrelevant covariates are roughly orthogonal to the relevant ones, as quantified through the so-called `irrepresentability' condition. In this paper we study the `Gauss-Lasso' selector, a simple two-stage method that first solves the Lasso and then performs ordinary least squares restricted to the Lasso active set. We formulate the `generalized irrepresentability condition' (GIC), an assumption that is substantially weaker than irrepresentability. We prove that, under GIC, the Gauss-Lasso correctly recovers the active set.
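As a rough illustration (not the authors' code), the following minimal sketch shows the two-stage Gauss-Lasso procedure on synthetic data, using scikit-learn's Lasso for stage one and an ordinary least-squares fit restricted to the estimated support for stage two. The regularization level alpha and the data dimensions are placeholder choices for illustration only.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Synthetic high-dimensional data: n samples, p covariates, s active coefficients.
rng = np.random.default_rng(0)
n, p, s = 100, 300, 5
X = rng.standard_normal((n, p))
theta_true = np.zeros(p)
theta_true[:s] = 1.0
y = X @ theta_true + 0.1 * rng.standard_normal(n)

# Stage 1: Lasso (ell_1-regularized least squares); alpha is a placeholder choice.
lasso = Lasso(alpha=0.1, fit_intercept=False)
lasso.fit(X, y)
active_set = np.flatnonzero(lasso.coef_)  # estimated active set (nonzero coefficients)

# Stage 2: ordinary least squares restricted to the Lasso active set.
theta_gauss_lasso = np.zeros(p)
if active_set.size > 0:
    theta_gauss_lasso[active_set], *_ = np.linalg.lstsq(X[:, active_set], y, rcond=None)

print("estimated active set:", active_set)
```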