Poster
A Residual Bootstrap for High-Dimensional Regression with Near Low-Rank Designs
Miles Lopes
We study the residual bootstrap (RB) method in the context of high-dimensional linear regression. Specifically, we analyze the distributional approximation of linear contrasts $c^{\top}(\hat{\beta}_{\rho}-\beta)$, where $\hat{\beta}_{\rho}$ is a ridge-regression estimator. When regression coefficients are estimated via least squares, classical results show that RB consistently approximates the laws of contrasts, provided that $p\ll n$, where the design matrix is of size $n\times p$. Up to now, relatively little work has considered how additional structure in the linear model may extend the validity of RB to the setting where $p/n\asymp 1$. In this setting, we propose a version of RB that resamples residuals obtained from ridge regression. Our main structural assumption on the design matrix is that it is nearly low-rank, in the sense that its singular values decay according to a power-law profile. Under a few extra technical assumptions, we derive a simple criterion for ensuring that RB consistently approximates the law of a given contrast. We then specialize this result to study confidence intervals for mean response values $X_i^{\top} \beta$, where $X_i^{\top}$ is the $i$th row of the design. More precisely, we show that conditionally on a Gaussian design with near low-rank structure, RB \emph{simultaneously} approximates all of the laws $X_i^{\top}(\hat{\beta}_{\rho}-\beta)$, $i=1,\dots,n$. This result is also notable as it imposes no sparsity assumptions on $\beta$. Furthermore, since our consistency results are formulated in terms of the Mallows (Kantorovich) metric, the existence of a limiting distribution is not required.
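The general residual-bootstrap recipe described above can be sketched as follows. This is a minimal illustration with simulated data, not the paper's exact procedure: the problem sizes, the ridge penalty `rho`, and the number of bootstrap replicates `B` are all assumed values chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated design and response (illustrative sizes with p/n close to 1).
n, p, rho = 200, 150, 1.0          # rho: ridge penalty (assumed value)
X = rng.standard_normal((n, p))
beta = rng.standard_normal(p)
y = X @ beta + rng.standard_normal(n)
c = rng.standard_normal(p)         # contrast vector

def ridge(X, y, rho):
    """Ridge-regression estimator (X'X + rho I)^{-1} X'y."""
    return np.linalg.solve(X.T @ X + rho * np.eye(X.shape[1]), X.T @ y)

beta_hat = ridge(X, y, rho)
resid = y - X @ beta_hat
resid_centered = resid - resid.mean()   # center the ridge residuals

# Resample centered residuals to approximate the law of c^T(beta_hat - beta).
B = 500
draws = np.empty(B)
for b in range(B):
    y_star = X @ beta_hat + rng.choice(resid_centered, size=n, replace=True)
    beta_star = ridge(X, y_star, rho)
    draws[b] = c @ (beta_star - beta_hat)

# Bootstrap quantiles yield an approximate 95% confidence interval.
lo, hi = np.quantile(draws, [0.025, 0.975])
```

Taking `c` to be a row $X_i^{\top}$ of the design gives the mean-response intervals discussed in the abstract.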
Author Information
Miles Lopes (UC Davis)
Related Events (a corresponding poster, oral, or spotlight)

2014 Spotlight: A Residual Bootstrap for High-Dimensional Regression with Near Low-Rank Designs »
Wed. Dec 10th, 03:10 - 03:30 PM, Room: Level 2, Room 210
More from the Same Authors

2011 Poster: A More Powerful Two-Sample Test in High Dimensions using Random Projection »
Miles Lopes · Laurent Jacob · Martin J Wainwright