Poster
Phase transitions for high-dimensional joint support recovery
Sahand N Negahban · Martin J Wainwright
Abstract:
We consider the following instance of transfer learning: given a pair of regression problems, suppose that the regression coefficients share a partially common support, parameterized by the overlap fraction α between the two supports. This set-up suggests the use of ℓ1/ℓ∞-regularized linear regression for recovering the support sets of both regression vectors. Our main contribution is to provide a sharp characterization of the sample complexity of this relaxation, exactly pinning down the minimal sample size required for joint support recovery as a function of the model dimension p, support size s, and overlap α. For measurement matrices drawn from standard Gaussian ensembles, we prove that the joint ℓ1/ℓ∞-regularized method undergoes a phase transition characterized by the order parameter θ(n, p, s, α) = n / [(4 − 3α) s log(p − (2 − α)s)]. More precisely, the probability of successfully recovering both supports converges to 1 for scalings such that θ > 1, and converges to 0 for scalings for which θ < 1. An implication of this threshold is that use of ℓ1/ℓ∞-regularization leads to gains in sample complexity if the overlap parameter is large enough (α > 2/3), but performs worse than a naive approach if α < 2/3. We illustrate the close agreement between these theoretical predictions and the actual behavior in simulations. Thus, our results illustrate both the benefits and dangers associated with block ℓ1/ℓ∞ regularization in high-dimensional inference.
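The threshold behavior described above can be checked numerically. Below is a minimal sketch that evaluates the rescaled sample size for the joint method, assuming the order parameter θ(n, p, s, α) = n / [(4 − 3α) s log(p − (2 − α)s)] as reconstructed from the abstract, and compares it against the classical per-problem Lasso threshold n = 2s log(p − s); the specific numbers (p = 1024, s = 32, n = 250) are illustrative choices, not values from the paper.

```python
import math

def theta_joint(n, p, s, alpha):
    """Rescaled sample size (order parameter) for joint l1/l-inf support
    recovery: success probability -> 1 when theta > 1, -> 0 when theta < 1."""
    return n / ((4 - 3 * alpha) * s * math.log(p - (2 - alpha) * s))

def theta_lasso(n, p, s):
    """Rescaled sample size for the naive approach of solving each regression
    separately with the Lasso, whose known threshold is n = 2 s log(p - s)."""
    return n / (2 * s * math.log(p - s))

# Illustrative scaling: dimension p = 1024, support size s = 32, n = 250 samples.
p, s, n = 1024, 32, 250

# With full overlap (alpha = 1) the joint relaxation is better rescaled than
# the separate Lasso; with no overlap (alpha = 0) it is worse, matching the
# alpha = 2/3 crossover stated in the abstract.
print(theta_joint(n, p, s, alpha=1.0) > theta_lasso(n, p, s))  # joint favored
print(theta_joint(n, p, s, alpha=0.0) < theta_lasso(n, p, s))  # Lasso favored
```

Note that at α = 1 the joint denominator collapses to s log(p − s), exactly half the Lasso requirement, while at α = 0 it grows to 4s log(p − 2s), roughly double it.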