High-dimensional regression benefits from sparsity-promoting regularization. Screening rules exploit the known sparsity of the solution by ignoring some variables during optimization, thereby speeding up solvers. When the procedure is guaranteed never to discard relevant features, the rules are said to be safe. In this paper we derive new safe rules for generalized linear models regularized with L1 and L1/L2 norms. The rules are based on duality-gap computations and spherical safe regions whose diameters converge to zero. This allows more variables to be discarded safely, in particular for low regularization parameters. The GAP Safe rule can cope with any iterative solver, and we illustrate its performance on coordinate descent for the multi-task Lasso and for binary and multinomial logistic regression, demonstrating significant speed-ups over previous safe rules on all tested datasets.
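To make the screening mechanism concrete, below is a minimal sketch for the plain Lasso, the simplest instance covered by the paper's framework. It assumes NumPy and the scaling P(beta) = 0.5 ||y - X beta||^2 + lambda ||beta||_1; the function name and structure are illustrative, not taken from the authors' code. A feasible dual point is built by rescaling the residual, the safe sphere centered at that point has radius sqrt(2 * gap) / lambda, and any feature whose test value stays strictly below 1 is provably zero at the optimum.

```python
import numpy as np

def gap_safe_screen(X, y, beta, lmbda):
    """GAP Safe test for the Lasso: return a boolean mask of features that
    are provably zero at the optimum and can be safely discarded.

    Minimal sketch under the scaling
    P(beta) = 0.5 * ||y - X @ beta||^2 + lmbda * ||beta||_1;
    not the authors' implementation.
    """
    r = y - X @ beta                                   # current residual
    # Feasible dual point: rescale the residual so ||X^T theta||_inf <= 1.
    theta = r / max(lmbda, np.max(np.abs(X.T @ r)))
    # Duality gap between the primal objective and its dual at (beta, theta).
    primal = 0.5 * r @ r + lmbda * np.abs(beta).sum()
    dual = 0.5 * y @ y - 0.5 * lmbda**2 * np.sum((theta - y / lmbda) ** 2)
    gap = max(primal - dual, 0.0)
    # Safe sphere centered at theta; its diameter shrinks to zero as the
    # solver converges, which is what makes the rule converging.
    radius = np.sqrt(2.0 * gap) / lmbda
    # Screening test: |x_j^T theta| + radius * ||x_j|| < 1  =>  beta_j^* = 0.
    return np.abs(X.T @ theta) + radius * np.linalg.norm(X, axis=0) < 1.0
```

Running this test every few passes of an iterative solver such as coordinate descent, and restricting subsequent updates to the unscreened columns, is what produces the reported speed-ups: as the gap shrinks, the sphere tightens and more coordinates are eliminated, especially for small regularization parameters.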
Author Information
Eugene Ndiaye (Institut Mines-Télécom, Télécom ParisTech, CNRS LTCI)
Olivier Fercoq (Télécom ParisTech)
Alexandre Gramfort (Télécom ParisTech)
Joseph Salmon (Télécom ParisTech)
More from the Same Authors

- 2021: Pl@ntNet-300K: a plant image dataset with high label ambiguity and a long-tailed distribution
  Camille Garcin · Alexis Joly · Pierre Bonnet · Antoine Affouard · Jean-Christophe Lombardo · Mathias Chouet · Maximilien Servajean · Titouan Lorieul · Joseph Salmon
- 2020 Poster: Modeling Shared responses in Neuroimaging Studies through MultiView ICA
  Hugo Richard · Luigi Gresele · Aapo Hyvarinen · Bertrand Thirion · Alexandre Gramfort · Pierre Ablin
- 2020 Spotlight: Modeling Shared responses in Neuroimaging Studies through MultiView ICA
  Hugo Richard · Luigi Gresele · Aapo Hyvarinen · Bertrand Thirion · Alexandre Gramfort · Pierre Ablin
- 2020 Poster: Statistical control for spatio-temporal MEG/EEG source imaging with desparsified mutli-task Lasso
  Jerome-Alexis Chevalier · Joseph Salmon · Alexandre Gramfort · Bertrand Thirion
- 2019 Poster: Handling correlated and repeated measurements with the smoothed multivariate square-root Lasso
  Quentin Bertrand · Mathurin Massias · Alexandre Gramfort · Joseph Salmon
- 2019 Poster: Learning step sizes for unfolded sparse coding
  Pierre Ablin · Thomas Moreau · Mathurin Massias · Alexandre Gramfort
- 2019 Poster: Stochastic Frank-Wolfe for Composite Convex Minimization
  Francesco Locatello · Alp Yurtsever · Olivier Fercoq · Volkan Cevher
- 2019 Poster: Manifold-regression to predict from MEG/EEG brain signals without source modeling
  David Sabbagh · Pierre Ablin · Gael Varoquaux · Alexandre Gramfort · Denis A. Engemann
- 2018 Poster: Multivariate Convolutional Sparse Coding for Electromagnetic Brain Signals
  Tom Dupré la Tour · Thomas Moreau · Mainak Jas · Alexandre Gramfort
- 2017 Poster: Smooth Primal-Dual Coordinate Descent Algorithms for Nonsmooth Convex Optimization
  Ahmet Alacaoglu · Quoc Tran Dinh · Olivier Fercoq · Volkan Cevher
- 2016 Poster: GAP Safe Screening Rules for Sparse-Group Lasso
  Eugene Ndiaye · Olivier Fercoq · Alexandre Gramfort · Joseph Salmon
- 2016 Poster: Joint quantile regression in vector-valued RKHSs
  Maxime Sangnier · Olivier Fercoq · Florence d'Alché-Buc
- 2015 Poster: Extending Gossip Algorithms to Distributed Estimation of U-statistics
  Igor Colin · Aurélien Bellet · Joseph Salmon · Stéphan Clémençon
- 2015 Spotlight: Extending Gossip Algorithms to Distributed Estimation of U-statistics
  Igor Colin · Aurélien Bellet · Joseph Salmon · Stéphan Clémençon
- 2014 Poster: Probabilistic low-rank matrix completion on finite alphabets
  Jean Lafond · Olga Klopp · Eric Moulines · Joseph Salmon