Poster
Supervised Sparse Analysis and Synthesis Operators
Pablo Sprechmann · Roee Litman · Tal Ben Yakar · Alexander M Bronstein · Guillermo Sapiro

Thu Dec 5th 07:00 -- 11:59 PM @ Harrah's Special Events Center, 2nd Floor

In this paper, we propose a new and computationally efficient framework for learning sparse models. We formulate a unified approach that contains as particular cases models promoting synthesis- and analysis-type sparse priors, and mixtures thereof. The supervised training of the proposed model is formulated as a bilevel optimization problem, in which the operators are optimized to achieve the best possible performance on a specific task, e.g., reconstruction or classification. By restricting the operators to be shift-invariant, our approach can be thought of as a way of learning analysis+synthesis sparsity-promoting convolutional operators. Leveraging recent ideas on fast trainable regressors designed to approximate exact sparse codes, we propose a way of constructing feed-forward neural networks capable of approximating the learned models at a fraction of the computational cost of exact solvers. In the shift-invariant case, this leads to a principled way of constructing task-specific convolutional networks. We illustrate the proposed models in several experiments on music analysis and image processing applications.
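
The "fast trainable regressors" referenced in the abstract follow the LISTA idea (Gregor & LeCun, 2010): unroll a few iterations of an iterative soft-thresholding solver into a feed-forward network with learnable weights. The following is only a minimal NumPy sketch of such an unrolled encoder; the function names, shapes, and parameter values are illustrative assumptions and do not reproduce the paper's unified analysis+synthesis model or its bilevel training.

```python
import numpy as np

def soft_threshold(x, theta):
    # Elementwise soft-thresholding: the proximal operator of the l1 norm.
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def lista_encode(x, W, S, theta, n_layers=5):
    """Sketch of a LISTA-style unrolled sparse encoder.

    W, S, and theta play the role of learnable parameters; in a supervised
    setting they would be trained end-to-end for the target task instead of
    being fixed analytically from a dictionary.
    """
    z = soft_threshold(W @ x, theta)          # first layer
    for _ in range(n_layers - 1):
        z = soft_threshold(W @ x + S @ z, theta)  # recurrent refinement
    return z

# Hypothetical usage: encode a 64-dimensional signal with 128 atoms.
rng = np.random.default_rng(0)
x = rng.standard_normal(64)
W = 0.1 * rng.standard_normal((128, 64))
S = 0.1 * rng.standard_normal((128, 128))
z = lista_encode(x, W, S, theta=0.1)
```

In the shift-invariant case described in the abstract, the dense matrices W and S would be replaced by convolutions, yielding a task-specific convolutional network.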

Author Information

Pablo Sprechmann (Duke University)
Roee Litman (Tel Aviv University)
Tal Ben Yakar (Tel Aviv University)
Alexander M Bronstein (Tel Aviv University)
Guillermo Sapiro (Duke University)