Substring kernels are classical tools for representing biological sequences or text. However, when large amounts of annotated data are available, models that allow end-to-end training, such as neural networks, are often preferred. Links between recurrent neural networks (RNNs) and substring kernels have recently been drawn by formally showing that RNNs with specific activation functions are points in a reproducing kernel Hilbert space (RKHS). In this paper, we revisit this link by generalizing convolutional kernel networks---originally related to a relaxation of the mismatch kernel---to model gaps in sequences. The result is a new type of recurrent neural network that can be trained end-to-end with backpropagation, or without supervision by using kernel approximation techniques. We show experimentally that our approach is well suited to biological sequences, where it outperforms existing methods on protein classification tasks.
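The gap-allowing substring kernels mentioned above can be computed with a classical dynamic program (the all-subsequences-of-length-k kernel, where each occurrence of a subsequence is downweighted by a decay factor raised to the length of the span it occupies). The sketch below is an illustrative reimplementation of that classical kernel, not the authors' method or code; the function name `ssk` and the parameters `k` (subsequence length) and `lam` (gap-decay factor) are names chosen here for illustration.

```python
def ssk(s, t, k, lam=0.5):
    """Gapped substring kernel of order k between strings s and t.

    Each common subsequence u of length k contributes
    lam**(span in s) * lam**(span in t), so gapped matches are
    penalized but still counted.  Runs in O(k * len(s) * len(t)).
    """
    n, m = len(s), len(t)
    # Kp[i][a][b] holds the auxiliary quantity K'_i(s[:a], t[:b]),
    # which weights length-i subsequences as if they extended to the
    # ends of both prefixes (the standard DP bookkeeping trick).
    Kp = [[[0.0] * (m + 1) for _ in range(n + 1)] for _ in range(k)]
    for a in range(n + 1):
        for b in range(m + 1):
            Kp[0][a][b] = 1.0  # empty subsequence: K'_0 = 1
    for i in range(1, k):
        for a in range(1, n + 1):
            Kpp = 0.0  # running sum K''_i(s[:a], t[:b]) over b
            for b in range(1, m + 1):
                if s[a - 1] == t[b - 1]:
                    # extend a length-(i-1) match by the common character
                    Kpp = lam * (Kpp + lam * Kp[i - 1][a - 1][b - 1])
                else:
                    Kpp = lam * Kpp  # widen the gap on the t side
                Kp[i][a][b] = lam * Kp[i][a - 1][b] + Kpp
    # Final kernel: close a length-k subsequence at every matching pair.
    K = 0.0
    for a in range(1, n + 1):
        for b in range(1, m + 1):
            if s[a - 1] == t[b - 1]:
                K += lam * lam * Kp[k - 1][a - 1][b - 1]
    return K
```

For example, with `lam = 0.5` and `k = 2`, the pair `("ab", "ab")` scores `0.5**4` (spans of 2 on both sides), while `("ab", "axb")` scores `0.5**5`: the gap in `"axb"` stretches the span to 3 and costs one extra factor of `lam`. The paper's convolutional-kernel-network view replaces such exact-match recursions with a smooth relaxation, which is what makes end-to-end training possible.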
Author Information
Dexiong Chen (Inria)
Laurent Jacob (CNRS)
Julien Mairal (Inria)
More from the Same Authors
- 2021 Spotlight: Beyond Tikhonov: faster learning with self-concordant losses, via iterative regularization » Gaspard Beugnot · Julien Mairal · Alessandro Rudi
- 2022 Poster: Non-Convex Bilevel Games with Critical Point Selection Maps » Michael Arbel · Julien Mairal
- 2021 Poster: A Trainable Spectral-Spatial Sparse Coding Model for Hyperspectral Image Restoration » Theo Bodrito · Alexandre Zouaoui · Jocelyn Chanussot · Julien Mairal
- 2021 Poster: Beyond Tikhonov: faster learning with self-concordant losses, via iterative regularization » Gaspard Beugnot · Julien Mairal · Alessandro Rudi
- 2020 Poster: Unsupervised Learning of Visual Features by Contrasting Cluster Assignments » Mathilde Caron · Ishan Misra · Julien Mairal · Priya Goyal · Piotr Bojanowski · Armand Joulin
- 2020 Poster: A Flexible Framework for Designing Trainable Priors with Adaptive Smoothing and Game Encoding » Bruno Lecouat · Jean Ponce · Julien Mairal
- 2020 Discussion Panel: Hugo Larochelle, Finale Doshi-Velez, Devi Parikh, Marc Deisenroth, Julien Mairal, Katja Hofmann, Phillip Isola, and Michael Bowling » Hugo Larochelle · Finale Doshi-Velez · Marc Deisenroth · Devi Parikh · Julien Mairal · Katja Hofmann · Phillip Isola · Michael Bowling
- 2019 Poster: On the Inductive Bias of Neural Tangent Kernels » Alberto Bietti · Julien Mairal
- 2019 Poster: A Generic Acceleration Framework for Stochastic Composite Optimization » Andrei Kulunchakov · Julien Mairal
- 2018 Poster: Unsupervised Learning of Artistic Styles with Archetypal Style Analysis » Daan Wynen · Cordelia Schmid · Julien Mairal
- 2017 Poster: Stochastic Optimization with Variance Reduction for Infinite Datasets with Finite Sum Structure » Alberto Bietti · Julien Mairal
- 2017 Spotlight: Stochastic Optimization with Variance Reduction for Infinite Datasets with Finite Sum Structure » Alberto Bietti · Julien Mairal
- 2017 Poster: Learning Neural Representations of Human Cognition across Many fMRI Studies » Arthur Mensch · Julien Mairal · Danilo Bzdok · Bertrand Thirion · Gael Varoquaux
- 2017 Poster: Invariance and Stability of Deep Convolutional Representations » Alberto Bietti · Julien Mairal
- 2016 Poster: End-to-End Kernel Learning with Supervised Convolutional Kernel Networks » Julien Mairal
- 2015 Poster: A Universal Catalyst for First-Order Optimization » Hongzhou Lin · Julien Mairal · Zaid Harchaoui
- 2014 Poster: Convolutional Kernel Networks » Julien Mairal · Piotr Koniusz · Zaid Harchaoui · Cordelia Schmid
- 2014 Spotlight: Convolutional Kernel Networks » Julien Mairal · Piotr Koniusz · Zaid Harchaoui · Cordelia Schmid
- 2013 Poster: Stochastic Majorization-Minimization Algorithms for Large-Scale Optimization » Julien Mairal
- 2010 Poster: Network Flow Algorithms for Structured Sparsity » Julien Mairal · Rodolphe Jenatton · Guillaume R Obozinski · Francis Bach
- 2008 Poster: SDL: Supervised Dictionary Learning » Julien Mairal · Francis Bach · Jean A Ponce · Guillermo Sapiro · Andrew Zisserman