We develop a probabilistic framework for deep learning based on the Deep Rendering Mixture Model (DRMM), a new generative probabilistic model that explicitly captures variations in data due to latent task nuisance variables. We demonstrate that max-sum inference in the DRMM yields an algorithm that exactly reproduces the operations in deep convolutional neural networks (DCNs), providing a first-principles derivation. Our framework provides new insights into the successes and shortcomings of DCNs, as well as a principled route to their improvement. DRMM training via the Expectation-Maximization (EM) algorithm is a powerful alternative to DCN back-propagation, and initial training results are promising. Classification based on the DRMM and other variants outperforms DCNs in supervised digit classification, training 2-3x faster while achieving similar accuracy. Moreover, the DRMM is applicable to semi-supervised and unsupervised learning tasks, achieving results that are state-of-the-art in several categories on the MNIST benchmark and comparable to state-of-the-art on the CIFAR10 benchmark.
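The abstract's central claim, that max-sum inference in the DRMM exactly reproduces the operations of a DCN layer, can be illustrated in miniature. The sketch below is not the authors' implementation; the function name `drmm_max_sum_layer`, the single-channel shapes, and the pooling size are illustrative assumptions. It shows how maximizing the rendering log-likelihood over latent nuisance variables yields the familiar DCN pipeline: template matching gives convolution, maxing out an on/off switching variable gives ReLU, and maxing out a latent translation nuisance gives max-pooling.

```python
# Minimal NumPy sketch (not the authors' code) of max-sum inference in a
# DRMM layer reducing to conv -> ReLU -> max-pool. Shapes, names, and the
# single-channel setup are illustrative assumptions.
import numpy as np

def drmm_max_sum_layer(x, templates, pool=2):
    """One layer of max-sum inference in a DRMM (sketch).

    x:         (H, W) input array
    templates: (K, h, w) rendering templates, one per latent class
    """
    K, h, w = templates.shape
    H, W = x.shape
    out_h, out_w = H - h + 1, W - w + 1

    # Template matching: each template's log-likelihood at each location
    # is an inner product with the local patch, i.e. a convolution.
    scores = np.zeros((K, out_h, out_w))
    for k in range(K):
        for i in range(out_h):
            for j in range(out_w):
                scores[k, i, j] = np.sum(templates[k] * x[i:i+h, j:j+w])

    # Max over the switching nuisance (object rendered vs. not rendered)
    # yields the ReLU nonlinearity: max(score, 0).
    scores = np.maximum(scores, 0.0)

    # Max over the latent translation nuisance yields max-pooling.
    Hp, Wp = out_h // pool, out_w // pool
    pooled = scores[:, :Hp * pool, :Wp * pool].reshape(K, Hp, pool, Wp, pool)
    return pooled.max(axis=(2, 4))

# Example: an 8x8 input with four 3x3 templates gives a (4, 3, 3) output.
x = np.random.randn(8, 8)
templates = np.random.randn(4, 3, 3)
print(drmm_max_sum_layer(x, templates).shape)  # (4, 3, 3)
```

In the paper's terms, each maximization above integrates out one latent nuisance variable of the rendering model, which is what makes the DCN forward pass a max-sum inference algorithm rather than an ad hoc architecture.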
Author Information
Ankit Patel (Baylor College of Medicine and Rice University)
Tan Nguyen (Rice University)
I am currently a postdoctoral scholar in the Department of Mathematics at the University of California, Los Angeles, working with Dr. Stanley J. Osher. I obtained my Ph.D. in Machine Learning from Rice University, where I was advised by Dr. Richard G. Baraniuk. My research focuses on the intersection of Deep Learning, Probabilistic Modeling, Optimization, and ODEs/PDEs. I gave an invited talk at the Deep Learning Theory Workshop at NeurIPS 2018 and organized the 1st Workshop on Integration of Deep Neural Models and Differential Equations at ICLR 2020. I also had two long internships with Amazon AI and NVIDIA Research, during which I worked with Dr. Anima Anandkumar. I am the recipient of the prestigious Computing Innovation Postdoctoral Fellowship (CIFellows) from the Computing Research Association (CRA), the NSF Graduate Research Fellowship, and the IGERT Neuroengineering Traineeship. I received my MSEE and BSEE from Rice in May 2018 and May 2014, respectively.
Richard Baraniuk (Rice University)
More from the Same Authors
- 2020 Workshop: Workshop on Deep Learning and Inverse Problems »
  Reinhard Heckel · Paul Hand · Richard Baraniuk · Lenka Zdeborová · Soheil Feizi
- 2020 Poster: Analytical Probability Distributions and Exact Expectation-Maximization for Deep Generative Networks »
  Randall Balestriero · Sebastien PARIS · Richard Baraniuk
- 2020 Poster: MomentumRNN: Integrating Momentum into Recurrent Neural Networks »
  Tan Nguyen · Richard Baraniuk · Andrea Bertozzi · Stanley Osher · Bao Wang
- 2020 Poster: Neural Networks with Recurrent Generative Feedback »
  Yujia Huang · James Gornet · Sihui Dai · Zhiding Yu · Tan Nguyen · Doris Tsao · Anima Anandkumar
- 2020 Poster: Bongard-LOGO: A New Benchmark for Human-Level Concept Learning and Reasoning »
  Weili Nie · Zhiding Yu · Lei Mao · Ankit Patel · Yuke Zhu · Anima Anandkumar
- 2020 Spotlight: Bongard-LOGO: A New Benchmark for Human-Level Concept Learning and Reasoning »
  Weili Nie · Zhiding Yu · Lei Mao · Ankit Patel · Yuke Zhu · Anima Anandkumar
- 2019 Workshop: Solving inverse problems with deep networks: New architectures, theoretical foundations, and applications »
  Reinhard Heckel · Paul Hand · Richard Baraniuk · Joan Bruna · Alexandros Dimakis · Deanna Needell
- 2019 Poster: The Geometry of Deep Networks: Power Diagram Subdivision »
  Randall Balestriero · Romain Cosentino · Behnaam Aazhang · Richard Baraniuk
- 2018 Workshop: Integration of Deep Learning Theories »
  Richard Baraniuk · Anima Anandkumar · Stephane Mallat · Ankit Patel · nhật Hồ
- 2018 Workshop: Machine Learning for Geophysical & Geochemical Signals »
  Laura Pyrak-Nolte · James Rustad · Richard Baraniuk
- 2017 Workshop: Advances in Modeling and Learning Interactions from Complex Data »
  Gautam Dasarathy · Mladen Kolar · Richard Baraniuk
- 2017 Poster: Learned D-AMP: Principled Neural Network based Compressive Image Recovery »
  Chris Metzler · Ali Mousavi · Richard Baraniuk
- 2016 Workshop: Machine Learning for Education »
  Richard Baraniuk · Jiquan Ngiam · Christoph Studer · Phillip Grimaldi · Andrew Lan
- 2014 Workshop: Human Propelled Machine Learning »
  Richard Baraniuk · Michael Mozer · Divyanshu Vats · Christoph Studer · Andrew E Waters · Andrew Lan
- 2013 Poster: When in Doubt, SWAP: High-Dimensional Sparse Recovery from Correlated Measurements »
  Divyanshu Vats · Richard Baraniuk
- 2011 Poster: SpaRCS: Recovering low-rank and sparse matrices from compressive measurements »
  Andrew E Waters · Aswin C Sankaranarayanan · Richard Baraniuk
- 2009 Workshop: Manifolds, sparsity, and structured models: When can low-dimensional geometry really help? »
  Richard Baraniuk · Volkan Cevher · Mark A Davenport · Piotr Indyk · Bruno Olshausen · Michael B Wakin
- 2008 Poster: Sparse Signal Recovery Using Markov Random Fields »
  Volkan Cevher · Marco F Duarte · Chinmay Hegde · Richard Baraniuk
- 2008 Spotlight: Sparse Signal Recovery Using Markov Random Fields »
  Volkan Cevher · Marco F Duarte · Chinmay Hegde · Richard Baraniuk
- 2007 Poster: Random Projections for Manifold Learning »
  Chinmay Hegde · Richard Baraniuk