How does a neural population process sensory information? Optimal coding theories assume that neural tuning curves are adapted to the prior distribution of the stimulus variable. Most previous work has considered optimal solutions only for one-dimensional stimulus variables. Here, we extend these ideas and present new solutions that define optimal tuning curves for high-dimensional stimulus variables. We consider the minimal case where the number of neurons in the population equals the number of stimulus dimensions (a diffeomorphic code). For two-dimensional stimulus variables, we analytically derive optimal solutions under different optimality criteria, such as minimal L2 reconstruction error or maximal mutual information. For higher-dimensional cases, we provide a learning rule that improves the population code.
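For intuition, the classical one-dimensional infomax result (discussed in the authors' related 2012 work on optimal tuning curves) says the optimal tuning curve is a monotonic function of the stimulus prior's cumulative distribution, so response range is allocated in proportion to stimulus probability. The sketch below illustrates only this 1-D special case, not the paper's high-dimensional solutions; the function name and parameters are hypothetical.

```python
import numpy as np

def infomax_tuning_curve(s_grid, prior, r_max=1.0):
    """Illustrative 1-D infomax sketch: map the stimulus grid through the
    prior's CDF, so the tuning curve rises fastest where stimuli are likely."""
    ds = s_grid[1] - s_grid[0]
    pdf = prior / np.sum(prior * ds)   # normalize the prior to a density
    cdf = np.cumsum(pdf) * ds          # cumulative distribution on the grid
    return r_max * cdf                 # firing rate saturates at r_max

# Example: Gaussian prior over a 1-D stimulus variable
s = np.linspace(-5, 5, 1001)
prior = np.exp(-0.5 * s**2)
h = infomax_tuning_curve(s, prior)
```

Here `h` is monotonically nondecreasing and steepest near the prior's mode, which is the resource-allocation intuition the abstract generalizes to higher dimensions.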
Author Information
Zhuo Wang (Facebook Reality Labs)
Alan A Stocker (University of Pennsylvania)
Daniel Lee (Samsung Research/Cornell University)
More from the Same Authors
- 2020 Expo Talk Panel: Building Neural Interfaces: When Real and Artificial Neurons Meet »
  Ricardo Monti · Nathalie T.H Gayraud · Jeffrey Seely · Zhuo Wang · Tugce Tasci · Rebekkah Hogan
- 2017: Poster Session (encompasses coffee break) »
  Beidi Chen · Borja Balle · Daniel Lee · iuri frosio · Jitendra Malik · Jan Kautz · Ke Li · Masashi Sugiyama · Miguel A. Carreira-Perpinan · Ramin Raziperchikolaei · Theja Tulabandhula · Yung-Kyun Noh · Adams Wei Yu
- 2017 Poster: Generative Local Metric Learning for Kernel Regression »
  Yung-Kyun Noh · Masashi Sugiyama · Kee-Eung Kim · Frank Park · Daniel Lee
- 2016 Poster: Human Decision-Making under Limited Time »
  Pedro Ortega · Alan A Stocker
- 2016 Poster: Efficient Neural Codes under Metabolic Constraints »
  Zhuo Wang · Xue-Xin Wei · Alan A Stocker · Daniel Lee
- 2016 Poster: Maximizing Influence in an Ising Network: A Mean-Field Optimal Solution »
  Christopher W Lynn · Daniel Lee
- 2014 Workshop: Novel Trends and Applications in Reinforcement Learning »
  Csaba Szepesvari · Marc Deisenroth · Sergey Levine · Pedro Ortega · Brian Ziebart · Emma Brunskill · Naftali Tishby · Gerhard Neumann · Daniel Lee · Sridhar Mahadevan · Pieter Abbeel · David Silver · Vicenç Gómez
- 2013 Poster: Optimal integration of visual speed across different spatiotemporal frequency channels »
  Matjaz Jogan · Alan A Stocker
- 2012 Poster: Optimal Neural Tuning Curves for Arbitrary Stimulus Distributions: Discrimax, Infomax and Minimum $L_p$ Loss »
  Zhuo Wang · Alan A Stocker · Daniel Lee
- 2012 Poster: Diffusion Decision Making for Adaptive k-Nearest Neighbor Classification »
  Yung-Kyun Noh · Frank Park · Daniel Lee
- 2012 Poster: Efficient coding connects prior and likelihood function in perceptual Bayesian inference »
  Xue-Xin Wei · Alan A Stocker
- 2010 Poster: Learning via Gaussian Herding »
  Yacov Crammer · Daniel Lee
- 2010 Poster: Generative Local Metric Learning for Nearest Neighbor Classification »
  Yung-Kyun Noh · Byoung-Tak Zhang · Daniel Lee
- 2008 Poster: Extended Grassmann Kernels for Subspace-Based Learning »
  Jihun Hamm · Daniel Lee
- 2007 Session: Session 8: Neuroscience I »
  Alan A Stocker
- 2007 Oral: Blind channel identification for speech dereverberation using l1-norm sparse learning »
  Yuanqing Lin · Jingdong Chen · Youngmoo E Kim · Daniel Lee
- 2007 Poster: Blind channel identification for speech dereverberation using l1-norm sparse learning »
  Yuanqing Lin · Jingdong Chen · Youngmoo E Kim · Daniel Lee
- 2007 Poster: A Bayesian Model of Conditioned Perception »
  Alan A Stocker · Eero Simoncelli