Complex time-varying systems are often studied by abstracting away from the dynamics of individual components and building a model of the population-level dynamics from the start. However, when building a population-level description, it can be easy to lose sight of each individual and how it contributes to the larger picture. In this paper, we present a novel transformer architecture for learning from time-varying data that builds descriptions of both the individual and the collective population dynamics. Rather than combining all of our data into our model at the outset, we develop a separable architecture that operates on individual time series first before passing them forward; this induces a permutation-invariance property and can be used to transfer across systems of different size and order. After demonstrating that our model can successfully recover complex interactions and dynamics in many-body systems, we apply our approach to populations of neurons in the nervous system. On neural activity datasets, we show that our model not only yields robust decoding performance, but also achieves strong transfer performance across recordings of different animals without any neuron-level correspondence. By enabling flexible pre-training that can be transferred to neural recordings of different size and order, our work provides a first step towards creating a foundation model for neural decoding.
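Below is a minimal sketch, assuming PyTorch and binned spike counts as input, of the separable design the abstract describes: a temporal transformer encodes each channel's time series independently, then a second stage attends across channels. The class and variable names (SeparableEncoder, temporal, spatial) are illustrative assumptions, not identifiers from the authors' code. Because no positional encoding is applied along the channel axis, the pooled population summary is invariant to channel (e.g., neuron) ordering and accepts recordings with any number of channels.

```python
# Minimal sketch of a separable (temporal-then-spatial) transformer encoder.
# Assumed names and hyperparameters; not the authors' implementation.
import torch
import torch.nn as nn


class SeparableEncoder(nn.Module):
    def __init__(self, d_model=64, n_heads=4, n_layers=2, max_time=256):
        super().__init__()
        self.embed = nn.Linear(1, d_model)  # lift scalar samples to d_model features
        self.time_pos = nn.Parameter(0.02 * torch.randn(1, max_time, d_model))
        # Temporal stage: applied to each channel's time series independently.
        self.temporal = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True), n_layers
        )
        # Population stage: attends across channels; no channel positional encoding,
        # so the pooled output is invariant to channel permutations.
        self.spatial = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True), n_layers
        )

    def forward(self, x):
        # x: (batch, channels, time), e.g. binned spike counts per neuron
        b, c, t = x.shape
        h = self.embed(x.reshape(b * c, t, 1)) + self.time_pos[:, :t]  # (b*c, t, d)
        h = self.temporal(h).mean(dim=1)                               # per-channel summary
        h = self.spatial(h.reshape(b, c, -1))                          # mix across channels
        return h.mean(dim=1)                                           # population-level code


if __name__ == "__main__":
    model = SeparableEncoder()
    spikes = torch.randn(8, 120, 50)   # 8 trials, 120 neurons, 50 time bins
    print(model(spikes).shape)         # torch.Size([8, 64])
```

Mean-pooling over channels is one simple way to make the readout insensitive to neuron identity; attention-based pooling or per-channel readouts could be swapped in when channel-level outputs are needed.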
Author Information
Ran Liu (Georgia Institute of Technology)
I am a 4th-year Ph.D. student in the Machine Learning Program at Georgia Tech. I conduct my research in the Neural Data Science Lab, advised by Prof. Eva Dyer. My research interests lie at the intersection of Machine (Deep) Learning, Computational Neuroscience, and Computer Vision.
Mehdi Azabou (Georgia Institute of Technology)
Max Dabagia (Georgia Institute of Technology)
Jingyun Xiao (Georgia Institute of Technology)
Eva Dyer (Georgia Institute of Technology)
More from the Same Authors
- 2021 : Neural Latents Benchmark ’21: Evaluating latent variable models of neural population activity
  Felix Pei · Joel Ye · David Zoltowski · Anqi Wu · Raeed Chowdhury · Hansem Sohn · Joseph O'Doherty · Krishna V Shenoy · Matthew Kaufman · Mark Churchland · Mehrdad Jazayeri · Lee Miller · Jonathan Pillow · Il Memming Park · Eva Dyer · Chethan Pandarinath
- 2022 Poster: MTNeuro: A Benchmark for Evaluating Representations of Brain Structure Across Multiple Levels of Abstraction
  Jorge Quesada · Lakshmi Sathidevi · Ran Liu · Nauman Ahad · Joy Jackson · Mehdi Azabou · Jingyun Xiao · Christopher Liding · Matthew Jin · Carolina Urzay · William Gray-Roncal · Erik Johnson · Eva Dyer
- 2021 : Contributed talk 3
  Mehdi Azabou
- 2021 Oral: Drop, Swap, and Generate: A Self-Supervised Approach for Generating Neural Activity
  Ran Liu · Mehdi Azabou · Max Dabagia · Chi-Heng Lin · Mohammad Gheshlaghi Azar · Keith Hengen · Michal Valko · Eva Dyer
- 2021 Poster: Drop, Swap, and Generate: A Self-Supervised Approach for Generating Neural Activity
  Ran Liu · Mehdi Azabou · Max Dabagia · Chi-Heng Lin · Mohammad Gheshlaghi Azar · Keith Hengen · Michal Valko · Eva Dyer
- 2019 Poster: Hierarchical Optimal Transport for Multimodal Distribution Alignment
  John Lee · Max Dabagia · Eva Dyer · Christopher Rozell
- 2017 : Closing Panel: Analyzing brain data from nano to macroscale
  William Gray Roncal · Eva Dyer
- 2017 : Opening Remarks
  Eva Dyer · William Gray Roncal
- 2017 Workshop: BigNeuro 2017: Analyzing brain data from nano to macroscale
  Eva Dyer · Gregory Kiar · William Gray Roncal · Konrad P Koerding · Joshua T Vogelstein
- 2016 : Eva Dyer
  Eva Dyer
- 2016 Workshop: Brains and Bits: Neuroscience meets Machine Learning
  Alyson Fletcher · Eva Dyer · Jascha Sohl-Dickstein · Joshua T Vogelstein · Konrad Koerding · Jakob H Macke
- 2015 Workshop: BigNeuro 2015: Making sense of big neural data
  Eva Dyer · Joshua T Vogelstein · Konrad Koerding · Jeremy Freeman · Andreas S. Tolias