We perform a data-driven dimensionality reduction of the 4-point vertex function characterizing the functional Renormalization Group (fRG) flow for the widely studied two-dimensional t-t' Hubbard model on the square lattice. We show that a deep learning architecture based on Neural Ordinary Differential Equations efficiently learns the evolution of low-dimensional latent variables in all relevant magnetic and d-wave superconducting regimes of the Hubbard model. Ultimately, our work uses an encoder-decoder architecture to extract compact representations of the 4-point vertex functions of correlated electrons, a goal of utmost importance for the success of cutting-edge methods for tackling the many-electron problem.
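To illustrate the kind of architecture described above, the sketch below pairs an encoder-decoder with a Neural ODE that evolves low-dimensional latent variables along the fRG flow parameter. This is not the authors' implementation: it assumes PyTorch with the torchdiffeq package, and all layer sizes, the latent dimension, and the names (`LatentFlow`, `VertexAutoencoderODE`) are hypothetical placeholders.

```python
# Minimal sketch (not the authors' code): an encoder-decoder whose latent
# variables are evolved by a Neural ODE along the fRG flow parameter Lambda.
# Assumes PyTorch and torchdiffeq; all sizes and names are hypothetical.
import torch
import torch.nn as nn
from torchdiffeq import odeint


class LatentFlow(nn.Module):
    """Right-hand side dz/dLambda = f_theta(Lambda, z) for the latent variables."""

    def __init__(self, latent_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + 1, 64), nn.Tanh(), nn.Linear(64, latent_dim)
        )

    def forward(self, lam, z):
        # Broadcast the scalar flow parameter to a column and append it to z.
        lam_col = lam * torch.ones_like(z[..., :1])
        return self.net(torch.cat([z, lam_col], dim=-1))


class VertexAutoencoderODE(nn.Module):
    """Compress a flattened 4-point vertex, evolve it in latent space, decode it back."""

    def __init__(self, vertex_dim: int, latent_dim: int = 8):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(vertex_dim, 256), nn.ReLU(),
                                     nn.Linear(256, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                     nn.Linear(256, vertex_dim))
        self.flow = LatentFlow(latent_dim)

    def forward(self, vertex0, lambdas):
        z0 = self.encoder(vertex0)           # latent state at the initial scale
        zs = odeint(self.flow, z0, lambdas)  # integrate dz/dLambda along the flow
        return self.decoder(zs)              # reconstructed vertex at each scale


# Toy usage: a batch of 4 initial (flattened) vertices, 10 points along the flow.
model = VertexAutoencoderODE(vertex_dim=1024, latent_dim=8)
vertex0 = torch.randn(4, 1024)
lambdas = torch.linspace(0.0, 1.0, 10)
reconstructed = model(vertex0, lambdas)      # shape: (10, 4, 1024)
```

In such a setup the training target would be vertex functions sampled along the flow; the loss, momentum-frequency discretization, and latent dimensionality used in the actual work may differ.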
Author Information
Matija Medvidović (Columbia University)
Alessandro Toschi (TU Wien)
Giorgio Sangiovanni (University of Wuerzburg)
Cesare Franchini (University of Bologna)
Andy Millis (Flatiron Institute)
Anirvan Sengupta (Flatiron Institute)
Domenico Di Sante (Flatiron Institute (CCQ Affiliate))
I am an Assistant Professor at the University of Bologna and a Marie Curie Research Fellow at the Center for Computational Quantum Physics of the Flatiron Institute in New York. I earned a B.S. in physics at the University of L’Aquila in 2011 and a Ph.D. in physics in 2015. I was subsequently a postdoctoral fellow and young group leader in the Physics Department of the University of Würzburg. My research focuses on numerical quantum simulations of the electronic properties of non-interacting and interacting material systems, with an emphasis on topology and spin-orbit-driven phenomena. I am the principal investigator of an individual Marie Curie fellowship aimed at applying machine learning to condensed matter problems.
More from the Same Authors
- 2021 Spotlight: A Normative and Biologically Plausible Algorithm for Independent Component Analysis (Yanis Bahroun · Dmitri Chklovskii · Anirvan Sengupta)
- 2021 Spotlight: Neural optimal feedback control with local learning rules (Johannes Friedrich · Siavash Golkar · Shiva Farashahi · Alexander Genkin · Anirvan Sengupta · Dmitri Chklovskii)
- 2021: Classical variational simulation of the Quantum Approximate Optimization Algorithm (Matija Medvidović · Giuseppe Carleo)
- 2021 Poster: A Normative and Biologically Plausible Algorithm for Independent Component Analysis (Yanis Bahroun · Dmitri Chklovskii · Anirvan Sengupta)
- 2021 Poster: Neural optimal feedback control with local learning rules (Johannes Friedrich · Siavash Golkar · Shiva Farashahi · Alexander Genkin · Anirvan Sengupta · Dmitri Chklovskii)
- 2020 Poster: A simple normative network approximates local non-Hebbian learning in the cortex (Siavash Golkar · David Lipshutz · Yanis Bahroun · Anirvan Sengupta · Dmitri Chklovskii)
- 2019 Poster: A Similarity-preserving Network Trained on Transformed Images Recapitulates Salient Features of the Fly Motion Detection Circuit (Yanis Bahroun · Dmitri Chklovskii · Anirvan Sengupta)
- 2018 Poster: Manifold-tiling Localized Receptive Fields are Optimal in Similarity-preserving Neural Networks (Anirvan Sengupta · Cengiz Pehlevan · Mariano Tepper · Alexander Genkin · Dmitri Chklovskii)