Poster in Workshop: Machine Learning and the Physical Sciences

Dynamical Mean Field Theory of Kernel Evolution in Wide Neural Networks

Blake Bordelon · Cengiz Pehlevan


Abstract:

We analyze feature learning in infinite-width neural networks trained with gradient flow, using a self-consistent dynamical field theory. We construct a collection of deterministic dynamical order parameters: inner-product kernels for hidden-unit activations and gradients in each layer, evaluated at pairs of time points, which provide a reduced description of network activity through training. These kernel order parameters collectively define the hidden-layer activation distribution, the evolution of the neural tangent kernel, and consequently the network's output predictions. We provide a sampling procedure to self-consistently solve for the kernel order parameters.
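To make the order parameters concrete, the JAX sketch below (not the authors' code; the one-hidden-layer architecture, width, scalings, and hyperparameters are illustrative assumptions) trains a wide network by full-batch gradient descent, a discretization of gradient flow, and measures the two-time feature kernel Phi(t,s), the two-time gradient kernel G(t,s), and the resulting empirical NTK trajectory.

```python
import jax
import jax.numpy as jnp

# Illustrative sizes (assumptions, not from the paper): width N, P samples.
N, P, D = 4096, 16, 8
eta, steps = 0.05, 50

key = jax.random.PRNGKey(0)
kx, ky, kw, ka = jax.random.split(key, 4)
X = jax.random.normal(kx, (P, D))
y = jax.random.normal(ky, (P,))

params = {
    "W": jax.random.normal(kw, (N, D)),  # input weights
    "a": jax.random.normal(ka, (N,)),    # readout weights
}

def preact(params, X):
    # Pre-activations h(x) with 1/sqrt(D) input scaling.
    return X @ params["W"].T / jnp.sqrt(D)

def f(params, X):
    # Scalar output with 1/sqrt(N) readout scaling.
    return jnp.tanh(preact(params, X)) @ params["a"] / jnp.sqrt(N)

def loss(params, X, y):
    return 0.5 * jnp.sum((f(params, X) - y) ** 2)

grad_fn = jax.jit(jax.grad(loss))

# Record hidden activations phi(h(t)) and backpropagated gradient signals
# g(t) = a * phi'(h(t)) at every step; inner products across pairs of time
# points define the kernel order parameters.
phi_hist, g_hist = [], []
for t in range(steps):
    h = preact(params, X)
    phi = jnp.tanh(h)                 # (P, N) hidden activations
    g = params["a"] * (1.0 - phi**2)  # (P, N) gradient signals
    phi_hist.append(phi)
    g_hist.append(g)
    grads = grad_fn(params, X, y)
    params = jax.tree_util.tree_map(lambda p, dp: p - eta * dp, params, grads)

# Two-time kernel order parameters: Phi(t,s) and G(t,s), each (T, T, P, P).
Phi = jnp.einsum("tpn,sqn->tspq", jnp.stack(phi_hist), jnp.stack(phi_hist)) / N
G = jnp.einsum("tpn,sqn->tspq", jnp.stack(g_hist), jnp.stack(g_hist)) / N

# For this architecture, the equal-time slices give the empirical NTK:
# K_t(x, x') = Phi(t,t; x,x') + G(t,t; x,x') * (x . x') / D.
idx = jnp.arange(steps)
NTK = Phi[idx, idx] + G[idx, idx] * (X @ X.T / D)[None]
print(Phi.shape, G.shape, NTK.shape)  # (50, 50, 16, 16) twice, (50, 16, 16)
```

In the paper's infinite-width limit these kernels become deterministic and are obtained by self-consistently solving the field-theory equations (via the sampling procedure mentioned above), rather than being measured from a single trained finite network as in this sketch.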