Normalization in Attention Dynamics
Nikita Karagodin · Shu Ge · Yury Polyanskiy · Philippe Rigollet
Abstract
We study the effect of normalization schemes on token representations in deep transformers. Modeling their evolution as interacting particles on the sphere, we show that normalization acts as a form of speed regulation. This perspective enables a unified analysis of several schemes---including Post-LN, Pre-LN, Mix-LN, Peri-LN, nGPT, and LN-scaling---revealing how they influence clustering dynamics and representation collapse. Our framework clarifies how different schemes shape token representations across layers and provides a principled basis for comparing them, identifying Peri-LN as a particularly effective choice.
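To make the interacting-particle viewpoint concrete, here is a minimal sketch of attention dynamics on the unit sphere in the style of prior work on this model. The specific parameterization (identity query, key, and value matrices; inverse temperature $\beta$) is an illustrative assumption, not necessarily the paper's exact equations.

% Illustrative sketch (assumed form): tokens x_1, ..., x_n evolve on the
% unit sphere S^{d-1}, with P_x = I - x x^\top the orthogonal projection
% onto the tangent space at x, so trajectories stay on the sphere.
\[
  \dot{x}_i(t) \;=\; P_{x_i(t)}\!\Biggl(
    \sum_{j=1}^{n}
    \frac{e^{\beta \langle x_i(t),\, x_j(t) \rangle}}
         {\sum_{k=1}^{n} e^{\beta \langle x_i(t),\, x_k(t) \rangle}}
    \, x_j(t)
  \Biggr),
  \qquad x_i(t) \in \mathbb{S}^{d-1}.
\]
% In this picture, a normalization scheme rescales the right-hand side,
% i.e., it regulates the speed at which tokens move on the sphere; this
% is the "speed regulation" the abstract refers to.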