Double Descent and Overparametrization in Particle Physics Data
Matthias Vigl · Lukas Heinrich
Abstract
Heavily overparameterized models have recently been shown to benefit machine learning tasks: models with enough capacity to easily cross the interpolation threshold achieve lower generalization error than the classical bias-variance tradeoff would predict. We demonstrate this behavior for the first time on particle physics data and explore when and where 'double descent' appears and under which circumstances overparameterization yields a performance gain.