

Poster in Workshop: OPT 2022: Optimization for Machine Learning

Strong Lottery Ticket Hypothesis with $\epsilon$-perturbation

Fangshuo Liao · Zheyang Xiong · Anastasios Kyrillidis


Abstract: The strong Lottery Ticket Hypothesis (LTH) claims that there exists a subnetwork within a sufficiently large, randomly initialized neural network that approximates a given target network without any training. This work extends the theoretical guarantees of the strong LTH literature to a scenario more similar to the original LTH, by generalizing the weight change achieved in the pre-training step to some perturbation around the initialization. In particular, we focus on the following open questions: by allowing an $\varepsilon$-scale perturbation on the random initial weights, can we reduce the over-parameterization requirement for the candidate network in the strong LTH? Furthermore, does the weight change induced by SGD coincide with a good set of such perturbations?
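To make the question concrete: strong-LTH analyses typically reduce approximating a single target weight to a subset-sum problem over random candidate weights, and the $\varepsilon$-perturbation variant asks how much smaller the candidate pool can be if each kept weight may also move by at most $\varepsilon$. Below is a minimal NumPy sketch of that toy reduction; the target value, candidate count, and $\varepsilon$ are illustrative assumptions, not the paper's construction or results.

```python
import numpy as np

rng = np.random.default_rng(0)

def best_error(candidates, target, eps=0.0):
    """Search all binary prune/keep masks over the candidate weights and
    return the smallest achievable |sum(kept) - target|, where each kept
    weight may additionally be perturbed by at most eps."""
    n = len(candidates)
    best = abs(target)  # empty mask: prune everything
    for mask in range(1, 2 ** n):
        keep = [(mask >> i) & 1 == 1 for i in range(n)]
        gap = abs(candidates[keep].sum() - target)
        # k kept weights, each movable by +/- eps, absorb up to k*eps of the gap
        best = min(best, max(0.0, gap - sum(keep) * eps))
    return best

target = 0.37                       # hypothetical target weight
candidates = rng.uniform(-1, 1, 6)  # random initialization, n = 6 candidates

print(f"no perturbation: error = {best_error(candidates, target):.4f}")
print(f"eps = 0.05     : error = {best_error(candidates, target, eps=0.05):.4f}")
```

In this sketch, a nonzero eps lets the same small candidate pool reach a smaller approximation error, which mirrors the intuition behind the first open question: perturbation budget can trade off against over-parameterization.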
