
How many degrees of freedom do we need to train deep networks?

Read the paper

B. W. Larsen, S. Fort, N. Becker, and S. Ganguli. How many degrees of freedom do we need to train deep networks: a loss landscape perspective. In International Conference on Learning Representations (ICLR), 2022. arXiv:2107.05802

Deep neural networks can be trained, and generalize well, within surprisingly low-dimensional subspaces of their weight space. We explain this phenomenon by first examining the success probability of hitting a training loss sublevel set when training within a random subspace of a given dimensionality, using Gordon's escape theorem.
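To make the random-subspace idea concrete, here is a minimal toy sketch (not the paper's experimental setup): the full weights are parameterized as w = w0 + A·θ for a fixed random projection A, and only the low-dimensional coordinates θ are trained. A simple quadratic loss stands in for the training loss; all sizes and the learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: full parameter dimension D, subspace dimension d.
D, d = 100, 10

# A toy quadratic loss L(w) = ||w - w_star||^2 stands in for the training loss.
w_star = rng.normal(size=D)

def loss(w):
    return np.sum((w - w_star) ** 2)

# Random subspace training: freeze the initialization w0 and optimize only
# theta, with w = w0 + A @ theta for a fixed random projection A (D x d).
w0 = np.zeros(D)
A = rng.normal(size=(D, d)) / np.sqrt(D)

theta = np.zeros(d)
lr = 0.1
for _ in range(1000):
    w = w0 + A @ theta
    grad_w = 2.0 * (w - w_star)   # gradient of the toy loss in full space
    grad_theta = A.T @ grad_w     # chain rule: pull the gradient back to the subspace
    theta -= lr * grad_theta

final = loss(w0 + A @ theta)
```

For this quadratic loss the subspace can only reach the projection of w_star onto the column span of A, so `final` converges to the squared residual of w_star off that d-dimensional subspace; whether that residual falls below a given loss threshold is exactly the "hitting a sublevel set" event the paper analyzes as d varies.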