What's hidden in an overparameterized neural network with random weights? If the weight distribution is properly scaled (e.g. Kaiming Normal), then it contains a subnetwork which achieves high accuracy without ever modifying the values of the weights...
Alternate title: Randomly weighted neural networks. What do they contain? Do they contain things? Let's find out.
What's Hidden in a Randomly Weighted Neural Network? From the abstract: “Hidden in a randomly weighted Wide ResNet-50 we show that there is a subnetwork (with random weights) that is smaller than, but matches the performance of a ResNet-34 trained on ImageNet.” 😮
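For context, the paper finds these subnetworks with its edge-popup algorithm: every frozen random weight gets a learnable score, the forward pass keeps only the top-k fraction of weights by score, and gradients flow back to the scores through a straight-through estimator. Here's a minimal PyTorch sketch of that idea, not the paper's actual code; the class names, the score init, and the k=0.5 default are my assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch import autograd


class GetSubnet(autograd.Function):
    """Binary mask keeping the top-k fraction of scores; straight-through backward."""

    @staticmethod
    def forward(ctx, scores, k):
        out = torch.zeros_like(scores)
        # Sort scores ascending; mark the top-k fraction with 1.
        _, idx = scores.flatten().sort()
        j = int((1 - k) * scores.numel())
        flat = out.flatten()  # view into `out` (contiguous tensor)
        flat[idx[j:]] = 1.0
        return out

    @staticmethod
    def backward(ctx, grad_output):
        # Straight-through estimator: gradient passes to the scores unchanged.
        return grad_output, None


class SubnetLinear(nn.Linear):
    """Linear layer whose random weights stay frozen; only per-weight scores train."""

    def __init__(self, in_features, out_features, k=0.5):
        super().__init__(in_features, out_features, bias=False)
        self.k = k  # fraction of weights kept in the subnetwork
        self.scores = nn.Parameter(torch.randn_like(self.weight) * 0.01)
        nn.init.kaiming_normal_(self.weight)  # the "properly scaled" random init
        self.weight.requires_grad = False     # weights are never updated

    def forward(self, x):
        mask = GetSubnet.apply(self.scores.abs(), self.k)
        return F.linear(x, self.weight * mask)
```

Training then just runs a normal optimizer over `scores` (e.g. `torch.optim.SGD([layer.scores], lr=0.1)`); the weight values themselves never change, which is worth keeping in mind for point (1) below.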
Very interesting, but also not so interesting, because (1) isn't finding a subnetwork of a net (almost) equivalent to training the net? and (2) the more you sample, the better your chances: a larger random network contains exponentially more subnetworks, so it's less surprising that one of them happens to perform well.