Training a neural network is synonymous with learning the values of the
weights. In contrast, we demonstrate that randomly weighted neural networks
contain subnetworks which achieve impressive performance without ever training
the weight values....

Mitchell Wortsman

What's hidden in an overparameterized neural network with random weights? If the distribution is properly scaled (e.g. Kaiming Normal), then it contains a subnetwork which achieves high accuracy without ever modifying the values of the weights...
arxiv.org/abs/1911.13299
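The claim above can be made concrete with a toy sketch of the paper's "edge-popup"-style idea: freeze Kaiming-Normal random weights, attach a learnable score to each weight, and at forward time use only the top-k% of weights by score. Everything here (the `SupermaskLinear` class, `keep_frac`, the score initialization) is illustrative naming, not the authors' actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def kaiming_normal(fan_in, shape, rng):
    # Kaiming Normal: zero-mean Gaussian with std = sqrt(2 / fan_in),
    # the "properly scaled" distribution the tweet refers to.
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=shape)

class SupermaskLinear:
    """Linear layer whose random weights are frozen forever.

    Only the per-weight scores would be trained (with a
    straight-through gradient in the real method); the forward
    pass keeps just the top-k% of weights by score magnitude.
    A minimal sketch, not the paper's implementation.
    """
    def __init__(self, in_dim, out_dim, keep_frac=0.5, rng=rng):
        self.W = kaiming_normal(in_dim, (out_dim, in_dim), rng)  # frozen
        self.scores = rng.normal(size=(out_dim, in_dim))         # trainable
        self.keep_frac = keep_frac

    def mask(self):
        # Binary mask selecting the top-k scores (k = keep_frac of all).
        k = int(self.scores.size * self.keep_frac)
        thresh = np.sort(np.abs(self.scores), axis=None)[-k]
        return (np.abs(self.scores) >= thresh).astype(self.W.dtype)

    def forward(self, x):
        # Weights are never updated; only the mask changes as
        # scores are (hypothetically) optimized.
        return x @ (self.W * self.mask()).T

layer = SupermaskLinear(8, 4, keep_frac=0.5)
y = layer.forward(np.ones((2, 8)))
```

Training then amounts to optimizing `scores` with the weights held fixed, so the "learned" object is the subnetwork itself rather than the weight values.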


@RamanujanVivek @anikembhavi @morastegari Alternate title: Randomly weighted neural networks. What do they contain? Do they contain things? Let's find out.

What's Hidden in a Randomly Weighted Neural Network?
“Hidden in a randomly weighted Wide ResNet-50 we show that there is a subnetwork (with random weights) that is smaller than, but matches the performance of a ResNet-34 trained on ImageNet.” 😮
arxiv.org/abs/1911.13299 twitter.com/mitchnw/status…