NoiseOut: A Simple Way to Prune Neural Networks
Neural networks are usually over-parameterized, with significant redundancy in the number of required neurons, which results in unnecessary computation and memory usage at inference time. One common approach to this issue is to prune these large networks by removing extra neurons and parameters while maintaining accuracy. In this paper, we propose NoiseOut, a fully automated pruning algorithm based on the correlation between activations of neurons in the hidden layers. We prove that adding extra output neurons with entirely random targets results in higher correlation between neurons, which makes pruning by NoiseOut even more efficient. Finally, we test our method on various networks and datasets. These experiments exhibit high pruning rates while maintaining the accuracy of the original network.
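The abstract describes pruning driven by correlated hidden-layer activations: when one neuron's activation is (approximately) a linear function of another's, the redundant neuron can be removed and its outgoing weights folded into its correlated partner. Below is a minimal NumPy sketch of that idea, assuming dense layers; the function name and the least-squares merge details are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def prune_most_correlated(activations, w_out, b_out):
    """Merge the most correlated pair of hidden neurons (illustrative sketch).

    activations: (n_samples, n_neurons) hidden-layer activations
    w_out:       (n_neurons, n_out) outgoing weight matrix
    b_out:       (n_out,) bias of the next layer
    Returns the pruned weights, adjusted bias, and index of the removed neuron.
    """
    corr = np.corrcoef(activations.T)   # pairwise activation correlations
    np.fill_diagonal(corr, 0.0)         # ignore self-correlation
    i, j = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)

    # Fit a_j ≈ alpha * a_i + beta by least squares, then fold neuron j's
    # outgoing weights into neuron i and push the constant term into the bias.
    alpha, beta = np.polyfit(activations[:, i], activations[:, j], 1)
    w_new = w_out.copy()
    w_new[i] += alpha * w_out[j]
    b_new = b_out + beta * w_out[j]
    w_new = np.delete(w_new, j, axis=0)
    return w_new, b_new, j
```

With perfectly correlated neurons (e.g. `a[:, 2] = 2 * a[:, 0] + 1`), the merge is exact and the layer's output is unchanged; in practice the correlation is approximate and the network is typically fine-tuned after each merge.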
Forward citations
Cited by 1 Pith paper
Catalyst: Out-of-Distribution Detection via Elastic Scaling
Catalyst improves OOD detection by multiplicatively scaling baseline scores using channel-wise statistics from pre-pooling feature maps, reducing average FPR by 22-33% on standard benchmarks.