On The Energy Statistics of Feature Maps in Pruning of Neural Networks with Skip-Connections
- Submitted by:
- Mohammadreza Soltani
- Last updated:
- 4 March 2022 - 6:37pm
- Document Type:
- Presentation Slides
- Presenters:
- Mohammadreza Soltani
We propose a new structured pruning framework for compressing Deep Neural Networks (DNNs) with skip-connections, based on measuring the statistical dependency between hidden layers and the predicted outputs. The dependence measure, defined via the energy statistics of the hidden layers, serves as a model-free measure of information between the feature maps and the output of the network. The estimated dependence measure is then used to prune a collection of redundant and uninformative layers. Because the measure is model-free, no parametric assumptions on the distribution of the feature maps are required, which makes it computationally appealing for the very high-dimensional feature spaces in DNNs. Extensive numerical experiments on various architectures show the efficacy of the proposed pruning approach, with performance competitive with state-of-the-art methods.
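To illustrate the kind of energy-statistics dependence measure the abstract describes, the sketch below computes the sample distance covariance (an energy-statistics-based, model-free dependence measure) between flattened feature maps and network outputs. This is a minimal illustrative sketch, not the authors' implementation: the function names are hypothetical, and the specific estimator used in the paper may differ.

```python
import numpy as np

def pairwise_dists(x):
    # Euclidean distance matrix between the rows of x (one row per sample).
    sq = np.sum(x ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (x @ x.T)
    return np.sqrt(np.maximum(d2, 0.0))

def distance_covariance(x, y):
    """Sample distance covariance between paired samples x and y.

    A model-free dependence score built from energy statistics: it needs
    no parametric assumptions on the distributions, only pairwise
    Euclidean distances, so it scales to high-dimensional feature maps.
    """
    a = pairwise_dists(x)
    b = pairwise_dists(y)
    # Double-center each distance matrix.
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1)[:, None] + b.mean()
    return np.sqrt(np.maximum((A * B).mean(), 0.0))

def score_layers(feature_maps, outputs):
    # Rank layers by their dependence with the predicted outputs;
    # layers with low scores are candidates for pruning.
    # feature_maps: list of (n_samples, d_layer) arrays, one per layer.
    return [distance_covariance(f.reshape(len(f), -1), outputs)
            for f in feature_maps]
```

A layer whose feature maps carry little information about the output yields a low score and can be pruned; in architectures with skip-connections, the residual path keeps the network functional after such a layer is removed.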