COMPRESSING DEEP NETWORKS USING FISHER SCORE OF FEATURE MAPS

Citation Author(s):
Mohammadreza Soltani, Suya Wu, Yuerong Li, Robert Ravier, Jie Ding, and Vahid Tarokh
Submitted by:
Mohammadreza Soltani
Last updated:
28 February 2021 - 9:06pm
Document Type:
Presentation Slides
Document Year:
2021
Presenters:
Mohammadreza Soltani

We introduce a new structural technique for pruning deep neural networks with skip-connections by removing the less informative layers using their Fisher scores. Extensive experiments on the classification of CIFAR-10, CIFAR-100, and SVHN datasets demonstrate the efficacy of our proposed method in compressing deep models, both in terms of the number of parameters and the number of operations. For instance, for the classification of CIFAR-10 images, our method compresses a ResNet56 model (0.85 million parameters, 126 million operations), reducing the number of parameters by 75% and the number of operations by 62%, while increasing the test error by only 0.03%.
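To illustrate the general idea (this is a minimal sketch, not the authors' implementation), the snippet below scores each residual block of a toy skip-connected network by a Fisher-score criterion on its pooled feature maps and replaces the lowest-scoring blocks with identity mappings. The toy architecture, the synthetic calibration batch, and the 50% pruning ratio are illustrative assumptions.

# Minimal sketch of Fisher-score-based layer pruning (assumptions noted above).
import torch
import torch.nn as nn


class ResBlock(nn.Module):
    """A basic residual block; the skip-connection makes the block removable."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x):
        return torch.relu(x + self.body(x))


def fisher_score(features, labels):
    """Fisher score of pooled features: between-class variance of the class
    means divided by the average within-class variance, averaged over channels."""
    overall_mean = features.mean(dim=0)
    between, within = 0.0, 0.0
    for c in labels.unique():
        fc = features[labels == c]
        between = between + fc.size(0) * (fc.mean(dim=0) - overall_mean).pow(2)
        within = within + fc.size(0) * fc.var(dim=0, unbiased=False)
    return (between / (within + 1e-8)).mean().item()


# Toy network: a stem followed by a stack of residual blocks.
stem = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
blocks = nn.ModuleList([ResBlock(16) for _ in range(6)])

# Synthetic stand-in for a labelled calibration batch (e.g. CIFAR-10 images).
images = torch.randn(128, 3, 32, 32)
labels = torch.randint(0, 10, (128,))

# Collect spatially pooled feature maps after every block and score them.
with torch.no_grad():
    x = stem(images)
    scores = []
    for block in blocks:
        x = block(x)
        pooled = x.mean(dim=(2, 3))  # global average pooling
        scores.append(fisher_score(pooled, labels))

# Replace the least informative blocks (lowest Fisher score) with identities;
# pruning half of the blocks here is an arbitrary illustrative choice.
num_to_prune = len(blocks) // 2
for idx in sorted(range(len(blocks)), key=lambda i: scores[i])[:num_to_prune]:
    blocks[idx] = nn.Identity()

print("Fisher scores per block:", [round(s, 3) for s in scores])
print("Pruned block indices:",
      [i for i, b in enumerate(blocks) if isinstance(b, nn.Identity)])

In practice, the pruned network would be fine-tuned on the training set afterwards to recover any lost accuracy.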
