We introduce a new structural technique for pruning deep neural networks with skip connections by removing the less informative layers according to their Fisher scores. Extensive experiments on classification of the CIFAR-10, CIFAR-100, and SVHN datasets demonstrate the efficacy of the proposed method in compressing deep models, in terms of both the number of parameters and the number of operations.
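A minimal sketch of how such Fisher-score layer pruning could look in PyTorch, not the authors' implementation: the `model.blocks` attribute, the data loader, and the empirical-Fisher approximation (accumulated squared loss gradients per residual block) are all illustrative assumptions. The lowest-scoring blocks are replaced by identity mappings, so the skip connections carry the signal forward.

```python
# Hedged sketch of Fisher-score pruning for a network with skip connections.
# Assumes the model stores its residual blocks in an nn.ModuleList `model.blocks`.
import torch
import torch.nn.functional as F


def fisher_scores(model, loader, device="cpu"):
    """Approximate a Fisher score per residual block as the accumulated
    squared gradient of the loss w.r.t. that block's parameters."""
    scores = [0.0 for _ in model.blocks]
    model.to(device).train()
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        model.zero_grad()
        loss = F.cross_entropy(model(x), y)
        loss.backward()
        for i, block in enumerate(model.blocks):
            scores[i] += sum(
                (p.grad.detach() ** 2).sum().item()
                for p in block.parameters()
                if p.grad is not None
            )
    return scores


def prune_least_informative(model, loader, num_blocks_to_drop=1):
    """Replace the lowest-scoring residual blocks with identity mappings,
    relying on the skip connection to preserve the forward pass."""
    scores = fisher_scores(model, loader)
    drop = sorted(range(len(scores)), key=lambda j: scores[j])[:num_blocks_to_drop]
    for i in drop:
        model.blocks[i] = torch.nn.Identity()
    return model
```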

This work outlines a method for applying empirical Bayes in the semi-supervised setting, that is, when the training set is partially or entirely unlabeled. In addition to missing labels, we also consider the case where the available training data may be shuffled (i.e., the features and labels are not matched).
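One minimal, hedged reading of the empirical-Bayes step, not necessarily the authors' construction: fit the prior (mixture) hyperparameters by maximizing the marginal likelihood of the unlabeled features, then classify points via the resulting posterior responsibilities. The class-conditional Gaussian model, the `n_classes` argument, and the synthetic data below are illustrative assumptions.

```python
# Hedged sketch: empirical Bayes via marginal-likelihood (EM) fitting of a
# Gaussian mixture to unlabeled features, followed by posterior inference.
import numpy as np
from sklearn.mixture import GaussianMixture


def fit_empirical_bayes(X_unlabeled, n_classes):
    """Estimate mixture hyperparameters (weights, means, covariances)
    from the unlabeled data by maximizing the marginal likelihood."""
    gm = GaussianMixture(n_components=n_classes, covariance_type="full")
    gm.fit(X_unlabeled)
    return gm


def posterior_responsibilities(gm, X):
    """Posterior class responsibilities under the fitted empirical prior."""
    return gm.predict_proba(X)


# Illustrative usage on synthetic two-class data.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 1.0, (100, 2)), rng.normal(2.0, 1.0, (100, 2))])
gm = fit_empirical_bayes(X, n_classes=2)
print(posterior_responsibilities(gm, X[:3]))
```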
