A Scalable Convolutional Neural Network for Task-specified Scenarios via Knowledge Distillation
- Submitted by:
- Mengnan Shi
- Last updated:
- 12 March 2017 - 8:20pm
- Document Type:
- Poster
- Document Year:
- 2017
- Presenters:
- Mengnan Shi
- Paper Code:
- MLSP-P2.6
In this paper, we explore the redundancy in convolutional neural networks, which scales with the complexity of vision tasks. Considering that many front-end visual systems are concerned with only a limited range of visual targets, removing task-specified network redundancy can enable a wide range of potential applications. We propose a task-specified knowledge distillation algorithm that derives a simplified model with a pre-set computation cost and minimized accuracy loss, making it well suited to resource-constrained front-end systems. Experiments on the MNIST and CIFAR-10 datasets demonstrate the feasibility of the proposed approach as well as the existence of task-specified redundancy.
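The abstract does not spell out the distillation objective; as a rough illustration only, the following is a minimal NumPy sketch of the standard Hinton-style knowledge distillation loss (temperature-softened teacher/student KL divergence blended with hard-label cross-entropy), which task-specified variants typically build on. The function names, the temperature `T`, and the mixing weight `alpha` are generic assumptions, not details taken from this paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces a softer distribution.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Generic knowledge distillation loss (a sketch, not this paper's exact
    objective): alpha-weighted KL between softened teacher and student
    outputs, plus (1 - alpha)-weighted cross-entropy on the hard labels."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL(teacher || student) per example, scaled by T^2 to keep gradient
    # magnitudes comparable across temperatures (as in Hinton et al.).
    kl = np.sum(
        p_teacher * (np.log(p_teacher + 1e-12) - np.log(p_student + 1e-12)),
        axis=-1,
    )
    # Ordinary cross-entropy against the ground-truth labels at T = 1.
    p_hard = softmax(student_logits, 1.0)
    ce = -np.log(p_hard[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * T * T * kl + (1.0 - alpha) * ce))
```

A simplified (pruned or thinner) student network would be trained to minimize this loss against the original teacher's logits; a task-specified variant would restrict the label space to the targets of interest.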