Learning Tucker Compression for Deep CNN
- Submitted by: xiaojuan li
- Last updated: 15 March 2022 - 3:20am
- Document Type: Presentation Slides
- Document Year: 2022
- Presenters: Xiaojuan LI
Recently, tensor decomposition approaches have been used to compress deep convolutional neural networks (CNNs), yielding faster networks with fewer parameters. However, tensor-decomposition-based CNN compression has two problems: first, existing approaches usually decompose the CNN layer by layer, ignoring the correlation between layers; second, training and compression are carried out separately, which easily leads to locally optimal ranks. In this paper, Learning Tucker Compression (LTC) is proposed. LTC finds the best Tucker ranks by jointly optimizing the CNN's loss function and Tucker's cost function, which means that training and compression are carried out at the same time. It can optimize the CNN directly, without decomposing the whole network layer by layer, and can fine-tune the whole network directly, without using fixed parameters. LTC is verified on two public datasets. Experiments show that LTC makes networks such as ResNet and VGG faster with nearly the same classification accuracy, surpassing current tensor decomposition approaches.
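For background, the layer-wise baseline that LTC improves on is Tucker-2 decomposition of a convolution kernel: the 4-way kernel is factored along its output- and input-channel modes only, which turns one d×d convolution into a 1×1 / d×d / 1×1 chain with fewer parameters. The sketch below is a plain truncated-HOSVD version of that baseline in NumPy, not the paper's joint optimization; the helper name `tucker2_hosvd` and the fixed ranks `r_out`, `r_in` are illustrative assumptions (LTC's point is precisely that such ranks should be learned, not fixed in advance).

```python
import numpy as np

def tucker2_hosvd(W, r_out, r_in):
    """Tucker-2 via truncated HOSVD (illustrative sketch).

    W: conv kernel of shape (T, S, d, d) = (out_ch, in_ch, kH, kW).
    Returns (core, U, V) with W ~= core x_0 U x_1 V, where
    U: (T, r_out), V: (S, r_in), core: (r_out, r_in, d, d).
    """
    T, S, _, _ = W.shape
    # Mode-0 unfolding (T, S*d*d): top r_out left singular vectors.
    U = np.linalg.svd(W.reshape(T, -1), full_matrices=False)[0][:, :r_out]
    # Mode-1 unfolding (S, T*d*d): top r_in left singular vectors.
    W1 = np.moveaxis(W, 1, 0).reshape(S, -1)
    V = np.linalg.svd(W1, full_matrices=False)[0][:, :r_in]
    # Core tensor: contract W with U^T on mode 0 and V^T on mode 1.
    core = np.einsum('tsij,tp,sq->pqij', W, U, V)
    return core, U, V

# The factors map onto three convolutions:
#   1x1 conv (S -> r_in) from V, d x d conv (r_in -> r_out) from core,
#   1x1 conv (r_out -> T) from U.
W = np.random.randn(64, 32, 3, 3)
core, U, V = tucker2_hosvd(W, r_out=16, r_in=8)
W_hat = np.einsum('pqij,tp,sq->tsij', core, U, V)  # low-rank reconstruction
```

The parameter count drops from T·S·d² to S·r_in + r_in·r_out·d² + r_out·T; picking the ranks per layer by local reconstruction error is exactly the layer-by-layer scheme whose suboptimality motivates LTC's joint training-and-compression objective.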