CYCLIC ANNEALING TRAINING CONVOLUTIONAL NEURAL NETWORKS FOR IMAGE CLASSIFICATION WITH NOISY LABELS

Citation Author(s):
Jiawei Li, Tao Dai, Qingtao Tang, Yeli Xing, Shu-Tao Xia
Submitted by:
Jiawei Li
Last updated:
8 October 2018 - 5:30am
Document Type:
Presentation Slides
Document Year:
2018
Presenters:
Jiawei Li
Paper Code:
2605, MA.L1.5

Abstract

Modeling noisy labels makes a convolutional neural network (CNN) more robust for image classification. However, current noisy-label modeling methods usually require an expectation-maximization (EM) based procedure to optimize the parameters, which is computationally expensive. In this paper, we use a fast annealing training method to speed up CNN training in every M-step. Since this training is repeatedly executed along the entire EM optimization path and yields a local-minimum CNN model from every training cycle, we name the approach Cyclic Annealing Training (CAT). Besides reducing training time, CAT can also bag all the local-minimum CNN models at test time to improve classification performance. We evaluate the proposed method on several image classification datasets with different noisy-label patterns, and the results show that our CAT approach outperforms state-of-the-art noisy-label modeling methods.


Files

MA.L1.5_2605_CAT_CNN_NL_v4.pdf
