
CONTRASTIVE-CENTER LOSS FOR DEEP NEURAL NETWORKS

Citation Author(s):
Submitted by: Ce Qi
Last updated: 13 September 2017 - 10:46pm
Document Type: Poster
Document Year: 2017
Event:
Presenters: Ce Qi
Paper Code: 1592

The deep convolutional neural network (CNN) has significantly improved the performance of image classification and face recognition. The softmax loss is commonly used as the supervision signal, but it only penalizes classification error. In this paper, we propose a novel auxiliary supervision signal called contrastive-center loss, which further enhances the discriminative power of the learned features by learning a center for each class. The proposed contrastive-center loss simultaneously considers intra-class compactness and inter-class separability by penalizing the contrastive value between: (1) the distance of each training sample to its corresponding class center, and (2) the sum of the distances of that sample to its non-corresponding class centers. Experiments on different datasets demonstrate the effectiveness of the contrastive-center loss.
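For illustration, below is a minimal PyTorch-style sketch of a loss in this spirit: each sample's squared distance to its own class center is contrasted against the sum of its squared distances to the other class centers, with a small constant delta in the denominator for numerical stability. The class name ContrastiveCenterLoss, the parameters num_classes, feat_dim, and delta, and the exact ratio and weighting are illustrative assumptions under the description above, not the authors' reference implementation.

import torch
import torch.nn as nn

class ContrastiveCenterLoss(nn.Module):
    """Sketch of a contrastive-center-style auxiliary loss.

    Penalizes the ratio of a sample's squared distance to its own class
    center over the summed squared distances to all other class centers
    (plus a small constant for stability)."""

    def __init__(self, num_classes, feat_dim, delta=1.0):
        super().__init__()
        self.delta = delta
        # One learnable center per class.
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))

    def forward(self, features, labels):
        # Squared Euclidean distances from each sample to every center:
        # shape (batch_size, num_classes).
        dists = torch.cdist(features, self.centers).pow(2)
        # Distance to each sample's corresponding class center.
        intra = dists.gather(1, labels.view(-1, 1)).squeeze(1)
        # Sum of distances to all non-corresponding class centers.
        inter = dists.sum(dim=1) - intra
        # Small intra-class distance and large inter-class distance
        # both drive this ratio down.
        return 0.5 * (intra / (inter + self.delta)).mean()

In a typical setup, this auxiliary term would be added to the usual softmax cross-entropy loss with a small weighting factor, e.g. loss = ce_loss(logits, labels) + lambda_c * center_loss(features, labels), where lambda_c is a hyperparameter chosen by validation.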
