
Network Adaptation Strategies for Learning New Classes without Forgetting the Original Ones

Abstract: 

We address the problem of adding new classes to an existing classifier without hurting the original classes, when no access is allowed to any sample from the original classes. This problem arises frequently, since models are often shared without their training data due to privacy and data-ownership concerns. We propose an easy-to-use approach that modifies the original classifier by retraining a suitable subset of layers using a linearly-tuned knowledge-distillation regularization. The set of layers that is tuned depends on the number of newly added classes and the number of original classes. We evaluate the proposed method on two standard datasets, first in a language-identification task and then in an image-classification setup. In both cases, the method achieves classification accuracy that is almost as good as that obtained by a system trained using unrestricted samples from both the original and new classes.
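
To make the recipe in the abstract concrete, the sketch below shows one way the general idea can be set up: expand the classifier's output layer for the new classes, unfreeze only a subset of layers, and regularize training on the new-class samples with a knowledge-distillation term computed against a frozen copy of the original model. This is a minimal PyTorch illustration, not the authors' implementation; all names (SmallNet, expand_head, the temperature T, the weight lam, and the hand-picked set of unfrozen layers) are assumptions made for the example.

```python
# Minimal sketch: fine-tune a subset of layers with a knowledge-distillation
# penalty on the original classes. Illustrative only; not the paper's code.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallNet(nn.Module):
    """Toy classifier: a small feature extractor followed by a linear head."""
    def __init__(self, n_classes, in_dim=32):
        super().__init__()
        self.features = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                      nn.Linear(64, 64), nn.ReLU())
        self.head = nn.Linear(64, n_classes)

    def forward(self, x):
        return self.head(self.features(x))

def expand_head(model, n_new):
    """Add n_new output units, copying the old head's weights into the new one."""
    old = model.head
    new = nn.Linear(old.in_features, old.out_features + n_new)
    with torch.no_grad():
        new.weight[:old.out_features] = old.weight
        new.bias[:old.out_features] = old.bias
    model.head = new
    return model

n_old, n_new, in_dim = 10, 3, 32
original = SmallNet(n_old, in_dim)            # stand-in for the trained classifier
teacher = copy.deepcopy(original).eval()      # frozen copy used for distillation
student = expand_head(copy.deepcopy(original), n_new)

# Retrain only a chosen subset of layers (here: the last feature layer + head).
for p in student.parameters():
    p.requires_grad = False
for p in list(student.features[2].parameters()) + list(student.head.parameters()):
    p.requires_grad = True

opt = torch.optim.Adam([p for p in student.parameters() if p.requires_grad], lr=1e-3)
T, lam = 2.0, 1.0                             # distillation temperature and weight

# Dummy batch: only samples from the new classes are available.
x = torch.randn(16, in_dim)
y = torch.randint(n_old, n_old + n_new, (16,))

for _ in range(100):
    logits = student(x)
    ce = F.cross_entropy(logits, y)           # fit the new classes
    with torch.no_grad():
        t_logits = teacher(x)                 # original model's view of old classes
    kd = F.kl_div(F.log_softmax(logits[:, :n_old] / T, dim=1),
                  F.softmax(t_logits / T, dim=1),
                  reduction="batchmean") * T * T
    loss = ce + lam * kd                      # distillation discourages forgetting
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Note the simplifications: here the retrained layer subset is fixed by hand and the distillation weight lam is a constant, whereas in the paper the subset depends on the number of original and newly added classes and the knowledge-distillation regularization is linearly tuned.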


Paper Details

Authors: Hagai Taitelbaum, Gal Chechik, Jacob Goldberger
Submitted On: 15 May 2019 - 7:51am
Short Link: http://sigport.org/4523
Type: Poster
Event:
Presenter's Name: Hagai Taitelbaum
Paper Code: MLSP-P13.5
Document Year: 2019

Document Files

https://ieeexplore.ieee.org/document/8682848


Cite

[1] Hagai Taitelbaum, Gal Chechik, and Jacob Goldberger, "Network Adaptation Strategies for Learning New Classes without Forgetting the Original Ones", IEEE SigPort, 2019. [Online]. Available: http://sigport.org/4523. Accessed: Jul. 18, 2019.
@article{4523-19,
  url = {http://sigport.org/4523},
  author = {Hagai Taitelbaum and Gal Chechik and Jacob Goldberger},
  publisher = {IEEE SigPort},
  title = {Network Adaptation Strategies for Learning New Classes without Forgetting the Original Ones},
  year = {2019}
}