
Exact Incremental and Decremental Learning for LS-SVM

Citation Author(s):
Wei-Han Lee, Bong Jun Ko, Shiqiang Wang, Changchang Liu, Kin K. Leung
Submitted by:
Shiqiang Wang
Last updated:
23 September 2019 - 12:50pm
Document Type:
Presentation Slides
Document Year:
2019
Event:
Presenters:
Shiqiang Wang
Paper Code:
TP.L5.06

In this paper, we present a novel incremental and decremental learning method for the least-squares support vector machine (LS-SVM). The goal is to adapt a pre-trained model to changes in the training dataset, without retraining the model on all the data, where the changes can include both addition and deletion of data samples. We propose a provably exact method: the updated model is identical to a model trained from scratch on the entire (updated) training dataset. Our proposed method only requires access to the updated data samples, the previous model parameters, and a unique, fixed-size matrix that quantifies the effect of the previous training dataset. Our approach significantly reduces the storage requirement of model updating, preserves the privacy of unchanged training samples without loss of model accuracy, and enhances computational efficiency. Experiments on a real-world image dataset validate the effectiveness of our proposed method.

Link to paper: https://doi.org/10.1109/ICIP.2019.8803291
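To illustrate the core idea, here is a minimal sketch of exact incremental and decremental updates for a linear, bias-free LS-SVM-style ridge model. This is not the paper's algorithm (which handles the general LS-SVM formulation; see the paper for details) — the class name and interface below are hypothetical. It shows only the key mechanism the abstract describes: a fixed-size summary matrix lets us add or remove samples and recover exactly the model a full retrain would produce, without storing the unchanged data.

```python
import numpy as np

class IncrementalLSSVM:
    """Illustrative sketch (hypothetical interface, not the paper's method):
    exact add/remove of samples for a linear regularized least-squares model
    by maintaining fixed-size sufficient statistics instead of raw data."""

    def __init__(self, dim, gamma=1.0):
        self.gamma = gamma
        self.M = np.zeros((dim, dim))  # fixed-size summary: sum of x x^T
        self.v = np.zeros(dim)         # fixed-size summary: sum of y * x

    def add(self, x, y):
        # Incremental step: fold a new sample into the summaries.
        self.M += np.outer(x, x)
        self.v += y * x

    def remove(self, x, y):
        # Decremental step: subtract the sample's exact contribution.
        self.M -= np.outer(x, x)
        self.v -= y * x

    def weights(self):
        # Exact solution of the regularized least-squares problem,
        # identical to retraining on the current dataset from scratch.
        d = self.M.shape[0]
        return np.linalg.solve(self.M + np.eye(d) / self.gamma, self.v)
```

Because the summaries `M` and `v` are additive over samples, adding then solving gives bit-for-bit the same linear system as batch training, and removal is exact rather than approximate — the property the paper establishes for the full LS-SVM.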
