
A Generalized Kernel Risk Sensitive Loss for Robust Two-dimensional Singular Value Decomposition

Citation Author(s):
Miaohua Zhang, Yongsheng Gao, Jun Zhou
Submitted by:
Miaohua Zhang
Last updated:
5 May 2022 - 3:32am
Document Type:
Presentation Slides
Document Year:
2022
Event:
Presenters:
Miaohua Zhang
Paper Code:
1579
 

Two-dimensional singular value decomposition (2DSVD) is an important dimensionality reduction algorithm with an inherent advantage in preserving the structure of 2D images. However, the 2DSVD algorithm is based on the squared error loss, which can exaggerate projection errors in the presence of outliers. To solve this problem, we propose a generalized kernel risk-sensitive loss for measuring the projection error in 2DSVD (GKRSL-2DSVD). Outlier information is automatically suppressed during optimization. Since the proposed objective function is non-convex, a majorization-minimization algorithm is developed to solve it efficiently. The proposed method is rotationally invariant and can inherently handle non-centered data. Experimental results on public datasets demonstrate that the proposed method significantly outperforms all benchmarks across different applications.
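As a rough illustration of the overall structure only, the Python sketch below shows how an iteratively reweighted 2DSVD of this kind can be organized: each majorization-minimization step solves a weighted eigenvalue problem for the two projection matrices and then refreshes per-sample weights from the projection residuals, so that outlying samples contribute less to the fit than they would under the squared error loss. The Gaussian-kernel weight, the function name `robust_2dsvd`, and all parameter choices are illustrative assumptions; the exact GKRSL weighting and update rules are derived in the paper and are not reproduced here.

```python
import numpy as np

def robust_2dsvd(images, k1, k2, sigma=1.0, n_iter=20, tol=1e-6):
    """Sketch of an iteratively reweighted 2DSVD (assumed structure, not the paper's exact algorithm).

    images : array of shape (N, m, n), N two-dimensional samples.
    k1, k2 : target left/right subspace dimensions.
    sigma  : bandwidth of the assumed Gaussian-kernel weighting.
    """
    N, m, n = images.shape
    w = np.ones(N)                  # per-sample weights, all equal at the start
    L = np.eye(m)[:, :k1]           # left projection matrix  (m x k1)
    R = np.eye(n)[:, :k2]           # right projection matrix (n x k2)

    prev_obj = np.inf
    for _ in range(n_iter):
        # Majorization step: the non-convex loss is replaced by a weighted
        # least-squares surrogate, which reduces to weighted covariance
        # eigenproblems as in standard 2DSVD.
        F = sum(wi * A @ R @ R.T @ A.T for wi, A in zip(w, images))
        L = np.linalg.eigh(F)[1][:, -k1:]          # top-k1 eigenvectors
        G = sum(wi * A.T @ L @ L.T @ A for wi, A in zip(w, images))
        R = np.linalg.eigh(G)[1][:, -k2:]          # top-k2 eigenvectors

        # Reconstruction residual of each sample in the current subspaces.
        res = np.array([np.linalg.norm(A - L @ L.T @ A @ R @ R.T)
                        for A in images])

        # Kernel-induced reweighting (assumed Gaussian form): samples with
        # large residuals, i.e. likely outliers, are down-weighted instead of
        # dominating the objective as they would under squared error.
        w = np.exp(-res**2 / (2.0 * sigma**2))

        obj = np.sum(w * res**2)
        if abs(prev_obj - obj) < tol * max(prev_obj, 1.0):
            break
        prev_obj = obj

    return L, R, w
```

The alternation between a weighted eigendecomposition and a residual-driven reweighting is the standard way MM/half-quadratic schemes handle robust losses; only the specific weight function distinguishes one robust 2DSVD variant from another.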
