
RANDOM MATRICES MEET MACHINE LEARNING: A LARGE DIMENSIONAL ANALYSIS OF LS-SVM

Citation Author(s):
Zhenyu Liao, Romain Couillet
Submitted by:
Zhenyu LIAO
Last updated:
9 March 2017 - 6:36pm
Document Type:
Presentation Slides
Document Year:
2017
Event:
ICASSP 2017
Presenters:
Zhenyu Liao
Paper Code:
ICASSP1701

This article proposes a performance analysis of kernel least squares support vector machines (LS-SVMs) based on a random matrix approach, in the regime where both the data dimension p and the sample size n grow large at the same rate. Under a two-class Gaussian mixture model for the input data, we prove that the LS-SVM decision function is asymptotically normal, with means and covariances shown to depend explicitly on the derivatives of the kernel function. This provides a better understanding of, and new insights into, the internal workings of SVM-type methods for large datasets.
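To make the setting concrete, the following is a minimal numerical sketch (not the authors' code): it draws a two-class Gaussian mixture with p and n of comparable size, trains an LS-SVM with a Gaussian kernel, and inspects the per-class statistics of the decision function, which the analysis predicts to be approximately Gaussian. The kernel choice, the regularization value gamma, the mean separation, and the (K + n/gamma I) normalization used here are illustrative assumptions rather than parameters taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

p, n_train, n_test = 256, 512, 1024   # p and n of the same order
gamma = 1.0                            # regularization parameter (assumed value)

def sample_mixture(n, p, rng):
    """Two-class Gaussian mixture: x ~ N(y * mu, I_p) with labels y = +/-1."""
    y = rng.choice([-1.0, 1.0], size=n)
    mu = np.zeros(p)
    mu[0] = 2.0                        # illustrative mean separation
    X = rng.standard_normal((n, p)) + np.outer(y, mu)
    return X, y

def rbf_kernel(A, B, sigma2=1.0):
    """Gaussian kernel f(||a - b||^2 / p) with f(t) = exp(-t / (2 * sigma2))."""
    d2 = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * A.shape[1] * sigma2))

X, y = sample_mixture(n_train, p, rng)
Xt, yt = sample_mixture(n_test, p, rng)

# One common LS-SVM (Suykens-style) solution: g(x) = alpha^T k(x) + b, with
# alpha = (K + n/gamma I)^{-1} (y - b 1) and b fixed by the constraint 1^T alpha = 0.
K = rbf_kernel(X, X)
S = K + (n_train / gamma) * np.eye(n_train)
S_inv_y = np.linalg.solve(S, y)
S_inv_1 = np.linalg.solve(S, np.ones(n_train))
b = S_inv_y.sum() / S_inv_1.sum()
alpha = np.linalg.solve(S, y - b)

g = rbf_kernel(Xt, X) @ alpha + b      # decision values on fresh test data

# Per class, the decision values should concentrate around a class-dependent
# mean with small Gaussian-like fluctuations, as the asymptotic result predicts.
for c in (-1.0, 1.0):
    vals = g[yt == c]
    print(f"class {c:+.0f}: mean {vals.mean():+.4f}, std {vals.std():.4f}")
```

Plotting a per-class histogram of the decision values g alongside a Gaussian density with the printed mean and standard deviation gives a quick visual check of the normality claimed in the abstract.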
