Journal article
Statistical margin error bounds for L1-norm support vector machines
Neurocomputing, 2019, 339: 210–216 | April 28, 2019 | https://doi.org/10.1016/j.neucom.2019.02.015
Compared with Lp-norm Support Vector Machines (SVMs), the L1-norm SVM enjoys the nice property of simultaneously performing classification and feature selection. Margin error bounds for SVMs on Hilbert spaces (or on more general q-uniformly smooth Banach spaces) have been obtained in the literature to justify the strategy of maximizing the margin in SVMs. In this paper, we estimate the margin error bound for L1-norm SVM methods and give a geometrical interpretation of the result. We show that the fat-shattering dimensions of the Banach spaces ℓ1 and ℓ∞ are both infinite. Therefore, we establish margin error bounds for the SVM on finite-dimensional spaces with the L1-norm, thus supplying statistical justification for the large-margin classification of the L1-norm SVM on finite-dimensional spaces. To complete the theory, corresponding results for the L∞-norm SVM are also presented.
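The property highlighted in the abstract, that an L1 penalty yields sparse weight vectors and hence performs classification and feature selection simultaneously, can be illustrated with a minimal sketch. This assumes scikit-learn's `LinearSVC`; the paper itself does not reference any particular implementation, and the dataset below is synthetic.

```python
# Sketch: L1-penalized linear SVM doing classification and feature
# selection at once (hypothetical illustration, not the paper's code).
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.datasets import make_classification

# Synthetic data: 20 features, of which only 5 are informative.
X, y = make_classification(n_samples=200, n_features=20,
                           n_informative=5, random_state=0)

# penalty="l1" replaces the usual squared-L2 regularizer with the
# L1 norm of the weights, driving many coordinates exactly to zero;
# dual=False is required by LinearSVC for the L1 penalty.
clf = LinearSVC(penalty="l1", dual=False, C=0.1, random_state=0)
clf.fit(X, y)

# Non-zero weights mark the features the classifier retains.
selected = np.flatnonzero(clf.coef_)
print(f"{selected.size} of {X.shape[1]} features kept:", selected)
```

With a small `C` (strong regularization) the fitted weight vector is typically sparse, so the surviving coordinates act as the selected feature subset.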