A Least-Squares Algorithm for the Hard-Margin Support Vector Machine
First published: 2020-09-11
Abstract: The hard-margin support vector machine (SVM) for fully linearly separable data is a convex quadratic programming problem, usually solved with the SMO algorithm; when the training set is large, SVM training demands substantial computation and memory. To overcome this drawback, this paper recasts SVM training as an iterative least-squares solution. The hard-margin SVM is first converted into a least-squares model. The SVM solution depends only on the support vectors and not on the other sample points, whereas a least-squares solution depends on all the samples it is fitted to. By repeatedly shrinking the search range for the support vectors and using the shrunken range as the sample set of the least-squares model, a sequence of least-squares solutions is obtained; the solution to which this iteration converges is the SVM solution. Finally, the paper applies the least-squares method to train a classification SVM on the linearly separable iris dataset; the result agrees exactly with the SVM's convex quadratic programming solution, showing that the proposed least-squares method is feasible and effective.
Keywords: hard-margin support vector machine; least squares method; convex quadratic programming; system of equations
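The abstract only outlines the iteration, so the following is a minimal sketch of the idea in Python, not the paper's actual algorithm. It assumes that candidate support vectors are required to satisfy y_i(w·x_i + b) = 1 exactly, solves that system in the least-squares sense, and shrinks the candidate set to the samples with the smallest functional margin; the name `ls_svm_sketch`, the shrink factor, and the stopping rule are all assumptions of this sketch.

```python
import numpy as np
from sklearn import datasets

def ls_svm_sketch(X, y, n_iter=50, shrink=0.5):
    """Hypothetical sketch of the paper's iteration (the shrinking rule
    and stopping rule are this sketch's assumptions, not the paper's).

    Candidate support vectors are assumed to satisfy y_i (w.x_i + b) = 1
    with equality; that linear system is solved by least squares, then
    the candidate set is shrunk to the samples nearest the margin.
    """
    n, d = X.shape
    idx = np.arange(n)                     # start from all samples
    w, b = np.zeros(d), 0.0
    for _ in range(n_iter):
        # Rows y_i * [x_i, 1]; solve A z ~= 1 for z = [w, b].
        A = y[idx, None] * np.hstack([X[idx], np.ones((len(idx), 1))])
        z, *_ = np.linalg.lstsq(A, np.ones(len(idx)), rcond=None)
        w, b = z[:-1], z[-1]
        # Shrink: keep the samples closest to (or violating) the margin,
        # but never fewer than d + 1, so the system stays determined.
        k = max(d + 1, int(len(idx) * shrink))
        new_idx = np.argsort(y * (X @ w + b))[:k]
        if set(new_idx) == set(idx):       # candidate set is stable
            break
        idx = new_idx
    return w, b

# Linearly separable pair of iris classes (setosa vs. versicolor),
# matching the experiment described in the abstract.
iris = datasets.load_iris()
mask = iris.target < 2
X, y = iris.data[mask], np.where(iris.target[mask] == 0, -1.0, 1.0)
w, b = ls_svm_sketch(X, y)
print(w, b, bool(np.all(y * (X @ w + b) > 0)))  # True if separating
```

On a separable problem, the fixed point of such an iteration is determined by the few samples that remain on the margin, which is exactly the property the abstract exploits: the SVM solution depends only on the support vectors.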