Journal of Software:2002.13(10):2007-2013

An Improved Sequential Minimization Optimization Algorithm for Support Vector Machine Training
SUN Jian,ZHENG Nan-ning,ZHANG Zhi-hua
Received: December 07, 2000    Revised: November 05, 2001
Abstract (translated from the Chinese): For large-scale problems, decomposition methods are the main class of methods for training support vector machines. In many classification problems, a considerable proportion of the support vectors have Lagrange multipliers that reach the penalty upper bound, and these bound multipliers change smoothly during training. Exploiting this statistical property, an efficient caching strategy is proposed to accelerate such decomposition methods, and it is applied to Platt's sequential minimization optimization (SMO) algorithm. Experimental results show that the improved SMO algorithm trains 2 to 3 times faster than the original algorithm.
Abstract: Decomposition methods are the main family of methods for training SVMs (support vector machines) on large-scale problems. In many pattern classification problems, most support vectors' Lagrange multipliers reach the penalty bound, and those multipliers change smoothly during the training phase. Based on these facts, an efficient caching strategy is proposed in this paper to accelerate decomposition methods, and Platt's sequential minimization optimization (SMO) algorithm is improved with this caching strategy. Experimental results show that the modified algorithm can be 2 to 3 times faster than the classical SMO on large real-world data sets.
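The caching idea described in the abstract can be sketched as follows: multipliers sitting at the penalty bound C are marked as cached and skipped during most working-set selections, with a periodic full pass re-examining them since bound multipliers can still move. This is a minimal illustrative sketch, not the authors' implementation; the function names, the toy selection criterion (largest gradient magnitude standing in for the KKT-violation check), and the tolerance parameter are all assumptions for illustration.

```python
import numpy as np

def update_cache(alpha, C, tol=1e-8):
    """Mark multipliers sitting at the penalty bound C as cached.

    Per the paper's observation, these bound multipliers tend to change
    smoothly, so they can be skipped in most selection passes.
    """
    return np.isclose(alpha, C, atol=tol)

def select_working_index(alpha, grad, cached_at_bound, full_check=False):
    """Pick one candidate index to optimize next.

    cached_at_bound: boolean mask of multipliers frozen at the bound.
    full_check: when True, re-examine cached multipliers too (the cache
    is a heuristic; a periodic full pass keeps the method exact).
    Uses a toy criterion: largest gradient magnitude among candidates.
    """
    if full_check:
        candidates = np.arange(len(alpha))
    else:
        candidates = np.flatnonzero(~cached_at_bound)
    return int(candidates[np.argmax(np.abs(grad[candidates]))])
```

In a real decomposition loop, the cache would be refreshed every few hundred iterations and the full KKT check would decide termination; the point here is only that skipping the bound set shrinks the selection cost per iteration.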
Foundation items: Supported by the National Natural Science Foundation of China (60175006, 60024301) and the Foundation for Innovative Research Groups of the NSFC (60024301)


SUN Jian, ZHENG Nan-ning, ZHANG Zhi-hua. An Improved Sequential Minimization Optimization Algorithm for Support Vector Machine Training. Journal of Software, 2002, 13(10): 2007-2013