Journal of Software, 2018, 29(1): 109-130

Survey on Parallel and Distributed Optimization Algorithms for Scalable Machine Learning
KANG Liang-Yi, WANG Jian-Fei, LIU Jie, YE Dan
(Technology Center of Software Engineering, Institute of Software, The Chinese Academy of Sciences, Beijing 100190, China; University of Chinese Academy of Sciences, Beijing 100190, China; Technology Center of Software Engineering, Institute of Software, The Chinese Academy of Sciences, Beijing 100190, China; State Key Laboratory of Computer Science (Institute of Software, The Chinese Academy of Sciences), Beijing 100190, China)
Received: May 05, 2017    Revised: June 09, 2017
Abstract: Machine learning problems are typically cast as an objective function to be solved, and optimization algorithms are the key tools for estimating its parameters. In the big data setting, parallel and distributed optimization algorithms that exploit multi-core and distributed computing are essential to speed up the training process. In recent years, a large body of research has emerged in this field, and some of the resulting algorithms are widely used on machine learning platforms. This paper surveys the five most common families of optimization algorithms: gradient descent, second-order optimization, proximal gradient, coordinate descent, and the alternating direction method of multipliers (ADMM). Each family is analyzed from both the single-machine parallel and the distributed perspective, and algorithms within a family are compared in detail by model type, input data characteristics, algorithm evaluation, and parallel computation model. In addition, the implementations and applications of these optimization algorithms on representative scalable machine learning platforms are compared. All the algorithms introduced are further organized into a hierarchical classification diagram, which helps users select an appropriate optimization algorithm for a given objective function type, and also supports cross-exploration of how to apply optimization algorithms to new objective function types. Finally, the open problems of existing optimization algorithms are discussed, possible solutions are proposed, and future research directions are outlined.
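As a concrete illustration of the parallel pattern shared by several of the surveyed methods, the sketch below simulates synchronous data-parallel mini-batch gradient descent for least squares: each worker holds a shard of the data and computes a local gradient, and the averaged gradient (an all-reduce in a real cluster) drives a single parameter update. This is a minimal illustrative sketch, not code from the survey; the function name, the in-process shard simulation, and the hyperparameters are our own assumptions.

```python
import numpy as np

def sgd_data_parallel(X, y, n_workers=4, lr=0.1, epochs=200):
    """Synchronous data-parallel gradient descent for least squares.

    Each simulated 'worker' holds one shard of the data; per step, the
    workers' local gradients are averaged (an all-reduce on a real
    cluster) before a single shared parameter update.
    """
    w = np.zeros(X.shape[1])
    shards = np.array_split(np.arange(len(y)), n_workers)
    for _ in range(epochs):
        grads = []
        for idx in shards:  # loop stands in for parallel workers
            Xi, yi = X[idx], y[idx]
            grads.append(Xi.T @ (Xi @ w - yi) / len(idx))  # local gradient
        w -= lr * np.mean(grads, axis=0)  # averaged (all-reduced) update
    return w

# Recover w* = [2, -3] from noiseless synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = X @ np.array([2.0, -3.0])
w = sgd_data_parallel(X, y)
```

With equal shard sizes, the averaged local gradients equal the full-batch gradient, so this synchronous scheme converges exactly like its sequential counterpart; the asynchronous and communication-reduced variants discussed in the survey trade this equivalence for speed.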
Foundation items:National Natural Science Foundation of China (U1435220); Beijing Major Science and Technology Projects (D171100003417002); Civil Aviation Science and Technology Major Project (MHRD20160109)
Citation:


KANG Liang-Yi, WANG Jian-Fei, LIU Jie, YE Dan. Survey on Parallel and Distributed Optimization Algorithms for Scalable Machine Learning. Journal of Software, 2018, 29(1): 109-130