Fisher Consistent Cost-Sensitive Boosting Algorithm
Authors: Cao Ying, Miao Qiguang, Liu Jiachen, Gao Lin


Fund Project:

National Natural Science Foundation of China (61072109, 61272280, 41271447, 61272195); Program for New Century Excellent Talents in University (NCET-12-0919); Xi'an Science and Technology Bureau Project (CXY1341(6)); Fundamental Research Funds for the Central Universities (K5051203020, K5051203001, K5051303016, K5051303018, K50513100006)



Abstract:

AdaBoost is an important ensemble learning meta-algorithm, and its core property, "Boosting", is also an effective means of solving cost-sensitive learning problems. However, existing cost-sensitive Boosting algorithms, such as AdaCost and the AdaC and CSB families, rely on heuristic strategies: they insert cost parameters into AdaBoost's voting-weight formula or weight-update rule to force the algorithm to focus on high-cost samples. These heuristics have not been validated by theoretical analysis, and the modifications destroy AdaBoost's most important property, Boosting. Whereas AdaBoost converges to the Bayes decision rule, these cost-sensitive Boosting variants do not converge to the cost-sensitive Bayes decision rule. To address this problem, this paper studies cost-sensitive Boosting algorithms that strictly follow the Boosting theoretical framework. First, the margin-based exponential loss and Logit loss are made cost-sensitive, and the new loss functions are proved to be Fisher consistent in the cost-sensitive sense: in the ideal case, optimizing them ultimately converges to the cost-sensitive Bayes decision rule. Second, optimizing the new losses by gradient descent in function space within the Boosting framework yields the algorithms AsyB and AsyBL. Experimental results on two-dimensional synthetic Gaussian data show that, compared with existing cost-sensitive Boosting algorithms, AsyB and AsyBL effectively approximate the cost-sensitive Bayes decision rule. Results on UCI datasets further verify that AsyB and AsyBL produce cost-sensitive classifiers with lower misclassification costs, and that the misclassification cost decreases exponentially with the number of iterations.
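The abstract does not reproduce the modified losses themselves. As a hedged illustration of how cost-sensitive Fisher consistency can be obtained, the LaTeX sketch below uses the well-known asymmetric exponential loss, which is one plausible construction; the paper's exact definition may differ. Here y ∈ {+1, −1}, C1 is the cost of misclassifying a positive, C2 the cost of misclassifying a negative, and η(x) = P(y = +1 | x):

% Illustrative asymmetric exponential loss (assumption: the paper's
% exact construction may differ).
\[
  \ell(y, f(x)) = \exp\bigl(-c_y\, y\, f(x)\bigr), \qquad
  c_y = \begin{cases} C_1, & y = +1,\\ C_2, & y = -1. \end{cases}
\]
% Minimizing the conditional risk pointwise gives the population minimizer:
\[
  f^{*}(x) = \arg\min_{f}\;\Bigl[\eta(x)\, e^{-C_1 f} + \bigl(1-\eta(x)\bigr)\, e^{C_2 f}\Bigr]
           = \frac{1}{C_1 + C_2} \ln \frac{C_1\, \eta(x)}{C_2\,\bigl(1-\eta(x)\bigr)}.
\]

Since sign(f*(x)) = +1 exactly when C1·η(x) > C2·(1 − η(x)), the pointwise minimizer implements the cost-sensitive Bayes decision rule, which is precisely what Fisher consistency in the cost-sensitive sense requires.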

    Abstract:

AdaBoost is an ensemble learning meta-algorithm. Its most important theoretical property, "Boosting", also plays an important role in cost-sensitive learning. However, existing cost-sensitive Boosting algorithms, such as AdaCost, AdaC1, AdaC2, AdaC3, CSB0, CSB1, and CSB2, are heuristic: they insert cost parameters into the voting-weight formula or the sample-weight updating strategy of AdaBoost so that the algorithm is forced to focus on samples with higher misclassification costs. These heuristic modifications have no theoretical foundation, and, worse, they break AdaBoost's most important theoretical property, namely "Boosting". Whereas AdaBoost converges to the optimal Bayes decision rule, these cost-sensitive variants do not converge to the cost-sensitive Bayes decision rule. This paper studies the design of cost-sensitive Boosting algorithms strictly within the Boosting theoretical framework. First, two new loss functions are constructed by making the exponential loss and the logit loss cost-sensitive. The new loss functions are proved to be Fisher consistent in the cost-sensitive setting; therefore, optimizing them ultimately leads to the cost-sensitive Bayes decision rule. Performing gradient descent in function space to optimize the two losses then yields two new cost-sensitive Boosting algorithms: AsyB and AsyBL. Experimental results on synthetic Gaussian data show that, in comparison with other cost-sensitive Boosting algorithms, AsyB and AsyBL better approximate the cost-sensitive Bayes decision rule. Experimental results on UCI datasets further show that AsyB and AsyBL generate cost-sensitive classifiers with lower misclassification costs, and that the misclassification cost decreases exponentially with the number of iterations.
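To make the "gradient descent in functional space" step concrete, here is a minimal Python sketch of the technique: boosting regression stumps on the asymmetric exponential loss shown above. The function name asymmetric_exp_boost, the parameters c_pos/c_neg, and the shrinkage step size are illustrative assumptions, not the authors' AsyB implementation.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def asymmetric_exp_boost(X, y, c_pos, c_neg, n_rounds=200, shrinkage=0.1):
    # Functional gradient descent on L(y, F) = exp(-c_y * y * F(x)),
    # where c_y = c_pos for y = +1 and c_neg for y = -1.
    # Illustrative sketch only, not the authors' exact AsyB.
    c = np.where(y == 1, c_pos, c_neg)      # per-sample cost parameter c_y
    F = np.zeros(len(y))                    # current ensemble output F(x_i)
    ensemble = []
    for _ in range(n_rounds):
        # Negative functional gradient of the loss at the current F:
        #   -dL/dF = c * y * exp(-c * y * F)   (clipped to avoid overflow)
        residual = c * y * np.exp(np.clip(-c * y * F, -30.0, 30.0))
        stump = DecisionTreeRegressor(max_depth=1).fit(X, residual)
        F += shrinkage * stump.predict(X)   # one small step in function space
        ensemble.append(stump)
    return ensemble

def predict(ensemble, X, shrinkage=0.1):
    # Classify by the sign of the accumulated ensemble output.
    F = sum(shrinkage * stump.predict(X) for stump in ensemble)
    return np.where(F >= 0.0, 1, -1)

With c_pos = c_neg = 1 the update reduces to ordinary boosting of the exponential loss, so the cost parameters are the only change relative to the standard AdaBoost setting; fitting each stump to the negative gradient and taking a small step is the function-space analogue of one gradient-descent step.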

Cite this article:

Cao Y, Miao QG, Liu JC, Gao L. Fisher consistent cost-sensitive Boosting algorithm. Journal of Software, 2013, 24(11): 2584-2596.

History
  • Received: 2013-05-31
  • Revised: 2013-07-17
  • Online: 2013-11-01