Journal of Software:2020.31(7):2157-2168

Easy Way for Multilayer Gradient Supplies
DU Fei, YANG Yun, HU Yuan-Yuan, CAO Li-Juan
(National Pilot School of Software, Yunnan University, Kunming 650504, China; Kunming Key Laboratory of Data Science and Intelligent Computing, Kunming 650504, China; Yunnan Provincial University Key Laboratory of Data Science and Intelligent Computing, Kunming 650504, China)
Received: November 07, 2017    Revised: March 11, 2018
Abstract: Deep learning allows computational models composed of multiple processing layers to learn representations of data with multiple levels of abstraction. Such models have dramatically improved the state of the art in speech recognition, visual object recognition, natural language processing, and many other domains. However, because of their many layers and large parameter scale, deep networks often suffer from vanishing gradients, convergence to poor local optima, overfitting, and related problems. Drawing on ensemble learning, this study proposes a novel deep sharing ensemble network. By jointly training several independent output layers attached to each hidden layer, the network injects gradients at every depth, supplying gradient signal to the lower hidden layers and thereby reducing gradient vanishing; by ensembling the multiple output layers, the whole network achieves better generalization performance.
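The architecture described in the abstract can be sketched roughly as follows. This is an illustrative reconstruction under stated assumptions, not the authors' code: the class name `SharedEnsembleNet`, the layer sizes, and the use of tanh/softmax are all invented for the sketch, and training (the joint loss over all heads and its backward pass) is omitted. The point shown is structural: every hidden layer carries its own output head, and prediction averages the heads as an ensemble.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # numerically stable row-wise softmax
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

class SharedEnsembleNet:
    """Each hidden layer gets an independent softmax output head.

    Joint training would minimize the sum of all head losses, so every
    head injects gradient at its own depth (the "gradient supply" idea);
    inference ensembles the heads by averaging.
    """
    def __init__(self, dims, n_classes):
        # dims: [input_dim, hidden1, hidden2, ...]
        self.W = [rng.normal(0, 0.1, (a, b)) for a, b in zip(dims, dims[1:])]
        self.H = [rng.normal(0, 0.1, (h, n_classes)) for h in dims[1:]]

    def forward(self, x):
        heads = []
        for W, Hd in zip(self.W, self.H):
            x = np.tanh(x @ W)             # shared hidden activation
            heads.append(softmax(x @ Hd))  # per-layer output head
        return heads

    def predict(self, x):
        # ensemble: average the class distributions of all heads
        return np.mean(self.forward(x), axis=0)

net = SharedEnsembleNet([8, 16, 16, 16], n_classes=3)
x = rng.normal(size=(5, 8))
p = net.predict(x)
print(p.shape)                           # (5, 3)
print(np.allclose(p.sum(axis=1), 1.0))   # True: an average of distributions is a distribution
```

Because each head's loss backpropagates only through the layers below it, low layers receive gradient from every head, whereas in a standard deep network they receive it only through the full depth of the stack.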
Article number:     CLC number: TP182    Document code:
Foundation items:National Natural Science Foundation of China (61663046, 61876166); Yunnan Applied Fundamental Research Project (2016FB104); Yunnan Provincial Young Academic and Technical Leaders Reserve Talents (2017HB005); Yunnan Provincial Innovation Team (2017HC012); Yunnan Provincial University Key Laboratory Construction Plan Fund
Reference text:


DU Fei, YANG Yun, HU Yuan-Yuan, CAO Li-Juan. Easy Way for Multilayer Gradient Supplies. Journal of Software, 2020, 31(7): 2157-2168.