Journal of Software, 2003, 14(5): 930-935

Research on Stability of Discrete Time Hopfield Network
YE Shi-Wei, ZHENG Hong-Wei, WANG Wen-Jie, MA Lin, SHI Zhong-Zhi
(Key Laboratory of Intelligent Information Processing, Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100080, China; School of Information Science and Engineering, Graduate School of the Chinese Academy of Sciences, Beijing 100039, China; Department of Mathematics, Sichuan Normal University, Chengdu, Sichuan 610066, China)
Received: June 04, 2002    Revised: August 16, 2002
Abstract: This paper considers the discrete-time, continuous-state Hopfield network model whose neuron activation functions are monotonically increasing but not necessarily strictly increasing, and discusses sufficient conditions for convergence under the sequential and parallel update modes, as well as a sufficient condition for the network to have a unique globally stable point. By defining a new energy function and studying the properties of monotonically increasing (not necessarily strictly increasing) functions, sufficient conditions for convergence in the sequential and parallel update modes are given. By studying when the energy function is convex in the network state variables, the operation of a Hopfield network can be regarded as solving a constrained convex optimization problem, which yields a sufficient condition for the network to have only one global minimum. When the self-feedback weight of each neuron is greater than the reciprocal of the derivative of that neuron's activation function, the network converges in the sequential update mode. When the minimal eigenvalue of the connection weight matrix is greater than the reciprocal of the derivative of the activation function, the network converges in the parallel update mode. If the energy function of the network is convex, the network has only one globally stable point. These results broaden the range of admissible activation functions when Hopfield networks are applied to optimization problems and associative memory.
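To make the setting concrete, the following is a minimal, illustrative sketch (not taken from the paper) of a discrete-time, continuous-state Hopfield network run in the two update modes discussed above. The symmetric weight matrix W, the bias b, and the tanh activation are arbitrary assumptions chosen for demonstration; the paper's own energy function is not reproduced here, and the classical quadratic energy is used instead.

import numpy as np

def g(u):
    # monotonically increasing activation (illustrative choice)
    return np.tanh(u)

def step_parallel(x, W, b):
    # parallel mode: all neurons updated simultaneously, x(t+1) = g(W x(t) + b)
    return g(W @ x + b)

def step_sequential(x, W, b):
    # sequential mode: neurons updated one at a time, each seeing the latest states
    x = x.copy()
    for i in range(len(x)):
        x[i] = g(W[i] @ x + b[i])
    return x

def energy(x, W, b):
    # classical quadratic Hopfield energy; the paper defines a new, more general one
    return -0.5 * x @ W @ x - b @ x

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
W = (A + A.T) / 2                  # symmetric connection weights
b = rng.standard_normal(n)

x = rng.standard_normal(n)
for t in range(50):
    x = step_sequential(x, W, b)
print("state after 50 sequential sweeps:", np.round(x, 3))
print("energy at that state:", round(energy(x, W, b), 3))

y = rng.standard_normal(n)
for t in range(50):
    y = step_parallel(y, W, b)
print("state after 50 parallel steps:  ", np.round(y, 3))

The paper's results concern when such sequential or parallel iterations are guaranteed to converge, and when the network has only one globally stable point.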
Foundation item: Supported by the Opening Foundation of the Key Laboratory of Intelligent Information Processing, Institute of Computing Technology, the Chinese Academy of Sciences under Grant No. IIP 2001-5.
Reference text:

YE Shi-Wei, ZHENG Hong-Wei, WANG Wen-Jie, MA Lin, SHI Zhong-Zhi. Research on Stability of Discrete Time Hopfield Network. Journal of Software, 2003, 14(5): 930-935.