Journal of Software, 2015, Vol.26, No.7: 1662-1674
Logarithmic and Exponential Morphological Associative Memories
FENG Nai-Qin1, 4 , TIAN Yong2, WANG Xian-Fang1, SONG Li-Ming1, FAN Hai-Ju1, WANG Shuang-Xi3    
1. College of Computer and Information Engineering, He'nan Normal University, Xinxiang 453007, China;
2. Department of Information Engineering, He'nan Mechanical and Electrical Vocational College, Zhengzhou 451191, China;
3. College of Computer Science and Technology, Shangqiu University, Shangqiu 476000, China;
4. School of Information Engineering, Zhengzhou University of Industrial Technology, Zhengzhou 451100, China
Abstract: A novel morphological associative memory method, abbreviated as LEMAM, is constructed by using logarithmic and exponential operators. Theoretical analysis shows that auto LEMAM (abbreviated as ALEMAM), which has unlimited storage capacity, one-step recall, and a certain ability to resist erosive or dilative noise, can ensure perfect recall for either perfect inputs or inputs within a certain noise range. Hetero LEMAM (abbreviated as HLEMAM) does not guarantee perfect recall, even without any input noise; however, when certain conditions are met, HLEMAM can also achieve perfect recall. Contrast experiments show that, in some cases, LEMAM produces better results. On balance, LEMAM enriches the theory and practice of morphological associative memories, and can serve as a new kind of neural computational model for research and application.
Key words: logarithm; exponent; morphological associative memories; autoassociative memories; heteroassociative memories; perfect recall

As experts have pointed out, associative memory (AM) is a function of the human brain and a source of logical and imaginal thinking, of reasoning, and of innovation [1]. Realizing, fully or in part, brain-like functions by machine has long been one of mankind's goals. Artificial associative memory has two branches that have developed in parallel:

· One is classical associative memory, such as the Hopfield associative memory;

· The other is morphological associative memory (MAM), such as real MAM (RMAM) [2], complex MAM (CMAM) [3], fuzzy MAM (FMAM) [4, 5], and enhanced FMAM (EFMAM) [6].

Morphological associative memories form a large family comprising several subclasses. From an object-oriented point of view, Feng et al. proposed a unified framework of MAM in the complex domain (UFMAMCD) [7], which unifies several basic MAMs. Its aim is to enrich and develop MAM theory, to reveal the essence of MAM more deeply, and to explore and discover new morphological associative memory methods.

The Hopfield associative memory network requires the stored patterns (vectors) to be orthogonal, and its storage capacity is very limited: the literature indicates that its capacity does not exceed 15% of the total number of network neurons [8], which wastes a large number of neurons, and it also suffers from convergence problems. Morphological associative memories, by contrast, are built on mathematical morphology and rest on rigorous theory and methods. Autoassociative morphological memories have unlimited storage capacity; they recall in one step, so no convergence problem arises; they have a certain ability to resist erosive or dilative noise; and they guarantee perfect recall for perfect inputs or inputs within a certain noise range. Heteroassociative morphological memories do not guarantee perfect recall even for perfect inputs, but under certain conditions they too can achieve it. Owing to these outstanding advantages and characteristics, MAM has attracted much attention in recent years and has been widely applied in pattern recognition and image processing [9, 10, 11, 12, 13], perception [14], learning [15, 16], and classification and prediction [17, 18].

One of the main problems of MAM is that heteroassociative morphological memories (HMM) are imperfect: even when the input is perfect, perfect recall is not guaranteed, which greatly limits the application of HMM. To address this problem, Feng et al. proposed a solution [19] called heteroassociative morphological memories based on four-dimensional storage (FDSHMM), which achieves perfect recall of HMM for perfect inputs or inputs within a certain noise range. FDSHMM is suitable for small-scale problems or applications without strict real-time requirements. Since no single MAM method is perfect at present, studying new MAM methods so that several methods can complement one another is an effective way to tackle the HMM problem.

On the other hand, we regard UFMAMCD as a framework with a certain "capacity". It accommodates some existing MAM methods, but it is not yet "saturated": it may still accommodate new, as yet undiscovered methods that fit the framework. Such new methods differ markedly from existing ones in operators and performance and can complement other MAM methods. Clearly, discovering the new methods implicit in UFMAMCD, further enriching the framework's content, and perfecting its system is an important topic in MAM research.

Following the ideas of the above analysis, this paper constructs a new morphological associative memory method using logarithmic and exponential operations, abbreviated LEMAM (logarithmic and exponential MAM). Theoretical analysis shows that autoassociative LEMAM (auto LEMAM, ALEMAM) has unlimited storage capacity, one-step recall, and a certain ability to resist erosive or dilative noise, and guarantees perfect recall for perfect inputs or inputs within a certain noise range; heteroassociative LEMAM (hetero LEMAM, HLEMAM) does not guarantee perfect recall even for perfect inputs, but it achieves perfect recall when the conditions of the perfect recall theorem and the noise theorem are satisfied. Contrast experiments show that in some cases LEMAM achieves better associative memory results. For autoassociation, the noise resistance of ALEMAM is clearly better than that of RMAM and no worse than that of FMAM; for heteroassociation, HLEMAM resists mixed noise markedly better than HRMAM and HFMAM. The Iris experiment shows that, in training accuracy and prediction/classification accuracy, HLEMAM outperforms HRMAM and is slightly better than or no worse than HFMAM. The Wine experiment yields even stronger results: HLEMAM outperforms both HRMAM and HFMAM in learning accuracy and prediction/classification accuracy. In addition, the neural computation of LEMAM resembles the logarithmic characteristics of human hearing, so LEMAM may have a potential role in simulating hearing-triggered associative memory and in studying the neural computing mechanism of the auditory nerve. LEMAM can therefore be studied and used as a new morphological neural computing model; this paper lays the foundation and accumulates material for that purpose.

Section 1 briefly introduces the basic principles of MAM from the perspective of UFMAMCD. Section 2 describes the basic principles of LEMAM. Section 3 discusses ALEMAM. Section 4 discusses the noise performance of LEMAM, including the noise boundary conditions and the noise performance of ALEMAM and HLEMAM. Section 5 demonstrates the classification and prediction capability and application prospects of HLEMAM through the Iris and Wine experiments. Finally, Section 6 concludes the paper.

1 Overview of UFMAMCD

In UFMAMCD, the basic computation is based on the algebraic lattice structure (U, ∧, ∨, O), where:

· U denotes a non-empty set or field, e.g., U=R, U=R+, or U=C;

· ∧ and ∨ denote the minimum and maximum operations, respectively;

· O denotes an operation closed on U: for example, O = + or -, closed on R; O = × or /, closed on R+; or O = Log or Exp, closed on R>0 (with the base and the antilogarithm of the logarithm restricted to be > 1). We use Θ to denote the inverse operation of O.

In UFMAMCD, objects must satisfy the following admissibility conditions:

(1) Orderedness: for a, b ∈ U, either a ≤ b or b ≤ a;

(2) Closedness: for a, b ∈ U, if a O b = r, then r ∈ U;

(3) Correctness: the correct rules of operation are followed.

MAM objects satisfying these conditions are admissible to UFMAMCD.

Given a pair of pattern vectors $x = ({x_1},...,{x_n})'$ and $y = ({y_1},...,{y_m})'$, the morphological associative memory W that recalls y under the input x is defined as

$W = y\mathop \wedge \limits^{\rm O} x' = \left[ {\begin{array}{*{20}{c}} {{y_1}{\rm O}{x_1}}& \cdots &{{y_1}{\rm O}{x_n}} \\ \vdots & \ddots & \vdots \\ {{y_m}{\rm O}{x_1}}& \cdots &{{y_m}{\rm O}{x_n}} \end{array}} \right]$ (1)

This is because W satisfies the following equation:

$W\mathop \vee \limits^\Theta x = \left[ {\begin{array}{*{20}{c}} {\mathop \vee \limits_{i = 1}^n ({y_1}{\rm O}{x_i}\Theta {x_i})}\\ \vdots \\ {\mathop \vee \limits_{i = 1}^n ({y_m}{\rm O}{x_i}\Theta {x_i})} \end{array}} \right] = y$ (2)

Another morphological associative memory M can be defined as

$M = y\mathop \vee \limits^{\rm O} x' = \left[ {\begin{array}{*{20}{c}} {{y_1}{\rm O}{x_1}}& \cdots &{{y_1}{\rm O}{x_n}} \\ \vdots & \ddots & \vdots \\ {{y_m}{\rm O}{x_1}}& \cdots &{{y_m}{\rm O}{x_n}} \end{array}} \right]$ (3)

M also satisfies the following equation:

$M\mathop \wedge \limits^\Theta x = \left[ {\begin{array}{*{20}{c}} {\mathop \wedge \limits_{i = 1}^n ({y_1}{\rm O}{x_i}\Theta {x_i})}\\ \vdots \\ {\mathop \wedge \limits_{i = 1}^n ({y_m}{\rm O}{x_i}\Theta {x_i})} \end{array}} \right] = y$ (4)

Similarly, let $({x^1},{y^1}),...,({x^k},{y^k})$ be k vector pairs, where the ξ-th input and output vectors are, respectively,

${x^\xi } = (x_1^\xi ,...,x_n^\xi )',{y^\xi } = (y_1^\xi ,...,y_m^\xi )'.$

For a given set of pattern associations {(x^ξ, y^ξ): ξ=1,…,k}, we define a pair of associated pattern matrices (X,Y), where $X = ({x^1},...,{x^k})$ and $Y = ({y^1},...,{y^k})$. Thus, X is an n×k matrix and Y is an m×k matrix. For the matrix pair (X,Y), the two natural morphological m×n memories W_XY and M_XY are defined as follows:

${W_{XY}} = Y\mathop \wedge \limits^{\rm O} X' = \wedge _{\xi = 1}^k\left[ {{y^\xi }\mathop \wedge \limits^{\rm O} ({x^\xi })'} \right]$ (5)
${M_{XY}} = Y\mathop \vee \limits^{\rm O} X' = \vee _{\xi = 1}^k\left[ {{y^\xi }\mathop \vee \limits^{\rm O} ({x^\xi })'} \right]$ (6)

Obviously, ${y^\xi }\mathop \wedge \limits^{\rm O} ({x^\xi })' = {y^\xi }\mathop \vee \limits^{\rm O} ({x^\xi })'$ (for a single pattern pair, the minimum and maximum over ξ coincide), so the following inequality holds:

${W_{XY}} \le {y^\xi }\mathop \wedge \limits^{\rm O} ({x^\xi })' = {y^\xi }\mathop \vee \limits^{\rm O} ({x^\xi })' \le {M_{XY}},\forall \xi = 1,...,k$ (7)

Meanwhile, by Eqs. (2) and (4), inequality (7) implies:

${W_{XY}}\mathop \vee \limits^\Theta {x^\xi } \le \left[ {{y^\xi }\mathop \wedge \limits^{\rm O} ({x^\xi })'} \right]\mathop \vee \limits^\Theta {x^\xi } = {y^\xi } = \left[ {{y^\xi }\mathop \vee \limits^{\rm O} ({x^\xi })'} \right]\mathop \wedge \limits^\Theta {x^\xi } \le {M_{XY}}\mathop \wedge \limits^\Theta {x^\xi },\forall \xi = 1,...,k$ (8)

Equivalently:

${W_{XY}}\mathop \vee \limits^\Theta X \le Y \le {M_{XY}}\mathop \wedge \limits^\Theta X$ (9)
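To make the framework concrete, the following is a minimal sketch (in Python with NumPy; the paper's own experiments use MATLAB, and the function names here are ours) of the instance O = '-' with inverse Θ = '+', i.e., the RMAM form of Eqs. (5), (6) and of the recall rules (2) and (4):

    import numpy as np

    def rmam_train(X, Y):
        # Columns of X (n x k) and Y (m x k) are the pattern pairs.
        # Eqs. (5)/(6) with O = '-': w_ij = min_l (y_i^l - x_j^l), m_ij = max_l (y_i^l - x_j^l)
        D = Y[:, None, :] - X[None, :, :]      # D[i, j, l] = y_i^l - x_j^l
        return D.min(axis=2), D.max(axis=2)    # W_XY, M_XY

    def rmam_recall_W(W, x):
        # Recall of Eq. (2) with Theta = '+': output_i = max_j (w_ij + x_j)
        return (W + x[None, :]).max(axis=1)

    def rmam_recall_M(M, x):
        # Recall of Eq. (4) with Theta = '+': output_i = min_j (m_ij + x_j)
        return (M + x[None, :]).min(axis=1)

On any data, these satisfy inequality (9): rmam_recall_W(W, X[:, l]) <= Y[:, l] <= rmam_recall_M(M, X[:, l]) for every l.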
2 Logarithmic-Exponential Morphological Associative Memory (LEMAM)

Inspired by the unified framework of morphological associative memories [7], we replace the subtraction and addition operations of RMAM with logarithmic and exponential operations, respectively, obtaining a new morphological associative memory based on logarithms and exponentials, i.e., LEMAM.

According to the unified framework of morphological associative memories, the basic computation rests on the algebraic lattice structure (U,∧,∨,O), where U is a non-empty set or field and O is an operation closed on U. Let U=R and let O denote the logarithm/exponential (Log/Exp) operations on U; then (U,∧,∨,O)=(R,∧,∨,Log/Exp) constitutes the computational basis of LEMAM. For convenience, for the logarithmic function $y=\log_a x$ we assume a>1 and x>1 in the following discussion; extensions will be made and discussed separately when needed.

Given a pair of vectors $x = ({x_1},...,{x_n})' \in R_{ > 1}^n$ and $y = ({y_1},...,{y_m})' \in R_{ > 1}^m$, the logarithmic-exponential morphological associative memory V that recalls the vector y when the input vector x is presented to the LEMAM network is defined as follows:

$V = y\mathop \wedge \limits^{\log } x' = \left( {\begin{array}{*{20}{c}} {{{\log }_{{x_1}}}{y_1}}& \ldots &{{{\log }_{{x_n}}}{y_1}} \\ \vdots & \ddots & \vdots \\ {{{\log }_{{x_1}}}{y_m}}& \cdots &{{{\log }_{{x_n}}}{y_m}} \end{array}} \right)$ (10)

This is because V satisfies the equation $V\mathop \vee \limits^{\exp } x = y,$ that is:

$V\mathop \vee \limits^{\exp } x = \left( {\begin{array}{*{20}{c}} {\mathop \vee \limits_{i = 1}^n x_i^{{{\log }_{{x_i}}}{y_1}}}\\ \vdots \\ {\mathop \vee \limits_{i = 1}^n x_i^{{{\log }_{{x_i}}}{y_m}}} \end{array}} \right) = y$ (11)

Likewise, the maximum operator $\mathop \vee \limits^{\log } $ can be used to define another morphological associative memory T of y and x:

$T = y\mathop \vee \limits^{\log } x' = \left( {\begin{array}{*{20}{c}} {{{\log }_{{x_1}}}{y_1}}& \ldots &{{{\log }_{{x_n}}}{y_1}} \\ \vdots & \ddots & \vdots \\ {{{\log }_{{x_1}}}{y_m}}& \cdots &{{{\log }_{{x_n}}}{y_m}} \end{array}} \right)$ (12)

T satisfies the equation $T\mathop \wedge \limits^{\exp } x = y,$ that is:

$T\mathop \wedge \limits^{\exp } x = \left( {\begin{array}{*{20}{c}} {\mathop \wedge \limits_{i = 1}^n x_i^{{{\log }_{{x_i}}}{y_1}}} \\ \vdots \\ {\mathop \wedge \limits_{i = 1}^n x_i^{{{\log }_{{x_i}}}{y_m}}} \end{array}} \right) = y$ (13)

Similarly, let $({x^1},{y^1}),...,({x^k},{y^k})$ be k vector pairs, with input vectors ${x^l} = (x_1^l,...,x_n^l)' \in R_{ > 1}^n$ and output vectors ${y^l} = (y_1^l,...,y_m^l)' \in R_{ > 1}^m$, l=1,…,k. (By means of a suitable transformation, the input and output vectors can always be confined to R>1.) For a given set of pattern associations {(x^l, y^l): l=1,…,k}, a pair of associated pattern matrices (X,Y) can be defined, where $X = ({x^1},...,{x^k})$ and $Y = ({y^1},...,{y^k})$. Thus, X is an n×k matrix whose (i,j) entry is $x_i^j,$ and Y is an m×k matrix whose (i,j) entry is $y_i^j.$ For (X,Y), the following two natural morphological m×n memories V_XY and T_XY are defined:

${V_{XY}} = \mathop \wedge \limits_{l = 1}^k \left[ {{y^l}\mathop \wedge \limits^{\log } ({x^l})'} \right]$ (14)
${T_{XY}} = \mathop \vee \limits_{l = 1}^k \left[ {{y^l}\mathop \vee \limits^{\log } ({x^l})'} \right]$ (15)

VXYTXY中对应的第i,j项分别由下列式子表示:

${v_{ij}} = \mathop \wedge \limits_{l = 1}^k ({\log _{x_j^l}}y_i^l) = ({\log _{x_j^1}}y_i^1) \wedge ... \wedge ({\log _{x_j^k}}y_i^k)$ (16)
${t_{ij}} = \mathop \vee \limits_{l = 1}^k ({\log _{x_j^l}}y_i^l) = ({\log _{x_j^1}}y_i^1) \vee ... \vee ({\log _{x_j^k}}y_i^k)$ (17)

Clearly, ${y^l}\mathop \wedge \limits^{\log } ({x^l})' = {y^l}\mathop \vee \limits^{\log } ({x^l})'.$ From this definition, the following holds:

${V_{XY}} \le {y^l}\mathop \wedge \limits^{\log } ({x^l})' = {y^l}\mathop \vee \limits^{\log } ({x^l})' \le {T_{XY}},\forall l = 1,...,k$ (18)

By Eqs. (11) and (13)-(15), inequality (18) implies:

${V_{XY}}\mathop \vee \limits^{\exp } {x^l} \le \left[ {{y^l}\mathop \wedge \limits^{\log } ({x^l})'} \right]\mathop \vee \limits^{\exp } {x^l} = {y^l} = \left[ {{y^l}\mathop \vee \limits^{\log } ({x^l})'} \right]\mathop \wedge \limits^{\exp } {x^l} \le {T_{XY}}\mathop \wedge \limits^{\exp } {x^l},\forall l = 1,...,k$ (19)

Or, equivalently:

${V_{XY}}\mathop \vee \limits^{\exp } X \le Y \le {T_{XY}}\mathop \wedge \limits^{\exp } X$ (20)

If ${V_{XY}}\mathop \vee \limits^{\exp } X = Y,$ then V_XY is called a $\mathop \vee \limits^{\exp } $ perfect recall memory for (X,Y); if ${T_{XY}}\mathop \wedge \limits^{\exp } X = Y,$ then T_XY is called a $\mathop \wedge \limits^{\exp } $ perfect recall memory for (X,Y).
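To make the definitions concrete, the following is a minimal sketch (Python/NumPy; the function names are ours, and all pattern entries are assumed to be > 1 as required) of Eqs. (14)-(17) and the recall rules (11) and (13), using the change of base $\log_{x_j^l}y_i^l = \ln y_i^l/\ln x_j^l$:

    import numpy as np

    def lemam_train(X, Y):
        # Columns of X (n x k) and Y (m x k) are the pattern pairs; all entries > 1.
        # L[i, j, l] = log_{x_j^l}(y_i^l) = ln(y_i^l) / ln(x_j^l)
        L = np.log(Y)[:, None, :] / np.log(X)[None, :, :]
        V = L.min(axis=2)    # v_ij of Eq. (16)
        T = L.max(axis=2)    # t_ij of Eq. (17)
        return V, T

    def recall_max(V, x):
        # Eq. (11): output_i = max_j x_j ** v_ij
        return (x[None, :] ** V).max(axis=1)

    def recall_min(T, x):
        # Eq. (13): output_i = min_j x_j ** t_ij
        return (x[None, :] ** T).min(axis=1)

These helper functions are reused in the sketches below.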

Theorem 1. If A is a $\mathop \vee \limits^{\exp } $ perfect recall memory for (X,Y) and B is a $\mathop \wedge \limits^{\exp } $ perfect recall memory for (X,Y), then

$A \le {V_{XY}} \le {T_{XY}} \le B$, and ${V_{XY}}\mathop \vee \limits^{\exp } X = Y = {T_{XY}}\mathop \wedge \limits^{\exp } X$ (21)

Proof: If A is a $\mathop \vee \limits^{\exp } $ perfect recall memory for (X,Y), then ${(A\mathop \vee \limits^{\exp } {x^l})_i} = y_i^l$, ∀l=1,…,k, ∀i=1,…,m. Equivalently,

$\mathop \vee \limits_{j = 1}^n {(x_j^l)^{{a_{ij}}}} = y_i^l,\forall l = 1,...,k$ and $\forall i = 1,...,m.$

Hence every index j∈{1,…,n} satisfies ${(x_j^l)^{{a_{ij}}}} \le y_i^l,\forall l = 1,...,k.$ Taking logarithms (base $x_j^l$) of both sides of this inequality, we obtain:

${\log _{x_j^l}}{(x_j^l)^{{a_{ij}}}} \le {\log _{x_j^l}}y_i^l,\forall l = 1,...,k \Leftrightarrow {a_{ij}} \le {\log _{x_j^l}}y_i^l,\forall l = 1,...,k \Leftrightarrow {a_{ij}} \le \mathop \wedge \limits_{l = 1}^k ({\log _{x_j^l}}y_i^l) = {v_{ij}}.$

This shows that $A \le {V_{XY}}$. By Eq. (20), we then have $Y = A\mathop \vee \limits^{\exp } X \le {V_{XY}}\mathop \vee \limits^{\exp } X \le Y.$ Therefore, ${V_{XY}}\mathop \vee \limits^{\exp } X = Y.$

Similarly, it can be shown that if B is a $\mathop \wedge \limits^{\exp } $ perfect recall memory for (X,Y), then ${T_{XY}} \le B$ and ${T_{XY}}\mathop \wedge \limits^{\exp } X = Y.$ Thus, by Eq. (18), we have $A \le {V_{XY}} \le {T_{XY}} \le B$ and ${V_{XY}}\mathop \vee \limits^{\exp } X = Y = {T_{XY}}\mathop \wedge \limits^{\exp } X.$ □

This theorem shows that V_XY is the least upper bound of all $\mathop \vee \limits^{\exp } $ perfect recall memories, and T_XY is the greatest lower bound of all $\mathop \wedge \limits^{\exp } $ perfect recall memories.

Note, however, that Theorem 1 does not say whether a perfect recall memory for (X,Y) exists at all. Under what conditions, then, do V_XY and T_XY become $\mathop \vee \limits^{\exp } $ or $\mathop \wedge \limits^{\exp } $ perfect recall memories for (X,Y)? Theorem 2 answers this question.

Theorem 2 (perfect recall theorem). V_XY is a $\mathop \vee \limits^{\exp } $ perfect recall memory for (X,Y) if and only if, for each l=1,…,k, every row of the matrix $\left[ {{y^l}\mathop \wedge \limits^{\log } ({x^l})'} \right] - $ V_XY contains a zero entry; similarly, T_XY is a $\mathop \wedge \limits^{\exp } $ perfect recall memory for (X,Y) if and only if, for each l=1,…,k, every row of the matrix ${T_{XY}} - \left[ {{y^l}\mathop \wedge \limits^{\log } ({x^l})'} \right]$ contains a zero entry.

Proof: For ∀l=1,…,k and ∀i=1,…,m, V_XY is a $\mathop \vee \limits^{\exp } $ perfect recall memory for (X,Y)

$\begin{array}{l} \Leftrightarrow {\left( {{V_{XY}}\mathop \vee \limits^{\exp } {x^l}} \right)_i} = y_i^l \Leftrightarrow \frac{{y_i^l}}{{{{\left( {{V_{XY}}\mathop \vee \limits^{\exp } {x^l}} \right)}_i}}} = 1 \Leftrightarrow \frac{{y_i^l}}{{\mathop \vee \limits_{j = 1}^n {{(x_j^l)}^{{v_{ij}}}}}} = 1 \Leftrightarrow \mathop \wedge \limits_{j = 1}^n \frac{{y_i^l}}{{{{(x_j^l)}^{{v_{ij}}}}}} = 1\\ \Leftrightarrow \mathop \wedge \limits_{j = 1}^n ({\log _{x_j^l}}y_i^l - {v_{ij}}) = {\log _{x_j^l}}1 = 0 \Leftrightarrow \mathop \wedge \limits_{j = 1}^n {\left( {\left[ {{y^l}\mathop \wedge \limits^{\log } ({x^l})'} \right] - {V_{XY}}} \right)_{ij}} = 0. \end{array}$

The last equality holds if and only if, for each l=1,…,k and each i=1,…,m, the corresponding row of the matrix $\left[ {{y^l}\mathop \wedge \limits^{\log } ({x^l})'} \right] - {V_{XY}}$ contains at least one zero entry. We prove the claim only for the memory V_XY; the other half can be proved analogously. □
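The zero-row condition of Theorem 2 can be checked mechanically. A sketch (Python/NumPy, in the style of the lemam_train sketch above; the floating-point tolerance is our own addition):

    import numpy as np

    def is_perfect_recall_V(X, Y, tol=1e-9):
        # L[:, :, l] is the matrix [y^l log-wedge (x^l)'] of Theorem 2.
        L = np.log(Y)[:, None, :] / np.log(X)[None, :, :]
        V = L.min(axis=2)
        # Perfect recall iff, for every l, each row of L[:, :, l] - V
        # contains a zero entry.
        return all(np.isclose(L[:, :, l] - V, 0, atol=tol).any(axis=1).all()
                   for l in range(X.shape[1]))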

Corollary 1. ${V_{XY}}\mathop \vee \limits^{\exp } X = Y$ if and only if, for each row index i=1,…,m and each γ∈{1,…,k}, there exists a column index j∈{1,…,n}, depending on i and γ, such that:

$x_j^\gamma = \mathop \vee \limits_{l = 1}^k {(y_i^\gamma )^{{{\log }_{y_i^l}}x_j^l}}$ (22)

Similarly, ${T_{XY}}\mathop \wedge \limits^{\exp } X = Y$ if and only if, for each row index i=1,…,m and each γ∈{1,…,k}, there exists a column index j∈{1,…,n}, depending on i and γ, such that:

$x_j^\gamma = \mathop \wedge \limits_{l = 1}^k {(y_i^\gamma )^{{{\log }_{y_i^l}}x_j^l}}$ (23)

Proof: By Theorem 2, ${V_{XY}}\mathop \vee \limits^{\exp } {x^l} = {y^l},$ ∀l=1,…,k, if and only if, for each row index i=1,…,m and each γ∈{1,…,k}, there exists a column index j∈{1,…,n}, depending on i and γ, such that:

${\log _{x_j^\gamma }}y_i^\gamma = {v_{ij}} = \mathop \wedge \limits_{l = 1}^k {\log _{x_j^l}}y_i^l$ (24)

This equation holds if and only if:

${\log _{y_i^\gamma }}x_j^\gamma = \mathop \vee \limits_{l = 1}^k {\log _{y_i^l}}x_j^l$ (25)

or, equivalently, if and only if:

$x_j^\gamma = \mathop \vee \limits_{l = 1}^k {(y_i^\gamma )^{{{\log }_{y_i^l}}x_j^l}}$ (26)

Example 1. Let ${x^1} = \left( {\begin{array}{*{20}{c}} 4\\ 2\\ 4 \end{array}} \right),{y^1} = \left( {\begin{array}{*{20}{c}} {16}\\ 2\\ 4 \end{array}} \right);{x^2} = \left( {\begin{array}{*{20}{c}} 3\\ 3\\ 9 \end{array}} \right),{y^2} = \left( {\begin{array}{*{20}{c}} 9\\ 3\\ 3 \end{array}} \right);{x^3} = \left( {\begin{array}{*{20}{c}} 5\\ {25}\\ {125} \end{array}} \right),{y^3} = \left( {\begin{array}{*{20}{c}} {25}\\ {25}\\ 5 \end{array}} \right).$

· Memory phase:

$\begin{gathered} {V_{XY}} = \mathop \wedge \limits_{l = 1}^3 \left( {{y^l}\mathop \wedge \limits^{\log } ({x^l})'} \right) \hfill \\ {\text{ }} = \left( {\begin{array}{*{20}{c}} {16} \\ 2 \\ 4 \end{array}} \right)\mathop \wedge \limits^{\log } {\left( {\begin{array}{*{20}{c}} 4 \\ 2 \\ 4 \end{array}} \right)^\prime } \wedge \left( {\begin{array}{*{20}{c}} 9 \\ 3 \\ 3 \end{array}} \right)\mathop \wedge \limits^{\log } {\left( {\begin{array}{*{20}{c}} 3 \\ 3 \\ 9 \end{array}} \right)^\prime } \wedge \left( {\begin{array}{*{20}{c}} {25} \\ {25} \\ 5 \end{array}} \right)\mathop \wedge \limits^{\log } {\left( {\begin{array}{*{20}{c}} 5 \\ {25} \\ {125} \end{array}} \right)^\prime } \hfill \\ {\text{ }} = \left( {\begin{array}{*{20}{c}} 2&4&2 \\ {\frac{1}{2}}&1&{\frac{1}{2}} \\ 1&2&1 \end{array}} \right) \wedge \left( {\begin{array}{*{20}{c}} 2&2&1 \\ 1&1&{\frac{1}{2}} \\ 1&1&{\frac{1}{2}} \end{array}} \right) \wedge \left( {\begin{array}{*{20}{c}} 2&1&{\frac{2}{3}} \\ 2&1&{\frac{2}{3}} \\ 1&{\frac{1}{2}}&{\frac{1}{3}} \end{array}} \right) \hfill \\ {\text{ }} = \left( {\begin{array}{*{20}{c}} 2&1&{\frac{2}{3}} \\ {\frac{1}{2}}&1&{\frac{1}{2}} \\ 1&{\frac{1}{2}}&{\frac{1}{3}} \end{array}} \right). \hfill \\ \end{gathered} $

· Recall phase: applying V_XY, we obtain:

$\begin{gathered} {V_{XY}}\mathop \vee \limits^{\exp } {x^1} = \left( {\begin{array}{*{20}{c}} 2&1&{\frac{2}{3}} \\ {\frac{1}{2}}&1&{\frac{1}{2}} \\ 1&{\frac{1}{2}}&{\frac{1}{3}} \end{array}} \right)\mathop \vee \limits^{\exp } \left( {\begin{array}{*{20}{c}} 4 \\ 2 \\ 4 \end{array}} \right) = \left( {\begin{array}{*{20}{c}} {{4^2} \vee {2^1} \vee {4^{\frac{2}{3}}}} \\ {{4^{\frac{1}{2}}} \vee {2^1} \vee {4^{\frac{1}{2}}}} \\ {{4^1} \vee {2^{\frac{1}{2}}} \vee {4^{\frac{1}{3}}}} \end{array}} \right) = \left( {\begin{array}{*{20}{c}} {16} \\ 2 \\ 4 \end{array}} \right) = {y^1}, \hfill \\ {V_{XY}}\mathop \vee \limits^{\exp } {x^2} = \left( {\begin{array}{*{20}{c}} 2&1&{\frac{2}{3}} \\ {\frac{1}{2}}&1&{\frac{1}{2}} \\ 1&{\frac{1}{2}}&{\frac{1}{3}} \end{array}} \right)\mathop \vee \limits^{\exp } \left( {\begin{array}{*{20}{c}} 3 \\ 3 \\ 9 \end{array}} \right) = \left( {\begin{array}{*{20}{c}} {{3^2} \vee {3^1} \vee {9^{\frac{2}{3}}}} \\ {{3^{\frac{1}{2}}} \vee {3^1} \vee {9^{\frac{1}{2}}}} \\ {{3^1} \vee {3^{\frac{1}{2}}} \vee {9^{\frac{1}{3}}}} \end{array}} \right) = \left( {\begin{array}{*{20}{c}} 9 \\ 3 \\ 3 \end{array}} \right) = {y^2}, \hfill \\ {V_{XY}}\mathop \vee \limits^{\exp } {x^3} = \left( {\begin{array}{*{20}{c}} 2&1&{\frac{2}{3}} \\ {\frac{1}{2}}&1&{\frac{1}{2}} \\ 1&{\frac{1}{2}}&{\frac{1}{3}} \end{array}} \right)\mathop \vee \limits^{\exp } \left( {\begin{array}{*{20}{c}} 5 \\ {25} \\ {125} \end{array}} \right) = \left( {\begin{array}{*{20}{c}} {{5^2} \vee {{25}^1} \vee {{125}^{\frac{2}{3}}}} \\ {{5^{\frac{1}{2}}} \vee {{25}^1} \vee {{125}^{\frac{1}{2}}}} \\ {{5^1} \vee {{25}^{\frac{1}{2}}} \vee {{125}^{\frac{1}{3}}}} \end{array}} \right) = \left( {\begin{array}{*{20}{c}} {25} \\ {25} \\ 5 \end{array}} \right) = {y^3}. \hfill \\ \end{gathered}$

Recall with T_XY likewise yields perfect recall of the original pattern pairs; the reader may verify this.
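The computation of Example 1 can be replayed with the sketch functions from Section 2:

    import numpy as np
    X = np.array([[4, 3, 5], [2, 3, 25], [4, 9, 125]], dtype=float)  # columns x^1, x^2, x^3
    Y = np.array([[16, 9, 25], [2, 3, 25], [4, 3, 5]], dtype=float)  # columns y^1, y^2, y^3
    V, T = lemam_train(X, Y)
    # V equals the matrix computed above: [[2, 1, 2/3], [1/2, 1, 1/2], [1, 1/2, 1/3]]
    for l in range(3):
        print(recall_max(V, X[:, l]))   # prints y^1, y^2, y^3 in turn
        print(recall_min(T, X[:, l]))   # T_XY also recalls each y^l perfectly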

3 Autoassociative Logarithmic-Exponential Morphological Associative Memory (ALEMAM)

In the logarithmic-exponential morphological associative memory, if X=Y, i.e., x^l=y^l, ∀l=1,…,k, then V_XY=V_XX and T_XY=T_XX. LEMAM then becomes autoassociative LEMAM, i.e., ALEMAM.

Theorem 3. ${V_{XX}}\mathop \vee \limits^{\exp } X = X,{T_{XX}}\mathop \wedge \limits^{\exp } X = X.$

Proof: At the memory stage, for ∀i∈{1,…,n} and ∀l=1,…,k, we have:

${\left[ {{x^l}\mathop \wedge \limits^{\log } ({x^l})'} \right]_{ii}} = {\log _{x_i^l}}x_i^l = 1,{v_{ii}} = \mathop \wedge \limits_{l = 1}^k {\left[ {{x^l}\mathop \wedge \limits^{\log } ({x^l})'} \right]_{ii}} = 1$ (27)

Hence, for each l=1,…,k, every row of $\left[ {{x^l}\mathop \wedge \limits^{\log } ({x^l})'} \right] - {V_{XX}}$ contains a zero entry. By Theorem 2, V_XX is a $\mathop \vee \limits^{\exp } $ perfect recall memory for (X,X), i.e., ${V_{XX}}\mathop \vee \limits^{\exp } X = X$ holds. ${T_{XX}}\mathop \wedge \limits^{\exp } X = X$ can be proved in the same way. □

Theorem 3 places no restriction on the type or size of X, which shows that the memories V_XX and T_XX have unlimited storage capacity and, for perfect inputs, guarantee perfect recall.

Example 2. Let ${x^1} = \left( {\begin{array}{*{20}{c}} 9 \\ 3 \\ {81} \end{array}} \right),{y^1} = \left( {\begin{array}{*{20}{c}} 9 \\ 3 \\ {81} \end{array}} \right);{x^2} = \left( {\begin{array}{*{20}{c}} 2 \\ 4 \\ 8 \end{array}} \right),{y^2} = \left( {\begin{array}{*{20}{c}} 2 \\ 4 \\ 8 \end{array}} \right);{x^3} = \left( {\begin{array}{*{20}{c}} 4 \\ 4 \\ 4 \end{array}} \right),{y^3} = \left( {\begin{array}{*{20}{c}} 4 \\ 4 \\ 4 \end{array}} \right).$

With the LEMAM method, both V_XX and T_XX achieve perfect recall of the above pattern pairs. The computation is omitted here to save space.
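The omitted computation can be verified numerically with the sketch functions from Section 2:

    import numpy as np
    X = np.array([[9, 2, 4], [3, 4, 4], [81, 8, 4]], dtype=float)  # columns x^1, x^2, x^3
    V, T = lemam_train(X, X)            # autoassociation: Y = X
    for l in range(3):
        assert np.allclose(recall_max(V, X[:, l]), X[:, l])   # V_XX recalls x^l exactly
        assert np.allclose(recall_min(T, X[:, l]), X[:, l])   # T_XX recalls x^l exactly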

Theorem 4. If ${V_{XX}}\mathop \vee \limits^{\exp } z = w$ and ${T_{XX}}\mathop \wedge \limits^{\exp } z = u,$ then ${V_{XX}}\mathop \vee \limits^{\exp } w = w$ and ${T_{XX}}\mathop \wedge \limits^{\exp } u = u.$

Proof: Assume ${V_{XX}}\mathop \vee \limits^{\exp } z = w,$ and note that v_ii=1 for i=1,…,n. Then

${\left( {{V_{XX}}\mathop \vee \limits^{\exp } w} \right)_i} = \mathop \vee \limits_{j = 1}^n w_j^{{v_{ij}}} \ge w_i^{{v_{ii}}} = {w_i} \Leftrightarrow w \le {V_{XX}}\mathop \vee \limits^{\exp } w$ (28)

For all i,j,θ∈{1,…,n}, note that:

${v_{i\theta }} = \mathop \wedge \limits_{l = 1}^k {\log _{x_\theta ^l}}x_i^l \le {\log _{x_\theta ^\gamma }}x_i^\gamma {\rm{,}}\forall \gamma = 1,...,k$ (29)
${v_{\theta j}} = \mathop \wedge \limits_{l = 1}^k {\log _{x_j^l}}x_\theta ^l \le {\log _{x_j^\gamma }}x_\theta ^\gamma {\rm{,}}\forall \gamma = 1,...,k$ (30)

and therefore:

${v_{i\theta }} \cdot {v_{\theta j}} \le {\log _{x_\theta ^\gamma }}x_i^\gamma \cdot {\log _{x_j^\gamma }}x_\theta ^\gamma = {\log _{x_j^\gamma }}x_i^\gamma ,\forall \gamma = 1,...,k$ (31)

Hence ${v_{i\theta }}{v_{\theta j}} \le \mathop \wedge \limits_{l = 1}^k {\log _{x_j^l}}x_i^l = {v_{ij}}.$ Thus, for i=1,…,n, we have:

$\begin{array}{l} {w_i} = \mathop \vee \limits_{j = 1}^n (z_j^{{v_{ij}}}) \ge \mathop \vee \limits_{j = 1}^n (z_j^{{v_{i\theta }} \cdot {v_{\theta j}}}),\forall \theta = 1,...,n\\ {\rm{ }} \ge \mathop \vee \limits_{l = 1}^n \mathop \vee \limits_{j = 1}^n (z_j^{{v_{il}} \cdot {v_{lj}}}) = \mathop \vee \limits_{l = 1}^n {\left( {\mathop \vee \limits_{j = 1}^n z_j^{{v_{lj}}}} \right)^{{v_{il}}}} = \mathop \vee \limits_{l = 1}^n w_l^{{v_{il}}} = {\left( {{V_{XX}}\mathop \vee \limits^{\exp } w} \right)_i} \end{array}$ (32)

This shows that:

$w \ge {V_{XX}}\mathop \vee \limits^{\exp } w$ (33)

Inequalities (28) and (33) together give ${V_{XX}}\mathop \vee \limits^{\exp } w = w.$ □

Theorem 4 shows that V_XX and T_XX recall in one step: the recalled output is already a fixed point. By contrast, the Hopfield associative memory network has very limited storage capacity, requires the memorized patterns to be orthogonal, and completes recall in multiple steps. ALEMAM thus has clear advantages over classical associative memories.
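Theorem 4 is also easy to observe numerically (sketch functions as before; z is an arbitrary probe vector with entries > 1): the first recall already lands on a fixed point, so no further iteration can change the output.

    import numpy as np
    X = np.array([[9, 2, 4], [3, 4, 4], [81, 8, 4]], dtype=float)
    V, T = lemam_train(X, X)
    z = np.array([7.0, 3.5, 20.0])             # arbitrary input, not a stored pattern
    w = recall_max(V, z)
    assert np.allclose(recall_max(V, w), w)    # V_XX applied to w returns w
    u = recall_min(T, z)
    assert np.allclose(recall_min(T, u), u)    # T_XX applied to u returns u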

4 Noise Performance of LEMAM

4.1 Noise Theorem

Theorem 5 (noise theorem). Let X=(x^1,…,x^k) and Y=(y^1,…,y^k), and let ${\tilde x^l}$ denote a noisy version of the input pattern x^l. Then ${V_{XY}}\mathop \vee \limits^{\exp } {\tilde x^l} = {y^l}$ if and only if:

$\tilde x_j^l \le x_j^l \vee \mathop \wedge \limits_{i = 1}^m \left( {\mathop \vee \limits_{\xi \ne l} {{(x_j^\xi )}^{{{\log }_{y_i^\xi }}y_i^l}}} \right){\rm{,}}\forall j = 1,...,n$ (34)

and, for each row index i∈{1,…,m}, there exists a column index j_i∈{1,…,n} such that:

$\tilde x_{{j_i}}^l = x_{{j_i}}^l \vee \left( {\mathop \vee \limits_{\xi \ne l} {{(x_{{j_i}}^\xi )}^{{{\log }_{y_i^\xi }}y_i^l}}} \right)$ (35)

Similarly, ${T_{XY}}\mathop \wedge \limits^{\exp } {\tilde x^l} = {y^l}$ if and only if:

$\tilde x_j^l \ge x_j^l \wedge \mathop \vee \limits_{i = 1}^m \left( {\mathop \wedge \limits_{\xi \ne l} {{(x_j^\xi )}^{{{\log }_{y_i^\xi }}y_i^l}}} \right){\rm{,}}\forall j = 1,...,n$ (36)

and, for each row index i∈{1,…,m}, there exists a column index j_i∈{1,…,n} such that:

$\tilde x_{{j_i}}^l = x_{{j_i}}^l \wedge \left( {\mathop \wedge \limits_{\xi \ne l} {{(x_{{j_i}}^\xi )}^{{{\log }_{y_i^\xi }}y_i^l}}} \right)$ (37)

Proof: Before proving Theorem 5, we first recall a useful identity for exponentials and logarithms:

${a^{{{\log }_b}c}} = {c^{{{\log }_b}a}}$ (38)

To see this, let ${a^{{{\log }_b}c}} = A \Leftrightarrow {\log _b}c = {\log _a}A = \frac{{{{\log }_c}A}}{{{{\log }_c}a}} \Leftrightarrow {\log _c}A = {\log _c}a \cdot {\log _b}c = {\log _b}a$, i.e., $A = {c^{{{\log }_b}a}}.$
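For instance, with a=2, b=4, c=16: $2^{\log_4 16} = 2^2 = 4$ and $16^{\log_4 2} = 16^{1/2} = 4$, as Eq. (38) asserts.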

(1) Let ${\tilde x^l}$ denote a noisy version of the pattern x^l, and suppose that ${T_{XY}}\mathop \wedge \limits^{\exp } {\tilde x^l} = {y^l}$ for l=1,…,k. Then

$y_i^l = {\left( {{T_{XY}}\mathop \wedge \limits^{\exp } {{\tilde x}^l}} \right)_i} = \mathop \wedge \limits_{r = 1}^n {(\tilde x_r^l)^{{t_{ir}}}} \le {(\tilde x_j^l)^{{t_{ij}}}},\forall i = 1,...,m$ and $\forall j = 1,...,n$ (39)

That is:

$\begin{array}{l} {\log _{\tilde x_j^l}}y_i^l \le {\log _{\tilde x_j^l}}{(\tilde x_j^l)^{{t_{ij}}}} = {t_{ij}},\forall i = 1,...,m,\forall j = 1,...,n \Leftrightarrow \\ \frac{{{t_{ij}}}}{{{{\log }_{\tilde x_j^l}}y_i^l}} \ge 1,\forall i = 1,...,m,\forall j = 1,...,n \Leftrightarrow \\ {t_{ij}} \times {\log _{y_i^l}}\tilde x_j^l \ge 1 \Leftrightarrow {(\tilde x_j^l)^{{t_{ij}}}} \ge (y_i^l) \Leftrightarrow \tilde x_j^l \ge {(y_i^l)^{1/{t_{ij}}}}{\rm{,}}\forall i = 1,...,m,\forall j = 1,...,n \Leftrightarrow \\ \tilde x_j^l \ge \mathop \vee \limits_{i = 1}^m {(y_i^l)^{1/{t_{ij}}}} = \mathop \vee \limits_{i = 1}^m {(y_i^l)^{\frac{1}{{\mathop \vee \limits_{\xi = 1}^k {{\log }_{x_j^\xi }}y_i^\xi }}}} = \mathop \vee \limits_{i = 1}^m {(y_i^l)^{\mathop \wedge \limits_{\xi = 1}^k {{\log }_{y_i^\xi }}x_j^\xi }}{\rm{,}}\forall j = 1,...,n \Leftrightarrow \\ \tilde x_j^l \ge \mathop \vee \limits_{i = 1}^m {(y_i^l)^{\left( {\mathop \wedge \limits_{\xi \ne l} {{\log }_{y_i^\xi }}x_j^\xi } \right) \wedge {{\log }_{y_i^l}}x_j^l}}{\rm{,}}\forall j = 1,...,n \Leftrightarrow \\ \tilde x_j^l \ge \mathop \vee \limits_{i = 1}^m \left[ {{{(y_i^l)}^{\mathop \wedge \limits_{\xi \ne l} {{\log }_{y_i^\xi }}x_j^\xi }} \wedge x_j^l} \right] = x_j^l \wedge \mathop \vee \limits_{i = 1}^m \left[ {\mathop \wedge \limits_{\xi \ne l} {{(y_i^l)}^{{{\log }_{y_i^\xi }}x_j^\xi }}} \right]{\rm{,}}\forall j = 1,...,n. \end{array}$

Applying Eq. (38), we obtain:

$\begin{array}{l} \tilde x_j^l \ge x_j^l \wedge \mathop \vee \limits_{i = 1}^m \left[ {\mathop \wedge \limits_{\xi \ne l} {{(x_j^\xi )}^{{{\log }_{y_i^\xi }}y_i^l}}} \right]{\rm{,}}\forall j = 1,...,n \Leftrightarrow \\ \tilde x_j^l \ge x_j^l \wedge \left[ {\mathop \wedge \limits_{\xi \ne l} {{(x_j^\xi )}^{{{\log }_{y_i^\xi }}y_i^l}}} \right],\forall j = 1,...,n,\forall i = 1,...,m \end{array}$ (40)

This shows that inequality (36) is satisfied.

Now suppose that for some row the set of inequalities (36) contains no equality, i.e., assume there exists a row index i∈{1,…,m} such that:

$\tilde x_j^l > x_j^l \wedge \left[ {\mathop \wedge \limits_{\xi \ne l} {{(x_j^\xi )}^{{{\log }_{y_i^\xi }}y_i^l}}} \right] = x_j^l \wedge \left[ {\mathop \wedge \limits_{\xi \ne l} {{(y_i^l)}^{{{\log }_{y_i^\xi }}x_j^\xi }}} \right],\forall j = 1,...,n$ (41)

Then

$\begin{array}{l} {\left( {{T_{XY}}\mathop \wedge \limits^{\exp } {{\tilde x}^l}} \right)_i} = \mathop \wedge \limits_{j = 1}^n {(\tilde x_j^l)^{{t_{ij}}}} > \mathop \wedge \limits_{j = 1}^n {\left[ {x_j^l \wedge \left( {\mathop \wedge \limits_{\xi \ne l} {{(y_i^l)}^{{{\log }_{y_i^\xi }}x_j^\xi }}} \right)} \right]^{{t_{ij}}}} = \mathop \wedge \limits_{j = 1}^n {\left[ {\mathop \wedge \limits_{\xi = 1}^k {{(y_i^l)}^{{{\log }_{y_i^\xi }}x_j^\xi }}} \right]^{{t_{ij}}}}\\ {\rm{ }} = \mathop \wedge \limits_{j = 1}^n {\left[ {{{(y_i^l)}^{\mathop \wedge \limits_{\xi = 1}^k {{\log }_{y_i^\xi }}x_j^\xi }}} \right]^{{t_{ij}}}} = \mathop \wedge \limits_{j = 1}^n {\left[ {{{(y_i^l)}^{\frac{1}{{\mathop \vee \limits_{\xi = 1}^k {{\log }_{x_j^\xi }}y_i^\xi }}}}} \right]^{{t_{ij}}}} = \mathop \wedge \limits_{j = 1}^n {[{(y_i^l)^{1/{t_{ij}}}}]^{{t_{ij}}}} = y_i^l \end{array}$ (42)

Therefore ${T_{XY}}\mathop \wedge \limits^{\exp } {\tilde x^l} > {y^l},$ which contradicts ${T_{XY}}\mathop \wedge \limits^{\exp } {\tilde x^l} = {y^l}.$ Hence, for each row index i there must exist a column index j_i satisfying Eq. (37).

(2) Suppose $\tilde x_j^l \ge x_j^l \wedge \mathop \vee \limits_{i = 1}^m \left[ {\mathop \wedge \limits_{\xi \ne l} {{(x_j^\xi )}^{{{\log }_{y_i^\xi }}y_i^l}}} \right],\forall j = 1,...,n$. By part (1) of the proof, this inequality holds if and only if:

$\tilde x_j^l \ge {(y_i^l)^{1/{t_{ij}}}},\forall i = 1,...,m,\forall j = 1,...,n.$

or, equivalently, if and only if:

$\begin{array}{l} {(\tilde x_j^l)^{{t_{ij}}}} \ge y_i^l,\forall i = 1,...,m,\forall j = 1,...,n \Leftrightarrow \\ \mathop \wedge \limits_{j = 1}^n {(\tilde x_j^l)^{{t_{ij}}}} \ge y_i^l,\forall i = 1,...,m \Leftrightarrow \\ {\left( {{T_{XY}}\mathop \wedge \limits^{\exp } {{\tilde x}^l}} \right)_i} \ge y_i^l,\forall i = 1,...,m \end{array}$ (43)

which means ${T_{XY}}\mathop \wedge \limits^{\exp } {\tilde x^l} \ge {y^l},$ ∀l=1,…,k.

If we can also show that ${T_{XY}}\mathop \wedge \limits^{\exp } {\tilde x^l} \le {y^l},$ ∀l=1,…,k, then ${T_{XY}}\mathop \wedge \limits^{\exp } {\tilde x^l} = {y^l},$ ∀l=1,…,k, follows.

Choose any l∈{1,…,k} and i∈{1,…,m}. Then

$\begin{array}{l} {\left( {{T_{XY}}\mathop \wedge \limits^{\exp } {{\tilde x}^l}} \right)_i} = \mathop \wedge \limits_{j = 1}^n {(\tilde x_j^l)^{{t_{ij}}}} \le {(\tilde x_{{j_i}}^l)^{{t_{i{j_i}}}}} = {\left[ {x_{{j_i}}^l \wedge \left( {\mathop \wedge \limits_{\xi \ne l} {{(x_{{j_i}}^\xi )}^{{{\log }_{y_i^\xi }}y_i^l}}} \right)} \right]^{{t_{i{j_i}}}}} = {\left( {x_{{j_i}}^l \wedge \left[ {\mathop \wedge \limits_{\xi \ne l} {{(y_i^l)}^{{{\log }_{y_i^\xi }}x_{{j_i}}^\xi }}} \right]} \right)^{{t_{i{j_i}}}}}\\ {\rm{ }} = {\left[ {\mathop \wedge \limits_{\xi = 1}^k {{(y_i^l)}^{{{\log }_{y_i^\xi }}x_{{j_i}}^\xi }}} \right]^{{t_{i{j_i}}}}} = {\left[ {{{(y_i^l)}^{\mathop \wedge \limits_{\xi = 1}^k {{\log }_{y_i^\xi }}x_{{j_i}}^\xi }}} \right]^{{t_{i{j_i}}}}} = {\left[ {{{(y_i^l)}^{\frac{1}{{\mathop \vee \limits_{\xi = 1}^k {{\log }_{x_{{j_i}}^\xi }}y_i^\xi }}}}} \right]^{{t_{i{j_i}}}}} = {\left( {{{(y_i^l)}^{\frac{1}{{{t_{i{j_i}}}}}}}} \right)^{{t_{i{j_i}}}}} = y_i^l \end{array}$ (44)

This shows that ${T_{XY}}\mathop \wedge \limits^{\exp } {\tilde x^l} \le {y^l},$ ∀l=1,…,k. □

Theorem 5 shows that V_XY and T_XY have a certain ability to resist erosive and dilative noise, and it gives a quantitative boundary condition on input patterns that, although corrupted to some degree, can still be recalled perfectly.
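The bound in inequality (34) can be evaluated directly. The following sketch (Python/NumPy; the function name is ours, and k ≥ 2 is assumed so that the maximum over ξ≠l is non-empty) computes, for a given l, the componentwise upper bound on a noisy input that V_XY can still map to y^l; the dual lower bound of inequality (36) for T_XY is obtained by swapping min and max:

    import numpy as np

    def v_noise_upper_bound(X, Y, l):
        # Eq. (34): noisy_x_j <= x_j^l  v  min_i max_{xi != l} (x_j^xi) ** log_{y_i^xi}(y_i^l)
        n, k = X.shape
        m = Y.shape[0]
        others = [xi for xi in range(k) if xi != l]
        bound = np.empty(n)
        for j in range(n):
            inner = np.empty(m)
            for i in range(m):
                inner[i] = max(X[j, xi] ** (np.log(Y[i, l]) / np.log(Y[i, xi]))
                               for xi in others)
            bound[j] = max(X[j, l], inner.min())
        return bound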

4.2 Noise Performance of ALEMAM

In our experiments on a number of examples, the noise robustness of ALEMAM was clearly better than that of auto RMAM (ARMAM) and no worse than that of auto FMAM (AFMAM).

Example 3. Consider the following input pattern matrix X and its two noisy pattern matrices X′ and X′′:

$X = \left( {\begin{array}{*{20}{c}} {45}&{55}&{51}&{63} \\ {23}&{23}&{37}&{33} \\ {13}&{40}&{15}&{60} \\ 3&{13}&4&{25} \end{array}} \right),X' = \left( {\begin{array}{*{20}{c}} {45}&{55}&5&6 \\ {23}&2&{37}&3 \\ 1&{40}&{15}&6 \\ 3&{13}&4&{25} \end{array}} \right),X'' = \left( {\begin{array}{*{20}{c}} {1000}&{55}&{51}&{63} \\ {23}&{23}&{370}&{33} \\ {130}&{40}&{15}&{60} \\ 3&{13}&4&{250} \end{array}} \right).$

The experimental results show that ALEMAM's V_XX perfectly recalls the original patterns from both X and X′, and its T_XX perfectly recalls them from both X and X′′. The AFMAM results are the same as ALEMAM's: its A_XX perfectly recalls X and X′, and its B_XX perfectly recalls X and X′′. However, ARMAM's W_XX and M_XX achieve perfect recall only for X itself, which shows that ALEMAM has better noise resistance.
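Example 3 can be replayed with the sketch functions from Section 2; the script below checks, pattern by pattern, whether V_XX restores X from the eroded X′ and whether T_XX restores X from the dilated X′′ (the reported outcome is perfect recall in both cases):

    import numpy as np
    X  = np.array([[45, 55, 51, 63], [23, 23, 37, 33],
                   [13, 40, 15, 60], [3, 13, 4, 25]], dtype=float)
    X1 = np.array([[45, 55, 5, 6], [23, 2, 37, 3],
                   [1, 40, 15, 6], [3, 13, 4, 25]], dtype=float)      # eroded X'
    X2 = np.array([[1000, 55, 51, 63], [23, 23, 370, 33],
                   [130, 40, 15, 60], [3, 13, 4, 250]], dtype=float)  # dilated X''
    V, T = lemam_train(X, X)
    for l in range(4):
        print(np.allclose(recall_max(V, X1[:, l]), X[:, l]),   # V_XX vs. eroded input
              np.allclose(recall_min(T, X2[:, l]), X[:, l]))   # T_XX vs. dilated input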

4.3 Noise Performance of HLEMAM

Heteroassociative memory is the more general form of associative memory; it has wider applications than autoassociative memory, which is merely its special case. However, the heteroassociative performance of the existing hetero RMAM (HRMAM) and hetero FMAM (HFMAM) is far from ideal. Is the heteroassociative performance and robustness of HLEMAM any better? We have run many experiments; one of them follows.

Example 4. Let $X = \left( {\begin{array}{*{20}{c}} 2&4&8&{16} \\ 4&8&{16}&{32} \\ {16}&{32}&{64}&{128} \\ {32}&{64}&{128}&{256} \end{array}} \right),Y = (2{\text{ }}4{\text{ }}8{\text{ }}16).$

On the MATLAB 6.5 platform, with the HLEMAM method, both V_XY and T_XY achieve perfect recall for the above pattern matrix pair (X,Y); the HRMAM and HFMAM methods also achieve perfect recall for (X,Y). For noisy versions of X, however, their recall performance differs; the experimental results are given in Table 1.

Table 1 Noise performance of HLEMAM, HRMAM and HFMAM

In Table 1, Y denotes perfect recall and N denotes imperfect recall; x11, x22, x34, x43 and x44 are five elements of X used as the perturbed variables for examining noise performance. The results show that HLEMAM resists mixed noise markedly better than HRMAM and HFMAM.
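The perfect-recall part of Example 4 can be replayed with the sketch functions from Section 2; the noise trials of Table 1 then amount to perturbing the selected elements x11, x22, x34, x43, x44 and re-running the recall:

    import numpy as np
    X = np.array([[2, 4, 8, 16], [4, 8, 16, 32],
                  [16, 32, 64, 128], [32, 64, 128, 256]], dtype=float)
    Y = np.array([[2, 4, 8, 16]], dtype=float)   # 1 x 4 output pattern matrix
    V, T = lemam_train(X, Y)
    for l in range(4):
        print(recall_max(V, X[:, l]), recall_min(T, X[:, l]))  # both print y^l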

5 Application Potential of LEMAM

To gauge the applicability of HLEMAM, we conducted experiments on the Iris and Wine data sets from the widely used UCI machine learning repository and compared the results with HRMAM and HFMAM. The experiments were run on the MATLAB platform.

Example 5. The Iris experiment.

The Iris data set, published by Fisher in 1936, is widely used for discriminant classification and cluster analysis. Considering the computational characteristics of HLEMAM and the properties of logarithms, we adjusted individual elements of a few samples, mainly to avoid attribute values equal to 1, which eases the logarithmic computation; this adjustment does not affect the validity of the conclusions. The data set contains three species (Iris setosa, Iris versicolor and Iris virginica) with 50 samples each, 150 samples in total. Each sample carries five attributes: sepal length, sepal width, petal length, petal width, and species classification. The "Fisher's Iris Data" we obtained is arranged in 38 rows and 4 columns, with the first 37 rows full and the last row containing only two entries. In the experiment, we read the data column-wise (first column 1, then columns 2 and 3, and finally column 4) and selected the first 100 samples from the first three columns of the Iris data (29 of class 1, 37 of class 2, 34 of class 3) as the training set, and the remaining 50 samples (21 of class 1, 13 of class 2, 16 of class 3) as the test set for prediction/classification. During training, each input sample x^l (l=1,…,100) is a four-dimensional vector and each output sample y^l (l=1,…,100) is one-dimensional; the input pattern matrix X is 4×100 and the output pattern matrix Y is 1×100. For prediction/classification, the input pattern matrix X1 is 4×50 and the output pattern matrix Y1 is 1×50. The experimental results are given in Table 2.

Table 2 Experimental results using the Iris data set

In Table 2, the numbers in parentheses are the indices of correctly recalled samples. The results show that, for the Iris problem, HLEMAM is clearly superior to HRMAM in training accuracy and prediction/classification accuracy, and slightly better than or no worse than HFMAM.
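A sketch of the pipeline described above (Python/NumPy rather than the original MATLAB; iris_X, iris_y and the 100/50 split are placeholders for the column-wise arrangement in the text, and attribute values and class labels are assumed to have been shifted into R>1 as described):

    import numpy as np

    # iris_X: 4 x 150 attribute matrix; iris_y: 1 x 150 class labels (all entries > 1).
    X_train, y_train = iris_X[:, :100], iris_y[:, :100]
    X_test,  y_test  = iris_X[:, 100:], iris_y[:, 100:]

    V, T = lemam_train(X_train, y_train)          # 1 x 4 heteroassociative memory
    pred = np.array([recall_max(V, X_test[:, l])[0]
                     for l in range(X_test.shape[1])])
    accuracy = np.mean(np.round(pred) == y_test[0])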

Example 6. The Wine experiment.

The Wine data set has 178 samples in total, each with 14 attributes (one of which is the class attribute). In the experimental design, 128 samples were selected as the learning samples for HLEMAM and the remaining 50 as prediction samples; the comparison methods are HRMAM and HFMAM. Because the data contain pure decimals, the data were preprocessed before the experiment to ease the logarithmic computation: all attribute values were multiplied by 100, converting them to real numbers with an integer part. Moreover, real-valued computation generally yields fractional results, whereas class labels are integers; therefore, the final recalled outputs were rounded with the MATLAB functions fix (round toward zero) or round (round to the nearest integer), whichever gave the better experimental result. The results are listed in Table 3.

Table 3 Experimental results using the Wine data set

The results show that the best learning accuracies of HRMAM, HFMAM and HLEMAM are 0.09 (11/128), 0.59 (76/128) and 0.65 (83/128), respectively, and the best prediction accuracies are 0 (0/50), 0.62 (31/50) and 0.66 (33/50), respectively.

Clearly, HLEMAM has an advantage in classification and prediction. Although still not ideal, it provides a better reference than HRMAM and HFMAM, bringing new hope to the application of heteroassociative morphological memories.
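The preprocessing and rounding described above, as a sketch (wine_X and wine_y are placeholders for the 13 x 178 attribute matrix and 1 x 178 labels; MATLAB's fix and round correspond to NumPy's trunc and rint):

    import numpy as np

    Xs = wine_X * 100.0                     # scale pure decimals to reals with an integer part
    X_train, y_train = Xs[:, :128], wine_y[:, :128]
    X_test,  y_test  = Xs[:, 128:], wine_y[:, 128:]
    V, _ = lemam_train(X_train, y_train)
    raw = np.array([recall_max(V, X_test[:, l])[0]
                    for l in range(X_test.shape[1])])
    acc_fix   = np.mean(np.trunc(raw) == y_test[0])  # fix: round toward zero
    acc_round = np.mean(np.rint(raw) == y_test[0])   # round: to the nearest integer
    best = max(acc_fix, acc_round)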

6 Conclusions

LEMAM is a novel morphological associative memory method. Theoretical analysis proves that ALEMAM has unlimited storage capacity and one-step recall, has no convergence problem, has a certain ability to resist erosive or dilative noise, and guarantees perfect recall for perfect inputs or inputs within a certain noise range. In our experiments, the noise resistance of ALEMAM is better than that of ARMAM and no worse than that of AFMAM. Like HRMAM and HFMAM, HLEMAM cannot guarantee perfect recall even for noise-free input patterns, but under certain conditions (satisfying Theorem 2 or Theorem 5 of this paper) HLEMAM achieves perfect recall. The Iris and Wine classification experiments show that HLEMAM outperforms HRMAM and HFMAM in learning error and classification/prediction accuracy. LEMAM is therefore a morphological associative memory method worth studying. The contribution of this paper is the LEMAM method itself, which enriches the theory and practice of MAM, further fills out the MAM framework theory, and lays a foundation and accumulates material for the further study and application of LEMAM.

Acknowledgements The research results on MAM by colleagues at home and abroad form the basis of this work; we express our gratitude to them.

References
[1] Chen SC. Advance in research on neural networks of discrete associative memories. In: Zhou ZH, Cao CG, eds. Proc. of the Neural Networks and Their Applications. Beijing: Tsinghua University Press, 2004. 295-320 (in Chinese with English abstract).
[2] Ritter GX, Sussner P, Diaz-de-Leon JL. Morphological associative memories. IEEE Trans. on Neural Networks, 1998,9(2):281-293.
[3] Chen SC, Liu WL. Complex morphological associative memories and their performance analysis. Ruan Jian Xue Bao/Journal of Software, 2002,13(3):453-459 (in Chinese with English abstract). http://www.jos.org.cn/1000-9825/13/453.htm
[4] Wang M, Wang ST, Wu XJ. Initial results on fuzzy morphological associative memories. Acta Electronica Sinica, 2003,31(5):690-693 (in Chinese with English abstract).
[5] Wu XS, Wang ST. Bidirectional fuzzy morphological associative memory and its robust analysis for random noise. Pattern Recognition and Artificial Intelligence, 2005,18(3):257-262 (in Chinese with English abstract).
[6] Wang M, Chen SC. Enhanced FMAM based on empirical kernel map. IEEE Trans. on Neural Networks, 2005,16(3):557-564.
[7] Feng NQ, Liu CH, Zhang CP, Xu JC, Wang SX. Research on the framework of morphological associative memories. Chinese Journal of Computers, 2010,33(1):157-166 (in Chinese with English abstract).
[8] Hagan MT, Demuth HB, Beale MH, wrote; Dai K, trans. Neural Network Design. Beijing: China Machine Press, 2002. 399-425 (in Chinese).
[9] Wu XS, Wang ST. Fuzzy morphological associative memory and their application in storing and recalling cell images. Journal of Image and Graphics, 2006,11(10):1450-1455 (in Chinese with English abstract).
[10] Vázquez RA, Sossa H. Morphological hetero-associative memories applied to restore true-color patterns. Lecture Notes in Computer Sciences, 2009,5553:520-529.
[11] Vázquez RA, Sossa H. Behavior of morphological associative memories with true-color image patterns. Neurocomputing, 2009,73(1-3):225-244.
[12] Feng NQ, Ao LH, Wang SX, Wang SX, Tian Y. Application of morphological associative memories to the associative recognition for images. Journal of He'nan Normal University (Natural Science), 2010,38(3):44-47 (in Chinese with English abstract).
[13] Feng NQ, Tian Y, Wang XF, Qin LJ, Qiao K. Grouping of morphological hetero-associative memories. Journal of He'nan Normal University (Natural Science), 2012,40(2):155-158 (in Chinese with English abstract).
[14] Sussner P, Esmi EL. Morphological perceptrons with competitive learning: Lattice-theoretical framework and constructive learning algorithm. Information Sciences, 2011,181(10):1929-1950.
[15] Valle ME, Sussner P. Storage and recall capabilities of fuzzy morphological associative memories with adjunction-based learning. Neural Networks, 2011,24(1):75-90.
[16] Feng NQ, Qin LJ, Wang XF, Tian Y, Zhu XJ. Morphological associative memories applied to the implicit learning. Journal of He'nan Normal University (Natural Science), 2013,41(3):156-159 (in Chinese with English abstract).
[17] Sussner P, Valle ME. Morphological and certain fuzzy morphological associative memories for classification and prediction. In: Kaburlasos VG, Ritter GX, eds. Computational Intelligence Based on Lattice Theory, Series: Studies in Computational Intelligence, Vol.67. Springer-Verlag, 2007. 149-171.
[18] Araújo RA, Sussner P. An increasing hybrid morphological-linear perceptron with pseudo-gradient-based learning and phase adjustment for financial time series prediction. In: Proc. of the 2010 IEEE World Congress on Computational Intelligence (IJCNN). Barcelona, 2010. 807-814.
[19] Feng NQ, Wang XF, Mao WT, Ao LH. Heteroassociative morphological memories based on four-dimensional storage. Neurocomputing, 2013,116:76-86.