Abstract
By adding neurons to the network, this paper proposes a quick augmentation algorithm for the bidirectional associative memory (BAM) neural network model. The algorithm can store any given set of training pattern pairs: it places no restriction on the number of training pairs or on the strength of the correlation among them. The number of neurons the quick augmentation algorithm adds in the X-domain is at most the original number of neurons in the Y-domain, and vice versa. The connection weights it designs take only the values 1, 0, or -1, so the network is easy to realize in hardware circuits and in optics. Computer experiments show that, compared with the dummy augmentation algorithm, the quick augmentation algorithm greatly reduces the number of additional connection weights in the network.
A new and quick augmentation algorithm for bidirectional associative memory, which augments the network with extra neurons and connection weights, is proposed in this paper. The quick augmentation algorithm is guaranteed to store any given set of training pairs. The number of neurons added in the X-domain and in the Y-domain is at most the original dimension of the Y-domain and of the X-domain, respectively. The connection weights designed by the quick augmentation algorithm take only the values 1, 0, or -1; hence the designed network is well suited to optical implementation. Computer experimental results demonstrate that the quick augmentation method requires far fewer extra connection weights than the dummy augmentation method.
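For context, the abstract describes an extension of the standard (Kosko-style) outer-product BAM. The quick augmentation algorithm itself is not specified in this abstract, so the sketch below shows only the baseline BAM that such algorithms build on, with bipolar (+1/-1) patterns; all function names are illustrative, not from the paper.

```python
import numpy as np

def sgn(v):
    """Bipolar threshold: map non-negative entries to +1, negative to -1."""
    return np.where(v >= 0, 1, -1)

def bam_weights(pairs):
    """Correlation (outer-product) weight matrix W = sum_k x_k y_k^T."""
    W = np.zeros((len(pairs[0][0]), len(pairs[0][1])), dtype=int)
    for x, y in pairs:
        W += np.outer(x, y)
    return W

def bam_recall(W, x, max_iters=20):
    """Bidirectional recall: alternate X->Y and Y->X updates until stable."""
    x = np.asarray(x)
    y = sgn(x @ W)
    for _ in range(max_iters):
        x_new = sgn(W @ y)
        y_new = sgn(x_new @ W)
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break
        x, y = x_new, y_new
    return x, y
```

When the stored X-patterns are mutually orthogonal, recall is exact; strongly correlated pairs can fail to be stored, which is exactly the limitation that augmentation methods such as the one in this paper address.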
Source
《模式识别与人工智能》
EI
CSCD
PKU Core Journal (北大核心)
1996, No. 1, pp. 37-44 (8 pages)
Pattern Recognition and Artificial Intelligence
Funding
National Climbing Program (China)
National Natural Science Foundation of China
Keywords
Hetero-associative memory model
Quick augmentation algorithm
BAM
Neural network
Bidirectional Associative Memory, Quick Augmentation Algorithm, Dummy Augmentation Algorithm, Optical Implementation