Journal Articles
2 articles found
1. Analysis and design of multivalued many-to-one associative memory driven by external inputs
Authors: Qiang Fang, Hao Zhang. Chinese Physics B, 2025, No. 8, pp. 331-341 (11 pages)
This paper proposes a novel multivalued recurrent neural network model driven by external inputs, along with two innovative learning algorithms. By incorporating a multivalued activation function, the proposed model can achieve multivalued many-to-one associative memory, and the newly developed algorithms enable effective storage of many-to-one patterns in the coefficient matrix while maintaining the indispensability of inputs in many-to-one associative memory. The proposed learning algorithm addresses a critical limitation of existing models, which fail to ensure completely erroneous outputs when facing partial input missing in many-to-one associative memory tasks. The methodology is rigorously derived through theoretical analysis, incorporating comprehensive verification of both the existence and global exponential stability of equilibrium points. Demonstrative examples are provided in the paper to show the effectiveness of the proposed theory.
Keywords: many-to-one associative memories; recurrent neural network; global exponential stability; external input
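The many-to-one storage described in the abstract can be illustrated with a classical Hebbian hetero-associative memory, where several key patterns all retrieve the same stored output. This is a hypothetical minimal sketch, not the paper's model or learning algorithm; the pattern sizes and the outer-product rule are assumptions for illustration only.

```python
import numpy as np

# Hypothetical illustration (not the paper's algorithm): a Hebbian
# hetero-associative memory storing many-to-one pairs (x_k -> y).
# Several bipolar keys x_k are all associated with one output y;
# recall is a single thresholded feedforward pass.

rng = np.random.default_rng(0)
n_in, n_out = 64, 16

xs = [rng.choice([-1.0, 1.0], size=n_in) for _ in range(3)]  # three keys
y = rng.choice([-1.0, 1.0], size=n_out)                      # one stored value

# Outer-product (Hebbian) rule: W accumulates y x^T over every stored pair.
W = sum(np.outer(y, x) for x in xs)

def recall(x):
    """One-shot recall: threshold the weighted sum of the input."""
    return np.sign(W @ x)
```

Because every pair shares the same output `y`, the recalled vector is `y` scaled by a positive factor for each stored key, so all three keys retrieve the identical pattern.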
2. Synthesization of high-capacity auto-associative memories using complex-valued neural networks (Cited by: 1)
Authors: 黄玉娇, 汪晓妍, 龙海霞, 杨旭华. Chinese Physics B (SCIE, EI, CAS, CSCD), 2016, No. 12, pp. 194-201 (8 pages)
In this paper, a novel design procedure is proposed for synthesizing high-capacity auto-associative memories based on complex-valued neural networks with real-imaginary-type activation functions and constant delays. Stability criteria dependent on external inputs of neural networks are derived. The designed networks can retrieve the stored patterns by external inputs rather than initial conditions. The derivation can memorize the desired patterns with lower-dimensional neural networks than real-valued neural networks, and eliminate spurious equilibria of complex-valued neural networks. One numerical example is provided to show the effectiveness and superiority of the presented results.
Keywords: associative memory; complex-valued neural network; real-imaginary-type activation function; external input
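The idea of retrieval driven by external inputs rather than initial conditions can be sketched with a simple contraction argument: if the recurrent map is a contraction, choosing the external input so that the desired complex pattern is the fixed point guarantees convergence to it from any initial state. This is a hypothetical toy sketch, not the paper's design procedure; the network size, update rule, and contraction factor are assumptions for illustration.

```python
import numpy as np

# Hypothetical sketch (not the paper's design procedure): pin a desired
# complex pattern xi as the fixed point of the discrete-time network
#     z <- W f(z) + u
# by choosing the external input u = xi - W f(xi). With a 1-Lipschitz
# activation and spectral norm ||W|| < 1, the map is a contraction, so
# iteration converges to xi from ANY initial state: retrieval depends on
# the input u, not on the initial condition.

rng = np.random.default_rng(1)
n = 8

def f(z):
    # real-imaginary-type activation: tanh applied to each part separately
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

xi = rng.standard_normal(n) + 1j * rng.standard_normal(n)   # desired pattern

W = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
W *= 0.5 / np.linalg.norm(W, 2)       # enforce contraction (||W|| = 0.5)
u = xi - W @ f(xi)                    # input that makes xi the fixed point

z = rng.standard_normal(n) + 1j * rng.standard_normal(n)    # arbitrary start
for _ in range(200):
    z = W @ f(z) + u
```

After 200 iterations the state has contracted to the stored pattern to within floating-point precision, regardless of the random starting point.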