Abstract: Feedforward multilayer neural networks have strong mapping capability, which derives from the nonlinearity of the activation function. However, that same nonlinearity can create multiple local minima on the learning error surface, which slow learning and make it harder to find the optimal weights. This paper proposes a learning method that linearizes the nonlinearity of the activation function and discusses its merits and demerits theoretically.
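The abstract does not detail the paper's linearization method. As a hypothetical illustration of the underlying idea, a sigmoid activation can be replaced near the origin by its first-order Taylor expansion, which removes the nonlinearity locally (function names and the expansion point are assumptions, not the authors' construction):

```python
import numpy as np

def sigmoid(x):
    """Standard logistic sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_linearized(x):
    # First-order Taylor expansion of the sigmoid around x = 0:
    # sigma(x) ~= sigma(0) + sigma'(0) * x = 0.5 + 0.25 * x
    return 0.5 + 0.25 * x

# Near the origin, the linearization tracks the sigmoid closely,
# so the error surface induced through this region is locally quadratic.
x = np.linspace(-1.0, 1.0, 101)
max_err = float(np.max(np.abs(sigmoid(x) - sigmoid_linearized(x))))
print(max_err)
```

Because the linearized unit is affine, a network built from such units composes to a single affine map, which is exactly why a trade-off between trainability and mapping capability arises.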
Abstract: This paper presents a study on improving the performance of MLNNs (multilayer neural networks) with an activity function for multi-logic training patterns. Our model network has L hidden layers, two inputs, and three to six outputs, trained as a BP (backpropagation) neural network. We used the logic functions XOR (exclusive OR), OR, AND, NAND (not AND), NXOR (not exclusive OR), and NOR (not OR) as multi-logic teacher signals to evaluate the training performance of MLNNs with an activity function for information and data enlargement in signal processing (synaptic divergence state). We used four activity functions, one of which we modified and named the L&exp. function, as it gave the highest training ability compared with the original Sigmoid, ReLU, and Step activity functions during simulation and training of the network. Finally, we propose the L&exp. function as well suited to MLNNs; because of its training performance across multiple logic patterns, it may also be applicable to signal processing for data and information enlargement and can be adopted in deep machine learning.
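The L&exp. function itself is not defined in the abstract, so it is not reproduced here. A minimal sketch of the kind of experiment described, training a small BP network on the XOR teacher signal with sigmoid activations (layer sizes, learning rate, and iteration count are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR truth table: two inputs, one teacher output per pattern.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 4 sigmoid units, trained by plain batch
# gradient descent (i.e., backpropagation).
W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)
lr = 0.5

initial_mse = float(np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - T) ** 2))

for _ in range(5000):
    H = sigmoid(X @ W1 + b1)            # forward pass: hidden layer
    Y = sigmoid(H @ W2 + b2)            # forward pass: output layer
    dY = (Y - T) * Y * (1 - Y)          # backprop through output sigmoid
    dH = (dY @ W2.T) * H * (1 - H)      # backprop through hidden sigmoid
    W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(axis=0)
    W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(axis=0)

mse = float(np.mean((Y - T) ** 2))
print(mse)  # training error on the four XOR patterns
```

The paper's comparison would repeat such a run with each candidate activity function (Sigmoid, ReLU, Step, and the modified L&exp.) and compare the resulting training curves.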
Funding: Natural Science Foundation of Henan Education Department (No. 2007120005).
Abstract: In this paper, global robust exponential stability is considered for a class of neural networks with parametric uncertainties and time-varying delay. By using the Lyapunov functional method, together with a new technique for estimating the upper bound of the derivative of the Lyapunov functional, some less conservative exponential stability criteria are derived in terms of linear matrix inequalities (LMIs). Numerical examples are presented to show the effectiveness of the proposed method.
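The abstract does not state the paper's specific criteria. As a generic sketch of what global robust exponential stability asserts (the system form and symbols below are standard assumptions, not the paper's exact model): for a delayed neural network with uncertain parameters and solution $x(t)$ starting from initial function $\phi$ on $[-\bar{\tau}, 0]$, one requires constants $M \geq 1$ and $\varepsilon > 0$ such that

```latex
\| x(t) \| \;\le\; M \, e^{-\varepsilon t} \sup_{-\bar{\tau} \le s \le 0} \| \phi(s) \|,
\qquad t \ge 0,
```

for every admissible parametric uncertainty and every time-varying delay $\tau(t) \in [0, \bar{\tau}]$. LMI-based criteria guarantee such a bound whenever a fixed set of matrix inequalities, checkable by convex optimization, is feasible; tighter estimates of the Lyapunov functional's derivative enlarge the feasible set, which is what "less conservative" means here.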