Abstract
Neural networks are widely used and studied for their strong nonlinear processing capability and stable performance, with applications in pattern recognition, signal processing, knowledge engineering, expert systems, combinatorial optimization, and robot control. The feed-forward network is the most widely used type of neural network, and among its weight-learning algorithms the error back-propagation (BP) algorithm has been the most influential. The BP algorithm, however, suffers from local minima and slow convergence. The Levenberg-Marquardt algorithm, which is based on optimization theory, omits the second-order term of the Hessian. This paper discusses an approximate computation of the Hessian matrix for the case where the error is not zero or not a linear function, i.e., where the second-order term S(W) cannot be neglected, and uses it to train the network.
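For context, the standard least-squares setting behind this discussion can be sketched as follows (a general recap, not taken from the paper's own derivation; the symbol S(W) follows the abstract, while E, e_i, J and \mu are the usual notation assumed here). With a sum-of-squares training error over residuals e_i(W), the exact Hessian splits into a Gauss-Newton part and the second-order term S(W), and the Levenberg-Marquardt update keeps only the first part plus a damping term:

E(W) = \tfrac{1}{2}\sum_i e_i(W)^2, \qquad \nabla E(W) = J(W)^{\top} e(W),

\nabla^2 E(W) = J(W)^{\top} J(W) + S(W), \qquad S(W) = \sum_i e_i(W)\,\nabla^2 e_i(W),

\Delta W = -\bigl(J^{\top} J + \mu I\bigr)^{-1} J^{\top} e \quad \text{(Levenberg-Marquardt step, with } S(W) \text{ dropped)}.

When the residuals e_i are large at the solution or the e_i are strongly nonlinear, S(W) is not negligible, which is the situation the paper addresses by approximating the full Hessian.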
Source
《山西电子技术》
2006, No. 6, pp. 78-79 (2 pages)
Shanxi Electronic Technology