Abstract
A linearized layer-by-layer optimization algorithm (LOLL) for training multilayer perceptrons (MLPs) is proposed. LOLL trains the connection weights of the MLP layer by layer in a cyclic fashion. During weight training, the nonlinear activation function of each neuron is replaced by its first-order Taylor series expansion, which linearizes the network and turns the MLP training problem into a linear one. To keep the linearization valid, LOLL modifies the network's error function by adding a penalty term proportional to the linearization error, which limits how far the parameters move in each update. Experimental results show that LOLL trains about 4 times faster than the conventional BP algorithm, and that a nonlinear speech-signal predictor built on it achieves good prediction performance.
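The two ingredients named in the abstract — replacing the sigmoid by its first-order Taylor expansion about the current operating point, and penalizing the linearization error in the cost — can be sketched as follows. This is a minimal illustration under assumed forms, not the authors' implementation; all function names and the exact penalty form are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def linearize_sigmoid(x0):
    """First-order Taylor expansion of the sigmoid about x0:
    f(x) ~= f(x0) + f'(x0) * (x - x0).
    Returns (slope, intercept) so that f(x) ~= slope * x + intercept."""
    f0 = sigmoid(x0)
    slope = f0 * (1.0 - f0)          # sigmoid'(x0) = f(x0) * (1 - f(x0))
    intercept = f0 - slope * x0
    return slope, intercept

def penalized_cost(w, x, target, w0, lam):
    """Cost for a single linearized neuron (hypothetical form):
    squared output error of the linear model, plus a penalty
    proportional to the linearization error, which discourages
    moving far from the expansion point w0."""
    z0 = np.dot(w0, x)               # pre-activation at the expansion point
    a, b = linearize_sigmoid(z0)
    z = np.dot(w, x)
    y_lin = a * z + b                # output of the linearized neuron
    lin_err = sigmoid(z) - y_lin     # deviation of the true activation
    return (y_lin - target) ** 2 + lam * lin_err ** 2
```

Because `y_lin` is linear in `w`, minimizing the first term alone is a linear least-squares problem; the penalty term is what keeps the solution inside the region where the Taylor approximation holds.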
Source
《上海交通大学学报》
EI
CAS
CSCD
北大核心
1999, No. 1, pp. 15-18 (4 pages)
Journal of Shanghai Jiaotong University
Funding
National Natural Science Foundation of China
Keywords
speech signal processing
multilayer perceptron (MLP)
training algorithm
MLP
neural networks
multilayer perceptron (MLP)
fast algorithm
speech processing
speech analysis