Abstract
To further improve the learning performance of the extreme learning machine (ELM), this paper introduces parallel learning into the single-layer ELM and proposes a multi-layer ELM model based on parallel learning. Experimental results show that the proposed model achieves higher classification accuracy and faster convergence than the traditional single-layer ELM, the multi-layer ELM, and conventional deep learning models trained with error back-propagation.
Source
《计算机应用研究》
CSCD
Peking University Core Journals (北大核心)
2018, No. 2, pp. 459-461 (3 pages)
Application Research of Computers
Funding
Jiangsu Province Industry-University-Research Cooperation Project (BY2015019-30)
Keywords
neural networks
sparse coding
extreme learning machine
parallel learning