Abstract
To overcome the lack of an effective online learning algorithm for echo state networks, and to address the ill-posed nature of the reservoir itself, this paper proposes an online sparse learning algorithm for echo state networks. An L1 regularization constraint is imposed on the reservoir objective function, and a truncated gradient algorithm is used to solve the resulting problem approximately online. While adjusting the reservoir output weights online, the proposed algorithm effectively controls the sparsity of those weights, thereby preserving the network's generalization performance. Theoretical analysis and simulation examples demonstrate the effectiveness of the algorithm.
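The abstract combines two standard ingredients: a stochastic-gradient update of the echo state network's output (readout) weights, and a truncated-gradient step that enforces the L1 constraint online. A minimal sketch of that combination is below; the truncation operator follows the general form of Langford et al.'s truncated gradient, while the reservoir size, learning rate, shrinkage parameters, and the toy sine-prediction task are illustrative assumptions, not the paper's actual experimental setup.

```python
import numpy as np

def truncate(w, alpha, theta):
    """Truncated-gradient operator: shrink weights whose magnitude is
    at most theta toward zero by alpha; leave larger weights unchanged."""
    w = w.copy()
    small_pos = (w >= 0) & (w <= theta)
    small_neg = (w < 0) & (w >= -theta)
    w[small_pos] = np.maximum(w[small_pos] - alpha, 0.0)
    w[small_neg] = np.minimum(w[small_neg] + alpha, 0.0)
    return w

rng = np.random.default_rng(0)
n_res, eta, g, theta = 50, 0.05, 0.01, 1.0  # illustrative hyperparameters

# Random reservoir, spectral radius scaled below 1 (echo state property)
W = rng.uniform(-1, 1, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-1, 1, n_res)
w_out = np.zeros(n_res)  # readout weights, trained online

x = np.zeros(n_res)
for t in range(500):
    u = np.sin(t / 10.0)                 # input sample
    d = np.sin((t + 1) / 10.0)           # one-step-ahead target
    x = np.tanh(W @ x + W_in * u)        # reservoir state update
    y = w_out @ x                        # readout prediction
    w_out += eta * (d - y) * x           # gradient step on squared error
    w_out = truncate(w_out, eta * g, theta)  # L1 shrinkage via truncation
```

The truncation step is what distinguishes this from plain online gradient descent: weights that stay small are driven exactly to zero, giving a sparse readout without solving a batch L1 problem.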
Source
Acta Automatica Sinica (《自动化学报》), 2011, Issue 12, pp. 1536-1540 (5 pages)
Indexed in: EI, CSCD, Peking University Core Journals (北大核心)
Funding
Supported by the National Natural Science Foundation of China (61074096)
Keywords
Recurrent neural networks
Echo state networks (ESNs)
Sparse
Online
Optimization