Journal Articles
2 articles found.
1. Improved gradient iterative algorithms for solving Lyapunov matrix equations (Cited by: 1)
Authors: 顾传青, 范伟薇. Journal of Shanghai University (English Edition) (CAS), 2008, Issue 5, pp. 395-399 (5 pages).
Abstract: In this paper, an improved gradient iterative (GI) algorithm for solving the Lyapunov matrix equations is studied. Convergence of the improved method for any initial value is proved under certain conditions. Compared with the GI algorithm, the improved algorithm reduces computational cost and storage. Finally, the improved algorithm is tested against the GI algorithm on several numerical examples.
Keywords: gradient iterative (GI) algorithm; improved gradient iteration (GI) algorithm; Lyapunov matrix equations; convergence factor
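For orientation, the following is a minimal sketch of the standard GI recursion that the abstract takes as its baseline, applied to the Lyapunov equation A X + X Aᵀ + Q = 0. It is not the paper's improved variant; the function name `gi_lyapunov`, the step-size heuristic for the convergence factor `mu`, and the stopping tolerance are assumptions made for illustration.

```python
import numpy as np

def gi_lyapunov(A, Q, mu=None, tol=1e-10, max_iter=5000):
    """Gradient iterative (GI) sketch for A X + X A^T + Q = 0.

    `mu` is the convergence factor; if not given, a conservative heuristic
    based on ||A||_2 is used (an assumption for this sketch, not the
    condition derived in the paper).
    """
    n = A.shape[0]
    X = np.zeros((n, n))
    if mu is None:
        mu = 1.0 / (4.0 * np.linalg.norm(A, 2) ** 2)
    for _ in range(max_iter):
        R = -(Q + A @ X + X @ A.T)        # residual of the Lyapunov equation
        if np.linalg.norm(R, 'fro') < tol:
            break
        X = X + mu * (A.T @ R + R @ A)    # gradient step on ||A X + X A^T + Q||_F^2
    return X

# Usage: a stable A guarantees a unique solution X.
A = np.array([[-2.0, 1.0], [0.0, -3.0]])
Q = np.eye(2)
X = gi_lyapunov(A, Q)
print(np.linalg.norm(A @ X + X @ A.T + Q))   # residual norm, should be near zero
```

The recursion is plain gradient descent on the squared Frobenius residual; the improvement claimed in the paper concerns reducing the computational cost and storage of this baseline.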
2. A Gradient Iteration Method for Functional Linear Regression in Reproducing Kernel Hilbert Spaces
Authors: Hongzhi Tong, Michael Ng. Annals of Applied Mathematics, 2022, Issue 3, pp. 280-295 (16 pages).
Abstract: We consider a gradient iteration algorithm for prediction of functional linear regression under the framework of reproducing kernel Hilbert spaces. In the algorithm, we use an early stopping technique, instead of the classical Tikhonov regularization, to prevent the iteration from overfitting. Under mild conditions, we obtain upper bounds, essentially matching the known minimax lower bounds, for the excess prediction risk. Almost sure convergence is also established for the proposed algorithm.
Keywords: gradient iteration algorithm; functional linear regression; reproducing kernel Hilbert space; early stopping; convergence rates
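The sketch below illustrates the kind of kernel gradient iteration with early stopping the abstract describes, for a discretized version of the model y_i = ∫ x_i(t) β(t) dt + noise. The Gaussian kernel, the grid discretization, the step-size heuristic, the hold-out stopping rule, and the names `rbf_kernel` and `gradient_iteration_flr` are all assumptions for illustration; the authors' exact algorithm and its minimax analysis are not reproduced here.

```python
import numpy as np

def rbf_kernel(t, s, gamma=10.0):
    """Gaussian kernel on the time grid (an assumed choice for this sketch)."""
    return np.exp(-gamma * (t[:, None] - s[None, :]) ** 2)

def gradient_iteration_flr(X, y, t, eta=None, max_iter=500, val_frac=0.3, seed=0):
    """Kernel gradient iteration for y_i = integral of x_i(t)*beta(t) dt + noise,
    with early stopping on a hold-out set instead of Tikhonov regularization.

    X: (n, m) curves sampled on the grid t; y: (n,) responses.
    """
    rng = np.random.default_rng(seed)
    n, m = X.shape
    h = t[1] - t[0]                          # grid spacing for the L2 inner product
    K = rbf_kernel(t, t)                     # kernel matrix on the grid
    idx = rng.permutation(n)
    n_val = int(val_frac * n)
    val, tr = idx[:n_val], idx[n_val:]

    if eta is None:                          # heuristic step size (assumption)
        eta = 1.0 / np.linalg.norm((h * h / len(tr)) * K @ (X[tr].T @ X[tr]), 2)

    beta = np.zeros(m)
    best_beta, best_err = beta.copy(), np.inf
    for _ in range(max_iter):
        r = h * X[tr] @ beta - y[tr]                            # training residuals
        beta = beta - eta * (h / len(tr)) * K @ (X[tr].T @ r)   # RKHS gradient step
        val_err = np.mean((h * X[val] @ beta - y[val]) ** 2)
        if val_err < best_err:               # early stopping: keep the best iterate
            best_err, best_beta = val_err, beta.copy()
    return best_beta

# Synthetic usage example with a smooth true slope function.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 50)
beta_true = np.sin(2 * np.pi * t)
X = rng.standard_normal((200, 50))
y = (t[1] - t[0]) * X @ beta_true + 0.1 * rng.standard_normal(200)
beta_hat = gradient_iteration_flr(X, y, t)
```

Here the number of iterations plays the role of the regularization parameter: stopping at the iterate with the smallest hold-out error replaces the explicit Tikhonov penalty mentioned in the abstract.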