Funding: Project supported by the National Natural Science Foundation of China (Grant No. 10271074) and the Special Funds for Major Specialities of the Shanghai Education Commission (Grant No. J50101).
Abstract: In this paper, an improved gradient iterative (GI) algorithm for solving the Lyapunov matrix equations is studied. Convergence of the improved method from any initial value is proved under certain conditions. Compared with the GI algorithm, the improved algorithm reduces computational cost and storage. Finally, the improved algorithm is tested against the GI algorithm on several numerical examples.
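To make the underlying iteration concrete, below is a minimal sketch of the plain gradient iteration for the Lyapunov equation A^T X + X A + Q = 0, obtained by steepest descent on the squared Frobenius norm of the residual. The function name, zero initial value, step-size bound, and stopping rule are illustrative assumptions; this shows the baseline GI idea, not the paper's improved variant.

```python
import numpy as np

def gi_lyapunov(A, Q, mu=None, tol=1e-10, max_iter=100000):
    """Plain gradient iteration for A^T X + X A + Q = 0 (illustrative sketch).

    Minimizes f(X) = 0.5 * ||A^T X + X A + Q||_F^2 by steepest descent;
    the gradient of f is A R + R A^T, with residual R = A^T X + X A + Q.
    """
    n = A.shape[0]
    if mu is None:
        # conservative step size: the Hessian norm is at most 4 ||A||_2^2,
        # so any mu < 1 / (2 ||A||_2^2) guarantees descent
        mu = 1.0 / (4.0 * np.linalg.norm(A, 2) ** 2)
    X = np.zeros((n, n))
    for _ in range(max_iter):
        R = A.T @ X + X @ A + Q          # current residual
        if np.linalg.norm(R, "fro") < tol:
            break
        X -= mu * (A @ R + R @ A.T)      # steepest-descent update
    return X

# toy check: stable A, symmetric positive definite Q
A = np.array([[-2.0, 1.0], [0.0, -3.0]])
Q = np.eye(2)
X = gi_lyapunov(A, Q)
print(np.linalg.norm(A.T @ X + X @ A + Q, "fro"))  # residual near 0
```

For a stable A the equation has a unique solution and the iteration converges from any initial value once mu is below the bound above; the improvements claimed in the paper concern reducing the computational cost and storage of this kind of scheme.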
Funding: Supported in part by the National Natural Science Foundation of China (Grant No. 11871438), in part by HKRGC GRF Nos. 12300218, 12300519, 17201020, 17300021, C1013-21GF, and C7004-21GF, and by the Joint NSFC-RGC grant N-HKU76921.
Abstract: We consider a gradient iteration algorithm for prediction in functional linear regression under the framework of reproducing kernel Hilbert spaces. In the algorithm, we use an early stopping technique, instead of the classical Tikhonov regularization, to prevent the iterates from overfitting. Under mild conditions, we obtain upper bounds for the excess prediction risk that essentially match the known minimax lower bounds. Almost sure convergence is also established for the proposed algorithm.
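A minimal sketch of the early-stopping idea, assuming the estimate is parameterized as a kernel expansion over the n training covariates so that each gradient step acts on the coefficient vector through the Gram matrix; the function name, step size, and fixed iteration count are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def kernel_gd_early_stop(K, y, n_iter=50, eta=None):
    """Early-stopped kernel gradient iteration (illustrative sketch).

    K : (n, n) Gram matrix of the functional covariates under a
        reproducing kernel; y : (n,) responses.  Stopping after n_iter
        steps regularizes the fit in place of a Tikhonov penalty.
    """
    n = len(y)
    if eta is None:
        eta = n / np.linalg.norm(K, 2)  # keeps eta * lam_max(K) / n = 1
    alpha = np.zeros(n)
    for _ in range(n_iter):
        # RKHS gradient step on the empirical squared loss
        alpha -= (eta / n) * (K @ alpha - y)
    return alpha

# toy example: curves discretized on a grid, linear (inner-product) kernel
rng = np.random.default_rng(0)
Xc = rng.standard_normal((30, 100))   # 30 curves sampled at 100 points
K = Xc @ Xc.T / 100.0                 # <X_i, X_j> via a Riemann sum
y = Xc.mean(axis=1) + 0.1 * rng.standard_normal(30)
alpha = kernel_gd_early_stop(K, y, n_iter=20)
```

Prediction at a new covariate X* is then the sum of alpha_i * k(X_i, X*) over the training sample; in practice the stopping time n_iter would be chosen by a data-driven rule (e.g. validation), which is what plays the regularization role here.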