Journal Articles
5 articles found
1. New Estimates for the Rate of Convergence of the Method of Subspace Corrections (Cited: 1)
Authors: Durkbin Cho, Jinchao Xu, Ludmil Zikatanov. Numerical Mathematics (Theory, Methods and Applications), SCIE, 2008, Issue 1, pp. 44-56 (13 pages).
We discuss estimates for the rate of convergence of the method of successive subspace corrections in terms of the condition number estimate for the method of parallel subspace corrections. We provide upper bounds and, in a special case, a lower bound for preconditioners defined via the method of successive subspace corrections.
Keywords: method of subspace corrections; preconditioning; convergence rate of linear iterative method
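A minimal sketch of the successive (multiplicative) subspace correction iteration this abstract analyzes, on a hypothetical toy problem of my own construction (the SPD matrix, right-hand side, and two overlapping coordinate subspaces are illustrative assumptions, not the paper's setting):

```python
import numpy as np

# Hypothetical toy setup: a small SPD system split into two
# overlapping coordinate subspaces.
rng = np.random.default_rng(0)
n = 6
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # symmetric positive definite
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)

blocks = [np.arange(0, 4), np.arange(2, 6)]  # overlapping subspaces

x = np.zeros(n)
for _ in range(100):
    # Successive subspace correction: each subspace solves its restricted
    # problem on the current residual, updating x in sequence.
    for idx in blocks:
        r = b - A @ x
        x[idx] += np.linalg.solve(A[np.ix_(idx, idx)], r[idx])

err = np.linalg.norm(x - x_star)
```

Solving the blocks in sequence (rather than in parallel on the same residual) is what distinguishes the successive method from the parallel one whose condition number the estimates are expressed in.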
2. A Coordinate Gradient Descent Method for Nonsmooth Nonseparable Minimization (Cited: 9)
Authors: Zheng-Jian Bai, Michael K. Ng, Liqun Qi. Numerical Mathematics (Theory, Methods and Applications), SCIE, 2009, Issue 4, pp. 377-402 (26 pages).
This paper presents a coordinate gradient descent approach for minimizing the sum of a smooth function and a nonseparable convex function. We find a search direction by solving a subproblem obtained by a second-order approximation of the smooth function and adding a separable convex function. Under a local Lipschitzian error bound assumption, we show that the algorithm possesses global and local linear convergence properties. We also give some numerical tests (including image recovery examples) to illustrate the efficiency of the proposed method.
Keywords: coordinate descent; global convergence; linear convergence rate
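A hedged illustration of the coordinate-descent idea on an l1-regularized least-squares problem. This is a separable special case (exact 1-D proximal minimization per coordinate), simpler than the nonseparable setting the paper handles; the data, `lam`, and sweep count are illustrative assumptions:

```python
import numpy as np

# Hypothetical problem: min 0.5*||A x - b||^2 + lam*||x||_1.
rng = np.random.default_rng(1)
m, n = 20, 10
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
lam = 0.1

def soft(z, t):
    # Soft-thresholding: proximal operator of t * |.|
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

x = np.zeros(n)
col_sq = (A ** 2).sum(axis=0)
obj0 = 0.5 * np.linalg.norm(b) ** 2          # objective at x = 0
for _ in range(200):
    for j in range(n):
        # Residual with coordinate j's contribution removed,
        # then an exact 1-D proximal minimization in coordinate j.
        r = b - A @ x + A[:, j] * x[j]
        x[j] = soft(A[:, j] @ r, lam) / col_sq[j]

obj = 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()
```

The paper's method replaces the exact 1-D solve with a direction from a second-order model of the smooth part plus a separable convex surrogate, which is what makes the nonseparable case tractable.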
3. A Variable Metric Extrapolation Proximal Iterative Hard Thresholding Method
Authors: Xue Zhang, Xiao-Qun Zhang. Journal of the Operations Research Society of China, 2025, Issue 1, pp. 161-183 (23 pages).
In this paper, we propose a variable metric extrapolation proximal iterative hard thresholding (VMEPIHT) method for the nonconvex ℓ₀-norm sparsity regularization problem, which has wide applications in signal and image processing, machine learning, and so on. The VMEPIHT method is based on the forward-backward splitting (FBS) method, and a variable metric strategy is employed in the extrapolation step to speed up the algorithm. The proposed method's convergence, linear convergence rate, and superlinear convergence rate are shown under appropriate assumptions. Finally, we conduct numerical experiments on compressed sensing and CT image reconstruction problems to confirm the efficiency of the proposed method compared with other state-of-the-art methods.
Keywords: variable metric; iterative hard thresholding; linear convergence rate; superlinear convergence rate
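A sketch of the baseline that VMEPIHT builds on: iterative hard thresholding with a simple extrapolation step (no variable metric). The compressed-sensing instance, the Nesterov-style extrapolation weight, and the iteration count are all illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

# Hypothetical compressed-sensing instance: recover a k-sparse x from b = A x.
rng = np.random.default_rng(2)
m, n, k = 40, 80, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)
b = A @ x_true

def hard_threshold(z, k):
    # Keep the k largest-magnitude entries, zero out the rest.
    out = np.zeros_like(z)
    idx = np.argsort(np.abs(z))[-k:]
    out[idx] = z[idx]
    return out

L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
x_prev = x = np.zeros(n)
for t in range(300):
    beta = (t - 1.0) / (t + 2.0) if t > 0 else 0.0   # extrapolation weight
    y = x + beta * (x - x_prev)                       # extrapolated point
    x_prev = x
    # Forward (gradient) step at y, backward (hard-thresholding) step.
    x = hard_threshold(y - A.T @ (A @ y - b) / L, k)

residual = np.linalg.norm(A @ x - b)
```

VMEPIHT's contribution, per the abstract, is to replace the fixed metric implicit in the 1/L step with a variable metric inside the extrapolation step, yielding linear and superlinear rates under suitable assumptions.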
4. Distributed Optimization and Scaling Design for Solving Sylvester Equations
Authors: CHENG Songsong, YU Xin, ZENG Xianlin, LIANG Shu, HONG Yiguang. Journal of Systems Science & Complexity, SCIE EI CSCD, 2024, Issue 6, pp. 2487-2510 (24 pages).
This paper develops distributed algorithms for solving Sylvester equations. The authors transform solving Sylvester equations into a distributed optimization problem, unifying all eight standard distributed matrix structures. Then the authors propose a distributed algorithm to find the least squares solution and achieve an explicit linear convergence rate. These results are obtained by carefully choosing the step-size of the algorithm, which requires particular information of the data and Laplacian matrices. To avoid these centralized quantities, the authors further develop a distributed scaling technique using local information only. As a result, the proposed distributed algorithm, together with the distributed scaling design, yields a universal method for solving Sylvester equations over a multi-agent network with the constant step-size freely chosen from configurable intervals. Finally, the authors provide three examples to illustrate the effectiveness of the proposed algorithms.
Keywords: distributed optimization; least squares solution; linear convergence rate; step-size interval; Sylvester equation
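A centralized sketch of the underlying optimization view: gradient descent on the least-squares objective f(X) = 0.5·||AX + XB − C||²_F, with the step-size taken from the Lipschitz constant of the gradient via the Kronecker form. The paper distributes this computation over a multi-agent network and removes the centralized step-size information; the matrices and iteration count below are illustrative assumptions:

```python
import numpy as np

# Hypothetical Sylvester equation A X + X B = C with a known solution.
rng = np.random.default_rng(3)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
X_true = rng.standard_normal((n, n))
C = A @ X_true + X_true @ B          # consistent right-hand side

# vec(A X + X B) = K vec(X), so ||K||_2^2 is the gradient's
# Lipschitz constant and 1/L is a safe constant step-size.
K = np.kron(np.eye(n), A) + np.kron(B.T, np.eye(n))
alpha = 1.0 / np.linalg.norm(K, 2) ** 2

X = np.zeros((n, n))
res0 = np.linalg.norm(C)             # residual norm at X = 0
for _ in range(5000):
    R = A @ X + X @ B - C
    X -= alpha * (A.T @ R + R @ B.T)  # gradient of f at X

res = np.linalg.norm(A @ X + X @ B - C)
```

Computing ||K||₂ requires global knowledge of A and B, which is exactly the kind of centralized quantity the paper's distributed scaling technique avoids.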
5. A Quasi-Newton Method in Infinite-Dimensional Spaces and Its Application for Solving a Parabolic Inverse Problem
Author: Wen-huan Yu (Department of Mathematics, Tianjin University, Tianjin 300072, P.R. China). Journal of Computational Mathematics, SCIE CSCD, 1998, Issue 4, pp. 305-318 (14 pages).
A quasi-Newton method in infinite-dimensional spaces (QNIS) for solving operator equations is presented, and the convergence of a sequence generated by QNIS is proved. Next, we suggest a finite-dimensional implementation of QNIS and prove that the sequence defined by the finite-dimensional algorithm converges to the root of the original operator equation, provided that the latter exists and that the Fréchet derivative of the governing operator is invertible. Finally, we apply QNIS to an inverse problem for a parabolic differential equation to illustrate the efficiency of the finite-dimensional algorithm.
Keywords: quasi-Newton method; parabolic differential equation; inverse problems in partial differential equations; linear and Q-superlinear rates of convergence
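A finite-dimensional sketch of the quasi-Newton idea: Broyden's method for a small nonlinear system, i.e., the kind of root-finding iteration that QNIS generalizes to operator equations. The helper names (`fd_jacobian`, `broyden`), the test system, and the starting point are all illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def fd_jacobian(F, x, h=1e-6):
    # Forward-difference approximation of the Jacobian of F at x.
    n = x.size
    J = np.empty((n, n))
    Fx = F(x)
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        J[:, j] = (F(x + e) - Fx) / h
    return J

def broyden(F, x0, tol=1e-10, max_iter=100):
    # Broyden's "good" method: quasi-Newton root finding with
    # rank-one secant updates to the Jacobian approximation.
    x = np.asarray(x0, dtype=float)
    J = fd_jacobian(F, x)            # initial Jacobian approximation
    Fx = F(x)
    for _ in range(max_iter):
        s = np.linalg.solve(J, -Fx)  # quasi-Newton step
        x = x + s
        F_new = F(x)
        y = F_new - Fx
        J += np.outer(y - J @ s, s) / (s @ s)   # rank-one secant update
        Fx = F_new
        if np.linalg.norm(Fx) < tol:
            break
    return x

# Example system: x0^2 + x1^2 = 2 and x0 = x1, with root (1, 1).
F = lambda x: np.array([x[0] ** 2 + x[1] ** 2 - 2.0, x[0] - x[1]])
root = broyden(F, [2.0, 1.5])
```

The secant update avoids recomputing the Jacobian (the Fréchet derivative in the operator setting) at every step, which is the source of the Q-superlinear rate named in the keywords.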