Journal Articles
3 articles found
1. Two Nonmonotone Proximal Gradient Methods for Nonsmooth Optimization over the Stiefel Manifold
Authors: Jin-chao ZHANG, Juan GAO, Ya-kui HUANG, Xin-wei LIU. Acta Mathematicae Applicatae Sinica, 2026, Issue 1, pp. 105–120 (16 pages).
We propose two nonmonotone retraction-based proximal gradient methods for solving a class of nonconvex nonsmooth optimization problems over the Stiefel manifold. The proposed methods are equipped with a descent direction obtained by a proximal mapping restricted to the tangent space of the manifold, and with Barzilai-Borwein stepsizes determined by the two most recent iteration points and the corresponding descent directions. By employing, respectively, the Grippo-Lampariello-Lucidi nonmonotone line search strategy and the Dai-Fletcher nonmonotone line search strategy, the proposed methods are proved to be globally convergent. An analysis of the iteration complexity for obtaining an ε-stationary solution is provided. Numerical results on sparse principal component analysis problems demonstrate the efficiency of our methods.
Keywords: Stiefel manifold, nonconvex nonsmooth optimization, iteration complexity, nonmonotone line search, proximal gradient method
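A minimal Euclidean analogue of the scheme this abstract describes (Barzilai-Borwein stepsizes plus a Grippo-Lampariello-Lucidi nonmonotone line search inside a proximal gradient loop) can be sketched on a plain ℓ1-regularized least-squares toy problem. This is an illustration only, not the authors' Stiefel-manifold method: there is no retraction or tangent-space restriction here, and all function names and problem data are invented.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t*||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def nonmonotone_prox_grad(A, b, lam, x0, max_iter=500, M=5, delta=1e-4):
    """Sketch: BB stepsizes + GLL nonmonotone line search for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1 (Euclidean, unconstrained)."""
    f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
    grad = lambda x: A.T @ (A @ x - b)
    F = lambda x: f(x) + lam * np.sum(np.abs(x))
    x, alpha = x0.copy(), 1.0
    hist = [F(x)]                       # objective history for the GLL test
    for _ in range(max_iter):
        g = grad(x)
        F_ref = max(hist[-M:])          # GLL: compare against max of last M values
        while True:
            x_new = soft_threshold(x - alpha * g, alpha * lam)
            d = x_new - x
            if F(x_new) <= F_ref + delta * (g @ d) or np.linalg.norm(d) < 1e-12:
                break
            alpha *= 0.5                # backtrack until nonmonotone decrease holds
        # Barzilai-Borwein stepsize from the two most recent iterates/gradients
        s, y = x_new - x, grad(x_new) - g
        sy = s @ y
        alpha = (s @ s) / sy if sy > 1e-12 else 1.0
        x = x_new
        hist.append(F(x))
    return x

# Toy sparse recovery instance (illustrative data)
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10); x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
x = nonmonotone_prox_grad(A, b, lam=0.1, x0=np.zeros(10))
```

The nonmonotone reference value `F_ref` is what distinguishes this from a monotone Armijo search: a step may temporarily increase the objective as long as it stays below the maximum of the last `M` recorded values, which lets the BB stepsize be accepted far more often.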
2. A Hybrid and Inexact Algorithm for Nonconvex and Nonsmooth Optimization
Authors: WANG Yiyang, SONG Xiaoliang. Journal of Systems Science & Complexity, 2025, Issue 3, pp. 1330–1350 (21 pages).
The problem of nonconvex and nonsmooth optimization (NNO) has been extensively studied in the machine learning community, leading to the development of numerous fast and convergent numerical algorithms. Existing algorithms typically employ unified iteration schemes and require explicit solutions to subproblems to ensure convergence. However, these inflexible iteration schemes overlook task-specific details and may encounter difficulties in providing explicit solutions to subproblems. In contrast, there is evidence suggesting that practical applications can benefit from approximately solving subproblems; however, many existing works fail to establish the theoretical validity of such approximations. In this paper, the authors propose a hybrid inexact proximal alternating method (hiPAM), which addresses a general NNO problem with coupled terms while overcoming all of the aforementioned challenges. The proposed hiPAM algorithm offers a flexible yet highly efficient approach by seamlessly integrating any efficient method for approximate subproblem solving that caters to task specifics. Additionally, the authors devise a simple yet implementable stopping criterion that generates a Cauchy sequence converging to a critical point of the original NNO problem. Numerical experiments using both simulated and real data demonstrate that hiPAM is an exceedingly efficient and robust approach to NNO problems.
Keywords: Hybrid inexact proximal alternating method, inexact minimization criteria, machine learning, nonconvex and nonsmooth optimization
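The general setting this abstract works in, a proximal alternating method for a coupled objective where each subproblem is only solved approximately, can be sketched in a PALM-like form on a toy rank-1 nonnegative factorization. This is a generic illustration under assumed problem data, not the hiPAM algorithm: here the "inexact subproblem solve" is a single projected-gradient step per block.

```python
import numpy as np

def palm_rank1_nmf(M, iters=200):
    """Sketch of proximal alternating minimization for the coupled
    nonconvex problem min_{x,y >= 0} 0.5*||x y^T - M||_F^2.
    Each block subproblem is treated inexactly: one projected
    gradient step with stepsize 1/L for that block."""
    m, n = M.shape
    rng = np.random.default_rng(1)
    x, y = rng.random(m), rng.random(n)
    for _ in range(iters):
        # x-block: linearize the coupling at (x, y), step, project onto x >= 0
        Lx = max(y @ y, 1e-8)                 # block Lipschitz constant
        gx = (np.outer(x, y) - M) @ y
        x = np.maximum(x - gx / Lx, 0.0)
        # y-block: same inexact one-step scheme with the updated x
        Ly = max(x @ x, 1e-8)
        gy = (np.outer(x, y) - M).T @ x
        y = np.maximum(y - gy / Ly, 0.0)
    return x, y

# Exact rank-1 nonnegative matrix, so the factorization error can reach ~0
M = np.outer([1.0, 2.0, 3.0], [0.5, 1.0])
x, y = palm_rank1_nmf(M)
err = np.linalg.norm(np.outer(x, y) - M)
```

Even with only one gradient step per subproblem, the alternating scheme drives the factorization error of this exactly rank-1 instance to essentially zero, which is the kind of behavior the abstract's point about "benefiting from approximately solving subproblems" refers to.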
3. Extrapolated Smoothing Descent Algorithm for Constrained Nonconvex and Nonsmooth Composite Problems
Authors: Yunmei CHEN, Hongcheng LIU, Weina WANG. Chinese Annals of Mathematics, Series B (SCIE, CSCD), 2022, Issue 6, pp. 1049–1070 (22 pages).
In this paper, the authors propose a novel smoothing descent-type algorithm with extrapolation for solving a class of constrained nonsmooth and nonconvex problems, where the nonconvex term is possibly nonsmooth. The algorithm adopts the proximal gradient algorithm with extrapolation and a safeguarding policy to minimize the smoothed objective function, for better practical and theoretical performance. Moreover, the algorithm uses an easily checked rule to update the smoothing parameter, ensuring that any accumulation point of the generated sequence is an (affine-scaled) Clarke stationary point of the original nonsmooth and nonconvex problem. Experimental results indicate the effectiveness of the proposed algorithm.
Keywords: Constrained nonconvex and nonsmooth optimization, smooth approximation, proximal gradient algorithm with extrapolation, gradient descent algorithm, image reconstruction
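The two ingredients this abstract names, a smooth approximation of the nonsmooth term and extrapolated proximal-gradient steps with a shrinking smoothing parameter, can be sketched on a box-constrained toy problem. The choices below (sqrt(x² + μ²) as the smooth surrogate for |x|, FISTA-style momentum, halving μ on a fixed schedule, and all problem data) are illustrative assumptions, not the paper's algorithm or its smoothing-update rule.

```python
import numpy as np

def smoothed_extrapolated_descent(A, b, lam, lo, hi, mu=1.0, outer=8, inner=100):
    """Sketch: minimize 0.5*||Ax - b||^2 + lam * sum_i sqrt(x_i^2 + mu^2)
    over the box [lo, hi], with projected gradient steps, FISTA-style
    extrapolation, and a smoothing parameter mu that is shrunk in stages."""
    n = A.shape[1]
    x = x_prev = np.zeros(n)
    L0 = np.linalg.norm(A.T @ A, 2)          # spectral norm = Lipschitz part of f
    for _ in range(outer):
        L = L0 + lam / mu                    # Lipschitz bound of smoothed objective
        t_prev = t = 1.0
        for _ in range(inner):
            beta = (t_prev - 1.0) / t        # extrapolation weight
            y = x + beta * (x - x_prev)
            # gradient of the smoothed objective at the extrapolated point
            g = A.T @ (A @ y - b) + lam * y / np.sqrt(y ** 2 + mu ** 2)
            x_prev = x
            x = np.clip(y - g / L, lo, hi)   # projected gradient step onto the box
            t_prev, t = t, 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        mu *= 0.5                            # tighten the smooth approximation
    return x

# Toy box-constrained instance (illustrative data)
rng = np.random.default_rng(2)
A = rng.standard_normal((20, 8))
x_true = np.zeros(8); x_true[:2] = [1.0, 0.5]
b = A @ x_true
x = smoothed_extrapolated_descent(A, b, lam=0.05, lo=0.0, hi=1.0)
```

Shrinking μ in stages lets the early, heavily smoothed stages make fast progress while the later stages approach the original nonsmooth objective; the paper's contribution is a principled update rule with stationarity guarantees, which this fixed halving schedule does not provide.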