Journal Articles
2 articles found
1. Convergence of Gradient Algorithms for Nonconvex C^(1+α) Cost Functions
Authors: Zixuan WANG, Shanjian TANG. Chinese Annals of Mathematics, Series B (SCIE, CSCD), 2023, No. 3, pp. 445-464 (20 pages)
This paper is concerned with the convergence of stochastic gradient algorithms with momentum terms in the nonconvex setting. A class of stochastic momentum methods, including stochastic gradient descent, the heavy-ball method and Nesterov's accelerated gradient, is analyzed in a general framework under mild assumptions. Building on the convergence result for expected gradients, the authors prove almost sure convergence through a detailed discussion of the effects of momentum and of the number of upcrossings. Notably, no additional restrictions are imposed on the objective function or the stepsize. Another improvement over previous results is that the usual Lipschitz condition on the gradient is relaxed to Hölder continuity. As a byproduct, the authors apply a localization procedure to extend the results to stochastic stepsizes.
Keywords: gradient descent methods; nonconvex optimization; accelerated gradient descent; heavy-ball momentum
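The class of momentum methods described in the abstract can be sketched in a single unified update. This is a minimal illustration of the general scheme, not the paper's exact formulation or analysis; the names `momentum_step`, `lr`, `mu`, and `nesterov` are assumed here:

```python
import numpy as np

def momentum_step(x, v, grad, lr=0.1, mu=0.9, nesterov=False):
    """One step of a unified stochastic momentum scheme.

    mu = 0 recovers plain (stochastic) gradient descent,
    nesterov=False gives the heavy-ball method, and
    nesterov=True evaluates the gradient at the look-ahead
    point x + mu*v, giving Nesterov's accelerated gradient.
    """
    g = grad(x + mu * v) if nesterov else grad(x)
    v_new = mu * v - lr * g   # momentum accumulation
    return x + v_new, v_new   # position update
```

On a smooth toy objective such as f(x) = x^2, iterating this step drives the gradient to zero for all three variants; the paper's contribution is establishing such convergence (in expectation and almost surely) when the gradient is only Hölder continuous and the gradients are stochastic.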
2. ACCELERATED OPTIMIZATION WITH ORTHOGONALITY CONSTRAINTS (cited by: 1)
Author: Jonathan W. Siegel. Journal of Computational Mathematics (SCIE, CSCD), 2021, No. 2, pp. 207-226 (20 pages)
We develop a generalization of Nesterov's accelerated gradient descent method designed to handle orthogonality constraints. To demonstrate its effectiveness, we perform numerical experiments showing that the number of iterations scales with the square root of the condition number, and we compare against existing state-of-the-art quasi-Newton methods on the Stiefel manifold. Our experiments show that our method outperforms these quasi-Newton methods on some large, ill-conditioned problems.
Keywords: Riemannian optimization; Stiefel manifold; accelerated gradient descent; eigenvector problems; electronic structure calculations
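The basic ingredients of momentum-based optimization under orthogonality constraints can be illustrated with a tangent-space projection and a QR retraction on the Stiefel manifold. The sketch below is a generic momentum-plus-retraction scheme applied to an eigenvector problem, not Siegel's accelerated algorithm; all function names (`stiefel_proj`, `qr_retract`, `accelerated_stiefel`) and parameter choices are assumptions for illustration:

```python
import numpy as np

def stiefel_proj(X, G):
    """Project an ambient gradient G onto the tangent space of St(n, p) at X."""
    S = (X.T @ G + G.T @ X) / 2
    return G - X @ S

def qr_retract(Y):
    """Map an ambient point back onto the Stiefel manifold via reduced QR."""
    Q, R = np.linalg.qr(Y)
    return Q * np.sign(np.diag(R))  # fix column signs for a canonical factor

def accelerated_stiefel(A, p, lr=0.05, mu=0.8, iters=2000, seed=0):
    """Heavy-ball-style momentum with retraction, minimizing -tr(X^T A X),
    whose minimizers span the top-p eigenvectors of the symmetric matrix A."""
    rng = np.random.default_rng(seed)
    X = qr_retract(rng.standard_normal((A.shape[0], p)))
    V = np.zeros_like(X)
    for _ in range(iters):
        G = -2 * A @ X                 # Euclidean gradient of f(X) = -tr(X^T A X)
        V = mu * V - lr * stiefel_proj(X, G)
        X = qr_retract(X + V)          # retraction restores orthonormality
    return X
```

For a diagonal test matrix this recovers the dominant eigenspace: the columns of the result stay orthonormal and tr(X^T A X) approaches the sum of the top-p eigenvalues, which is the flavor of eigenvector problem the keywords point to.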