In this paper, we develop an inexact symmetric proximal alternating direction method of multipliers (ISPADMM) with two convex combinations (ISPADMM-tcc) for solving two-block separable convex optimization problems with linear equality constraints. Specifically, the convex combination technique is incorporated into the proximal centers of both subproblems, and the two subproblems are then solved approximately under relative error criteria. Global convergence and an O(1/N) ergodic sublinear convergence rate, measured by the function value residual and the constraint violation, are established under mild conditions, where N denotes the number of iterations. Finally, numerical experiments on the ℓ1-regularized analysis sparse recovery problem and the elastic net regularized regression problem illustrate the feasibility and effectiveness of the proposed method.
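For context, the following is a minimal LaTeX sketch of the standard two-block setting and the classical symmetric ADMM iteration on which such schemes build. It is background only: the proximal terms, the convex-combination proximal centers, and the relative error criteria that define ISPADMM-tcc are not reproduced here, and the relaxation factors r, s and penalty parameter β are generic placeholders rather than the paper's notation.

% Two-block separable convex problem with a linear equality constraint
\begin{equation*}
  \min_{x \in \mathcal{X},\; y \in \mathcal{Y}} \; f(x) + g(y)
  \quad \text{s.t.} \quad Ax + By = b.
\end{equation*}
% Augmented Lagrangian with penalty parameter \beta > 0
\begin{equation*}
  \mathcal{L}_{\beta}(x, y, \lambda)
  = f(x) + g(y) - \langle \lambda,\, Ax + By - b \rangle
  + \tfrac{\beta}{2}\,\| Ax + By - b \|^{2}.
\end{equation*}
% Classical symmetric ADMM: the multiplier is updated twice per iteration,
% with relaxation factors r and s; exact (non-proximal) subproblems are shown.
\begin{align*}
  x^{k+1} &\in \arg\min_{x} \; \mathcal{L}_{\beta}(x, y^{k}, \lambda^{k}), \\
  \lambda^{k+\frac{1}{2}} &= \lambda^{k} - r\beta\,(A x^{k+1} + B y^{k} - b), \\
  y^{k+1} &\in \arg\min_{y} \; \mathcal{L}_{\beta}(x^{k+1}, y, \lambda^{k+\frac{1}{2}}), \\
  \lambda^{k+1} &= \lambda^{k+\frac{1}{2}} - s\beta\,(A x^{k+1} + B y^{k+1} - b).
\end{align*}

According to the abstract, ISPADMM-tcc modifies this template by adding proximal terms whose centers are convex combinations of earlier iterates and by solving both subproblems only approximately, with the inexactness controlled through relative error criteria.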
Funding: supported by the National Natural Science Foundation of China (12171106), the Guangxi Science and Technology Program (AD23023001), the Natural Science Foundation of Guangxi Province (2023GXNSFBA026029), the National Natural Science Foundation of China (12401403, 12361063), the Research Project of Guangxi Minzu University (2022KJQD03), the Middle-aged and Young Teachers' Basic Ability Promotion Project of Guangxi Province (2023KY0168), and the Xiangsihu Young Scholars Innovative Research Team of Guangxi Minzu University (2022GXUNXSHQN04).