Abstract: Accelerated proximal gradient methods have recently been developed for solving quasi-static incremental problems of elastoplastic analysis with several different yield criteria. It has been demonstrated through numerical experiments that these methods can outperform conventional optimization-based approaches in computational plasticity. However, in the literature these algorithms are described individually for specific yield criteria, and hence there is no guide for applying them to other yield criteria. This short paper presents a general form of algorithm design, independent of the specific form of the yield criterion, that unifies the existing proximal gradient methods. A clear interpretation is also given to each step of the presented general algorithm, so that each update rule is linked to the underlying physical laws in terms of mechanical quantities.
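For reference, below is a minimal sketch of a generic accelerated proximal gradient (FISTA-type) iteration for minimizing f(x) + g(x), the template on which such elastoplasticity-specific algorithms are built; the function names, the step size 1/L, and the LASSO example are illustrative assumptions, not the paper's actual update rules.

```python
# Generic accelerated proximal gradient (FISTA-style) iteration for
# minimizing F(x) = f(x) + g(x), with f smooth (L-Lipschitz gradient)
# and g admitting an inexpensive proximal operator. Illustrative only.
import numpy as np

def accelerated_proximal_gradient(grad_f, prox_g, x0, L, n_iter=500):
    """grad_f: gradient of the smooth part; prox_g(v, t): prox of t*g at v."""
    x_prev = x0.copy()
    y = x0.copy()
    t_prev = 1.0
    for _ in range(n_iter):
        x = prox_g(y - grad_f(y) / L, 1.0 / L)       # forward-backward step
        t = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t_prev ** 2))
        y = x + ((t_prev - 1.0) / t) * (x - x_prev)  # momentum / extrapolation
        x_prev, t_prev = x, t
    return x_prev

# Toy example: LASSO, f(x) = 0.5*||Ax - b||^2, g(x) = lam*||x||_1
A = np.random.randn(40, 100); b = np.random.randn(40); lam = 0.1
L = np.linalg.norm(A, 2) ** 2                        # Lipschitz constant of grad f
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)
x_hat = accelerated_proximal_gradient(grad_f, prox_g, np.zeros(100), L)
```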
Funding: Supported by the National Natural Science Foundation of China (Grant No. 72394365).
Abstract: In this paper, we present an SQP-type proximal gradient method (SQP-PG) for composite optimization problems with equality constraints. At each iteration, SQP-PG solves a subproblem to obtain the search direction and uses an exact penalty function as the merit function to decide whether the trial step is accepted. The global convergence of the SQP-PG method is proved, and the iteration complexity for obtaining an ε-stationary point is analyzed. We also establish a local linear convergence result for the SQP-PG method under the second-order sufficient condition. Numerical results demonstrate that, compared to state-of-the-art algorithms, SQP-PG is an effective method for equality-constrained composite optimization problems.
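To illustrate the role that an exact penalty merit function plays in accepting a trial step, here is a hedged sketch of a backtracking acceptance test based on the ℓ1 exact penalty φ_ρ(x) = f(x) + g(x) + ρ‖c(x)‖1; the subproblem that produces the direction d and the quantities pred_decrease, eta, beta are illustrative placeholders, not the SQP-PG procedure itself.

```python
import numpy as np

def accept_by_exact_penalty(x, d, f, g, c, rho, pred_decrease,
                            eta=1e-4, beta=0.5, max_back=30):
    """Backtracking acceptance test with the l1 exact penalty merit function
    phi(x) = f(x) + g(x) + rho * ||c(x)||_1 (illustrative sketch only)."""
    phi = lambda z: f(z) + g(z) + rho * np.sum(np.abs(c(z)))
    phi_x = phi(x)
    alpha = 1.0
    for _ in range(max_back):
        x_trial = x + alpha * d
        # accept if the actual decrease is a fraction of the predicted decrease
        if phi(x_trial) <= phi_x - eta * alpha * pred_decrease:
            return x_trial, alpha
        alpha *= beta
    return x, 0.0  # step rejected
```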
Funding: This work was partially supported by the National Natural Science Foundation of China (Nos. 61179033, DMS-1015346).
Abstract: We consider a class of nonsmooth convex optimization problems in which the objective function is the composition of a strongly convex differentiable function with a linear mapping, regularized by the sum of both the ℓ1-norm and the ℓ2-norm of the optimization variables. This class of problems arises naturally from applications in sparse group Lasso, a popular technique for variable selection. An effective approach to solving such problems is the Proximal Gradient Method (PGM). In this paper we prove a local error bound around the optimal solution set for this problem and use it to establish the linear convergence of PGM without assuming strong convexity of the overall objective function.
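For context, the proximal operator of the combined ℓ1 plus group-ℓ2 penalty used in sparse group Lasso has a simple closed form: componentwise soft-thresholding followed by blockwise shrinkage of each group. A minimal sketch follows; the group partition, the weights lam1/lam2, and the step t are illustrative assumptions.

```python
import numpy as np

def prox_sparse_group(v, groups, lam1, lam2, t):
    """Prox of t*(lam1*||x||_1 + lam2*sum_g ||x_g||_2), evaluated group by group.
    groups is a list of index arrays partitioning the coordinates (sketch only)."""
    x = np.sign(v) * np.maximum(np.abs(v) - t * lam1, 0.0)   # l1 soft-thresholding
    for g in groups:
        norm_g = np.linalg.norm(x[g])
        scale = max(0.0, 1.0 - t * lam2 / norm_g) if norm_g > 0 else 0.0
        x[g] = scale * x[g]                                   # blockwise shrinkage
    return x
```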
Abstract: In this paper, we address the issue of peak-to-average power ratio (PAPR) reduction in large-scale multiuser multiple-input multiple-output (MU-MIMO) orthogonal frequency-division multiplexing (OFDM) systems. PAPR reduction and the multiuser interference (MUI) cancellation problem are jointly formulated as an ℓ∞-norm based composite convex optimization problem, which can be solved efficiently using the iterative proximal gradient method. The proximal operator associated with the ℓ∞-norm is evaluated using a low-cost sorting algorithm. The proposed method adaptively chooses the step size to accelerate convergence. Simulation results reveal that the proximal gradient method converges swiftly while providing considerable PAPR reduction and lower out-of-band radiation.
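One standard sorting-based way to evaluate the ℓ∞-norm proximal operator, offered here as a sketch and not necessarily the exact procedure of the paper, uses the Moreau decomposition prox_{t‖·‖∞}(v) = v − P(v), where P is the Euclidean projection onto the ℓ1-ball of radius t.

```python
import numpy as np

def project_l1_ball(v, r):
    """Euclidean projection of v onto {x : ||x||_1 <= r} via sorting (O(n log n))."""
    if np.sum(np.abs(v)) <= r:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]          # magnitudes in decreasing order
    cssv = np.cumsum(u)
    j = np.arange(1, len(v) + 1)
    rho = np.nonzero(u - (cssv - r) / j > 0)[0][-1]
    theta = (cssv[rho] - r) / (rho + 1.0)  # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def prox_linf(v, t):
    """Prox of t*||.||_inf via the Moreau decomposition (illustrative sketch)."""
    return v - project_l1_ball(v, t)
```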
Funding: Supported by the National Natural Science Foundation of China (No. 61179033).
Abstract: In this paper, we propose a modified proximal gradient method for solving a class of nonsmooth convex optimization problems that arise in many contemporary statistical and signal processing applications. The proposed method adopts a new scheme to construct the descent direction based on the proximal gradient method. It is proven that the modified proximal gradient method is Q-linearly convergent without assuming strong convexity of the objective function. Finally, some numerical experiments are conducted to evaluate the proposed method.
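As background, the basic proximal-gradient descent direction and a simple backtracking step are sketched below; the paper's modified construction of the descent direction is not reproduced, and the line-search constants are illustrative assumptions.

```python
import numpy as np

def prox_grad_descent_direction(x, grad_f, prox_g, t):
    """Descent direction built from a proximal gradient step:
    d = prox_{t g}(x - t*grad f(x)) - x (generic construction, not the
    paper's modified scheme)."""
    return prox_g(x - t * grad_f(x), t) - x

def line_search_step(x, d, F, sigma=1e-4, beta=0.5, max_back=30):
    """Backtracking on the composite objective F along direction d
    with a simple sufficient-decrease test (illustrative)."""
    Fx = F(x)
    alpha = 1.0
    for _ in range(max_back):
        if F(x + alpha * d) <= Fx - sigma * alpha * np.dot(d, d):
            break
        alpha *= beta
    return x + alpha * d
```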
Funding: Supported by the National Postdoctoral Program for Innovative Talents (BX201700038), the National Natural Science Foundation of China (Nos. 11571003, 11675021), and the Beijing Natural Science Foundation (Z180002).
Abstract: In this paper, we consider 3D tomographic reconstruction of axially symmetric objects from a single radiograph formed by cone-beam X-rays. All contemporary density reconstruction methods in high-energy X-ray radiography are based on the assumption that the cone beam can be treated as fan beams located at parallel planes perpendicular to the symmetry axis, so that the density of the whole object can be recovered layer by layer. Taking the relationship between different layers into account, we carry out a cone-beam global reconstruction to resolve the ambiguity at the material interfaces of the reconstruction results. In view of the anisotropy of classical discrete total variations, a new discretization of total variation, which yields sharp edges and has better isotropy, is introduced in our reconstruction model. Furthermore, since the object density consists of continually varying parts and jumps, a high-order regularization term is introduced. The resulting hybrid regularization model is solved using the alternating proximal gradient method, which was recently applied in image processing. Density reconstruction results are presented for simulated radiographs, showing that the proposed method improves the preservation of edge locations.
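The anisotropy issue mentioned above can be seen in the classical discretizations of total variation; the following sketch contrasts the anisotropic and isotropic forms on a 2D array (it illustrates the issue only and is not the paper's new discretization).

```python
import numpy as np

def discrete_tv(u, isotropic=True):
    """Discrete total variation of a 2D array via forward differences.
    The anisotropic form sums |du/dx| + |du/dy| and favors axis-aligned edges;
    the isotropic form sums sqrt((du/dx)^2 + (du/dy)^2) and treats directions
    more evenly (illustrative comparison only)."""
    dx = np.diff(u, axis=1, append=u[:, -1:])  # forward difference, replicated edge
    dy = np.diff(u, axis=0, append=u[-1:, :])
    if isotropic:
        return np.sum(np.sqrt(dx ** 2 + dy ** 2))
    return np.sum(np.abs(dx)) + np.sum(np.abs(dy))
```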
Abstract: In this paper, we consider a more general bi-level optimization problem in which the inner objective function consists of three convex functions: one smooth and two nonsmooth. The outer objective function is a classical strongly convex function which may not be smooth. Motivated by smoothing approaches, we modify the classical bi-level gradient sequential averaging method to solve this bi-level optimization problem. Under some mild conditions, we obtain the convergence rate of the generated sequence, and then, based on the analysis framework of S-FISTA, we show the global convergence rate of the proposed algorithm.
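A hedged sketch of one sequential-averaging iteration in the spirit of the bi-level gradient sequential averaging method is given below, under the assumption that the outer objective (or its smoothed surrogate) is differentiable with gradient grad_w; the step sizes and the averaging-weight schedule are illustrative placeholders, not the paper's modified scheme.

```python
import numpy as np

def sequential_averaging_step(x, k, grad_f, prox_g, grad_w,
                              t_inner, s_outer, gamma=2.0):
    """One sequential-averaging iteration (sketch): an inner proximal gradient
    step is averaged with an outer gradient step using a vanishing weight."""
    y = prox_g(x - t_inner * grad_f(x), t_inner)  # inner proximal gradient step
    z = x - s_outer * grad_w(x)                   # outer (smoothed) gradient step
    alpha = min(1.0, gamma / (k + 1.0))           # vanishing averaging weight
    return alpha * z + (1.0 - alpha) * y
```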
Funding: This research was supported by the National Natural Science Foundation of China (No. 61179033) and the Collaborative Innovation Center on Beijing Society-Building and Social Governance.
Abstract: In this paper, we discuss a splitting method for group Lasso. Assuming that the sequence of step lengths has a positive lower bound and a positive upper bound (unrelated to the given problem data), we prove a Q-linear rate of convergence of the distance from the iterates to the solution set. Moreover, we compare this result with the convergence of the proximal gradient method analyzed very recently.
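For reference, the proximal (shrinkage) operator of the group Lasso penalty is a standard building block of splitting methods for this problem; the group partition and the parameters in the sketch below are illustrative.

```python
import numpy as np

def prox_group_lasso(v, groups, lam, t):
    """Prox of t*lam*sum_g ||x_g||_2 (group soft-thresholding): each block is
    shrunk toward zero and set exactly to zero once its norm falls below t*lam.
    `groups` is a list of index arrays partitioning the coordinates (sketch)."""
    x = v.copy()
    for g in groups:
        norm_g = np.linalg.norm(v[g])
        x[g] = 0.0 if norm_g <= t * lam else (1.0 - t * lam / norm_g) * v[g]
    return x
```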
Abstract: In this paper, we consider a block-structured convex optimization model in which the block variables are nonseparable in the objective and are further linearly coupled in the constraint. For the 2-block case, we propose a number of first-order algorithms to solve this model. First, the alternating direction method of multipliers (ADMM) is extended, assuming that it is easy to optimize the augmented Lagrangian function with one block of variables at a time while fixing the other block. We prove that an O(1/t) iteration complexity bound holds under suitable conditions, where t is the number of iterations. If the subroutines of the ADMM cannot be implemented, then we propose new alternative algorithms, called the alternating proximal gradient method of multipliers, the alternating gradient projection method of multipliers, and hybrids thereof. Under suitable conditions, the O(1/t) iteration complexity bound is shown to hold for all the newly proposed algorithms. Finally, we extend the analysis of the ADMM to the general multi-block case.
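A generic 2-block ADMM skeleton for min f(x) + g(y) subject to Ax + By = b is sketched below; the subroutines argmin_x and argmin_y stand in for the user-supplied augmented-Lagrangian minimizations over one block that the abstract assumes are easy, and the penalty parameter rho and iteration count are illustrative.

```python
import numpy as np

def admm_two_block(argmin_x, argmin_y, A, B, b, rho=1.0, n_iter=200):
    """Generic 2-block ADMM skeleton (illustrative): alternate exact
    minimization of the augmented Lagrangian over x and y, then take a
    dual ascent step on the multiplier."""
    x = np.zeros(A.shape[1]); y = np.zeros(B.shape[1]); lam = np.zeros(b.shape[0])
    for _ in range(n_iter):
        x = argmin_x(y, lam, rho)              # x-block update
        y = argmin_y(x, lam, rho)              # y-block update
        lam = lam + rho * (A @ x + B @ y - b)  # multiplier (dual) update
    return x, y, lam
```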