Funding: partially funded by AFOSR MURI FA9550-18-502, ONR N00014-18-1-2527, N00014-18-20-1-2093, and N00014-20-1-2787; also supported by the NSF Graduate Research Fellowship under Grant No. DGE-1650604.
Abstract: Computing tasks may often be posed as optimization problems. The objective functions for real-world scenarios are often nonconvex and/or nondifferentiable. State-of-the-art methods for solving these problems typically only guarantee convergence to local minima. This work presents Hamilton-Jacobi-based Moreau adaptive descent (HJ-MAD), a zero-order algorithm with guaranteed convergence to global minima, assuming continuity of the objective function. The core idea is to compute gradients of the Moreau envelope of the objective (which is "piecewise convex") with adaptive smoothing parameters. Gradients of the Moreau envelope (i.e., proximal operators) are approximated via the Hopf-Lax formula for the viscous Hamilton-Jacobi equation. Our numerical examples illustrate global convergence.
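The mechanism described above can be illustrated with a minimal numerical sketch (not the authors' implementation): the Hopf-Lax formula for the viscous Hamilton-Jacobi equation yields a softmin integral, which can be estimated by Monte Carlo sampling; the resulting weighted average approximates the proximal point, and (x - prox)/t gives a zero-order surrogate for the Moreau-envelope gradient. The sampling scheme, parameter choices, and function names here are illustrative assumptions.

```python
import numpy as np

def moreau_grad(f, x, t=1.0, delta=0.1, n_samples=1000, rng=None):
    """Zero-order estimate of the Moreau-envelope gradient via the
    Hopf-Lax / Cole-Hopf softmin formula (illustrative sketch).

    Samples y ~ N(x, delta*t*I), weights them by exp(-f(y)/delta)
    (a softmin over f), and treats the weighted mean as an
    approximation of prox_{t f}(x). Then grad ≈ (x - prox)/t.
    """
    rng = np.random.default_rng() if rng is None else rng
    y = x + np.sqrt(delta * t) * rng.standard_normal((n_samples, x.size))
    fy = np.array([f(yi) for yi in y])
    w = np.exp(-(fy - fy.min()) / delta)       # stabilized softmin weights
    prox = (w[:, None] * y).sum(axis=0) / w.sum()
    return (x - prox) / t

def hj_descent(f, x0, steps=100, lr=0.5, t=1.0, delta=0.1, seed=0):
    """Gradient descent on the (viscously smoothed) Moreau envelope."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * moreau_grad(f, x, t=t, delta=delta, rng=rng)
    return x
```

In the actual algorithm the smoothing parameter t is adapted during the run, which is what allows escape from local basins; the fixed-t loop above only shows the basic envelope-descent step.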
Abstract: In this paper, we consider a more general bi-level optimization problem in which the inner objective function consists of three convex functions: one smooth and two non-smooth. The outer objective function is strongly convex but may not be smooth. Motivated by smoothing approaches, we modify the classical bi-level gradient sequential averaging method to solve this bi-level optimization problem. Under some mild conditions, we obtain the convergence rate of the generated sequence, and then, based on the analysis framework of S-FISTA, we establish the global convergence rate of the proposed algorithm.
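The sequential-averaging idea can be sketched as follows (a simplified BiG-SAM-style iteration with a single non-smooth inner term, not the paper's exact three-term smoothed method): each step blends an inner proximal-gradient step with an outer gradient step, with the outer weight alpha_k vanishing so the iterates settle on the inner solution set while the outer objective selects among its minimizers. All names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def sequential_averaging(grad_f, prox_g, grad_omega, x0,
                         steps=500, L=1.0, eta=0.1):
    """Bi-level sequential averaging sketch (BiG-SAM-style).

    Inner problem:  min f(x) + g(x)   (f smooth with L-Lipschitz grad,
                                       g handled via its prox)
    Outer problem:  min omega(x) over inner minimizers (omega strongly convex).
    """
    x = np.asarray(x0, dtype=float)
    for k in range(1, steps + 1):
        inner = prox_g(x - (1.0 / L) * grad_f(x), 1.0 / L)  # prox-gradient step
        outer = x - eta * grad_omega(x)                     # outer gradient step
        alpha = 1.0 / k                                     # vanishing averaging weight
        x = alpha * outer + (1.0 - alpha) * inner
    return x
```

For example, with inner objective 0.5*(x1 + x2 - 2)^2 (whose minimizers form a line) and outer objective 0.5*||x||^2, the iteration converges to the minimum-norm inner solution (1, 1).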