Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 11871059, 11801455), the Sichuan Science and Technology Program (Grant No. 2019YFG0299), and the General Cultivation Program of China West Normal University (Grant No. 20A024).
Abstract: In this paper, we propose a modified two-subgradient extragradient algorithm (MTSEGA) for solving monotone and Lipschitz continuous variational inequalities whose feasible set is a level set of a smooth convex function in Hilbert space. The advantage of MTSEGA is that, at each iteration, every projection is computed onto a half-space. Moreover, MTSEGA needs only one evaluation of the underlying mapping per iteration. Under the same assumptions as the known algorithm, we show that the sequence generated by this algorithm converges weakly to a solution of the concerned problem.
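To make the half-space idea concrete, here is a minimal sketch of a two-subgradient extragradient iteration on a toy problem. The variable names, the step-size choice, and the test problem (the unit ball as the level set c(x) = ||x||² − 1 ≤ 0, with the monotone operator F(x) = x − p) are all illustrative assumptions, not the paper's exact MTSEGA scheme; the point is only that both projections per iteration hit half-spaces, which have a closed-form projection.

```python
import numpy as np

def proj_halfspace(z, a, b):
    """Projection onto the half-space {w : <a, w> <= b}; closed form, no inner solver."""
    viol = a @ z - b
    if viol <= 0.0 or a @ a < 1e-16:   # already feasible, or degenerate a = 0
        return z
    return z - (viol / (a @ a)) * a

# Toy VI: F(x) = x - p is monotone and 1-Lipschitz; the feasible set is
# C = {x : c(x) <= 0} with c(x) = ||x||^2 - 1 (the closed unit ball).
# The solution is the projection of p onto the ball, here (1, 0).
p = np.array([2.0, 0.0])
F = lambda x: x - p
c = lambda x: x @ x - 1.0
grad_c = lambda x: 2.0 * x

tau = 0.5          # fixed step size, below 1/L for L = 1
x = np.zeros(2)
for _ in range(200):
    # Outer half-space H_k = {w : c(x_k) + <grad c(x_k), w - x_k> <= 0} contains C,
    # because c is convex (linearization lies below c).
    a1 = grad_c(x)
    b1 = a1 @ x - c(x)
    y = proj_halfspace(x - tau * F(x), a1, b1)
    # Second half-space T_k = {w : <x_k - tau*F(x_k) - y_k, w - y_k> <= 0}.
    a2 = (x - tau * F(x)) - y
    x = proj_halfspace(x - tau * F(y), a2, a2 @ y)

x_mtsega = x
```

Because both projections are onto half-spaces, each iteration costs only a few inner products, regardless of how hard projecting onto C itself would be.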
Abstract: Gibali [J. Nonlinear Anal. Optim., 2015, 6(1): 41–51] presented a self-adaptive subgradient extragradient projection method for solving variational inequalities without Lipschitz continuity, in which the next iterate is obtained by projecting a vector onto a specific half-space. In this paper, we present new self-adaptive subgradient extragradient projection methods based on a new descent direction. With the help of techniques from the method of He and Liao [J. Optim. Theory Appl., 2002, 112(1): 111–128], we obtain a longer step size for these algorithms and prove the global convergence of the generated sequence. Numerical results show that these extragradient subgradient projection methods are less sensitive to the choice of the initial point, the dimension of the variational inequality, and the accuracy tolerance than the known methods. Moreover, the new methods proposed in this paper outperform the method presented by Gibali with respect to the number of iterations and CPU time.
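The self-adaptive step size that replaces a known Lipschitz constant can be sketched with an Armijo-type backtracking rule. The sketch below is a simplified illustration of this general idea on the same kind of toy problem, assuming exact projection onto the unit ball; it is not Gibali's exact descent direction or step-size rule, and the constants eta0, beta, nu are illustrative choices.

```python
import numpy as np

def proj_ball(z):
    """Exact projection onto the closed unit ball."""
    n = np.linalg.norm(z)
    return z if n <= 1.0 else z / n

def armijo_eta(F, x, eta0=1.0, beta=0.5, nu=0.9):
    """Backtrack eta until eta * ||F(x) - F(y)|| <= nu * ||x - y||,
    where y = P_C(x - eta * F(x)); no Lipschitz constant is needed."""
    eta = eta0
    while True:
        y = proj_ball(x - eta * F(x))
        if eta * np.linalg.norm(F(x) - F(y)) <= nu * np.linalg.norm(x - y):
            return eta, y
        eta *= beta

# Toy VI: F(x) = x - p over the unit ball; the solution is P_C(p) = (1, 0).
p = np.array([2.0, 0.0])
F = lambda x: x - p

x = np.array([3.0, 1.0])
for _ in range(300):
    eta, y = armijo_eta(F, x)
    x = proj_ball(x - eta * F(y))   # extragradient update with adaptive step

x_adaptive = x
```

The backtracking loop trades a few extra evaluations of F per iteration for not having to estimate the Lipschitz constant up front, which is the practical appeal of self-adaptive rules of this kind.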