Abstract: This paper modifies the Frank-Wolfe algorithm. Under weaker conditions it proves that the modified algorithm is convergent; in particular, under the assumption that the objective function is convex, convergence is established without assuming that the sequence {x̄_k} is bounded.
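For orientation, the classical Frank-Wolfe iteration that the paper builds on is sketched below over the probability simplex. This is a minimal sketch of the textbook method under assumed choices (feasible set, open-loop step size, illustrative objective); it is not the modified algorithm analysed in the paper, whose details the abstract does not give.

    import numpy as np

    def frank_wolfe_simplex(grad, x0, num_iters=100):
        """Classical Frank-Wolfe over the probability simplex (sketch only)."""
        x = x0.copy()
        for k in range(num_iters):
            g = grad(x)
            # Linear minimization oracle: over the simplex the minimizer is the
            # vertex corresponding to the smallest gradient entry.
            s = np.zeros_like(x)
            s[np.argmin(g)] = 1.0
            gamma = 2.0 / (k + 2.0)        # standard open-loop step size
            x = x + gamma * (s - x)
        return x

    # Illustrative use: minimize f(x) = ||x - c||^2 over the simplex.
    c = np.array([0.2, 0.5, 0.3])
    x_star = frank_wolfe_simplex(lambda x: 2.0 * (x - c), np.array([1.0, 0.0, 0.0]))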
Abstract: In this paper, we propose a primal-dual interior-point method for solving general constrained nonlinear programming problems. To prevent the algorithm from converging to a saddle point or a local maximum, we use a merit function to guide the iterates toward a local minimum. In particular, we add a parameter ε to the Newton system when computing the descent directions. Global convergence is obtained from the decrease of the merit function. Numerical results confirm that the algorithm solves this class of problems efficiently.
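The abstract's globalization ingredient is the decrease of a merit function along the computed direction. The sketch below shows one common way this component is implemented: a backtracking Armijo search on an l1-penalty merit function. The choice of merit function, the penalty weight nu, and the line-search constants are assumptions here, since the abstract does not specify which merit function the authors use, and the perturbed Newton system itself is not reproduced.

    import numpy as np

    def armijo_merit_backtracking(phi, dphi_dir, x, d,
                                  alpha0=1.0, beta=0.5, sigma=1e-4, max_backtracks=50):
        """Backtracking (Armijo) line search on a merit function phi.

        dphi_dir is the directional derivative of phi at x along d and must be
        negative (d is a descent direction for the merit function).
        """
        alpha = alpha0
        for _ in range(max_backtracks):
            if phi(x + alpha * d) <= phi(x) + sigma * alpha * dphi_dir:
                break                      # sufficient decrease achieved
            alpha *= beta                  # otherwise shrink the step
        return alpha

    # Illustrative l1 merit function for min f(x) s.t. c(x) = 0
    # (nu is an assumed penalty weight):
    def make_l1_merit(f, c, nu=10.0):
        return lambda x: f(x) + nu * np.sum(np.abs(c(x)))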
Abstract: Since point-to-set maps were introduced by Zangwill in the study of conceptual algorithms, various sufficient conditions for the global convergence of such algorithms have been established. In this paper, the relations among these conditions are illustrated by a unified approach. Moreover, unlike the sufficient conditions previously given in the literature, a new necessary condition is put forward at the end of the paper, which admits more applications.
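For reference, the standard sufficient conditions in Zangwill's global convergence theorem, which the conditions compared in the paper refine, can be stated as follows. This is the textbook statement; the paper's new necessary condition is not reproduced here.

    % Zangwill's global convergence theorem (standard statement).
    Let $A : X \to 2^{X}$ be the algorithmic point-to-set map, let
    $\Omega \subset X$ be the solution set, and let $\{x_k\}$ satisfy
    $x_{k+1} \in A(x_k)$. Suppose that
    \begin{enumerate}
      \item all $x_k$ lie in a compact subset of $X$;
      \item there is a continuous descent function $Z$ such that
            $Z(y) < Z(x)$ for every $y \in A(x)$ whenever $x \notin \Omega$,
            and $Z(y) \le Z(x)$ for every $y \in A(x)$ when $x \in \Omega$;
      \item the map $A$ is closed at every $x \notin \Omega$.
    \end{enumerate}
    Then every accumulation point of $\{x_k\}$ belongs to $\Omega$.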
Abstract: In this paper we discuss degeneracy in nonlinear programming with linear constraints and give a technique for dealing with degeneracy in a general model of reduced-gradient algorithms. Under the assumption that the objective function is continuously differentiable, we prove that either the iterative sequence {x_k} generated by the method terminates at a Kuhn-Tucker point after a finite number of iterations, or every cluster point of {x_k} is a Kuhn-Tucker point.
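As a concrete reading of the convergence statement, the Kuhn-Tucker conditions satisfied by the cluster points can be checked numerically. The sketch below does so for the standard-form problem min f(x) subject to Ax = b, x ≥ 0, the usual setting of reduced-gradient methods; this specialization of the paper's general linear constraints, and the least-squares multiplier estimate, are assumptions made for illustration.

    import numpy as np

    def kkt_residual(grad_f, A, b, x):
        """Residual of the Kuhn-Tucker conditions for
            min f(x)  subject to  A x = b,  x >= 0.
        A value near zero means x is (numerically) a Kuhn-Tucker point.
        """
        g = grad_f(x)
        # Least-squares multiplier estimate for the equality constraints.
        lam, *_ = np.linalg.lstsq(A.T, g, rcond=None)
        mu = g - A.T @ lam                       # multipliers for the bounds x >= 0
        primal = max(np.max(np.abs(A @ x - b)),  # equality feasibility
                     max(0.0, -x.min()))         # bound feasibility
        dual = max(0.0, -mu.min())               # need mu >= 0
        compl = np.max(np.abs(mu * x))           # complementary slackness
        return max(primal, dual, compl)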