Journal Articles
1,000 articles found
A Modified PRP-HS Hybrid Conjugate Gradient Algorithm for Solving Unconstrained Optimization Problems (Cited: 1)
1
Authors: LI Xiangli, WANG Zhiling, LI Binglan. 《应用数学》, Peking University Core, 2025, No. 2, pp. 553-564 (12 pages)
In this paper, we propose a three-term conjugate gradient method for solving unconstrained optimization problems based on the Hestenes-Stiefel (HS) and Polak-Ribiere-Polyak (PRP) conjugate gradient methods. Under the standard Wolfe line search, the proposed search direction is a descent direction. For general nonlinear functions, the method is globally convergent. Finally, numerical results show that the proposed method is efficient.
Keywords: conjugate gradient method; unconstrained optimization; sufficient descent condition; global convergence
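The PRP and HS formulas this abstract combines differ only in the denominator of the parameter β_k. A minimal sketch of a hybrid CG loop, using the classical hybrid choice β_k = max(0, min(β_PRP, β_HS)) and a backtracking Armijo search as a stand-in for the Wolfe search (the paper's three-term direction differs in detail; the quadratic test problem below is hypothetical):

```python
import numpy as np

def hybrid_prp_hs_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Sketch of a PRP-HS hybrid conjugate gradient method.

    beta_k = max(0, min(beta_PRP, beta_HS)) is one classical hybrid rule;
    the paper's three-term variant differs in detail.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                  # safeguard: fall back to steepest descent
            d = -g
        # simple backtracking Armijo search (stand-in for the Wolfe search)
        t, c = 1.0, 1e-4
        while f(x + t * d) > f(x) + c * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        beta_prp = (g_new @ y) / (g @ g)
        beta_hs = (g_new @ y) / (d @ y) if abs(d @ y) > 1e-12 else 0.0
        beta = max(0.0, min(beta_prp, beta_hs))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# convex quadratic test: f(x) = x'Ax/2 - b'x, whose minimizer solves Ax = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = hybrid_prp_hs_cg(f, grad, np.zeros(2))
```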
GLOBAL CONVERGENCE OF THE NON-QUASI-NEWTON METHOD FOR UNCONSTRAINED OPTIMIZATION PROBLEMS (Cited: 6)
2
Authors: Liu Hongwei, Wang Mingjie, Li Jinshan, Zhang Xiangsun. Applied Mathematics (A Journal of Chinese Universities), SCIE, CSCD, 2006, No. 3, pp. 276-288 (13 pages)
In this paper, the non-quasi-Newton family with inexact line search applied to unconstrained optimization problems is studied. A new update formula for the non-quasi-Newton family is proposed. It is proved that the resulting algorithm, with either Wolfe-type or Armijo-type line search, converges globally and Q-superlinearly if the function to be minimized has a Lipschitz continuous gradient.
Keywords: non-quasi-Newton method; inexact line search; global convergence; unconstrained optimization; superlinear convergence
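The Wolfe-type inexact line search the abstract relies on accepts any step giving sufficient decrease plus a curvature bound. A generic bisection-style sketch (not the paper's specific rule; the test function is hypothetical):

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection-style search for a step satisfying the (weak) Wolfe
    conditions: f(x+td) <= f(x) + c1*t*g'd and grad(x+td)'d >= c2*g'd."""
    lo, hi, t = 0.0, float('inf'), 1.0
    f0, g0d = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + t * d) > f0 + c1 * t * g0d:        # Armijo fails: shrink
            hi = t
            t = 0.5 * (lo + hi)
        elif grad(x + t * d) @ d < c2 * g0d:        # curvature fails: grow
            lo = t
            t = 2.0 * lo if hi == float('inf') else 0.5 * (lo + hi)
        else:
            return t
    return t

# usage on f(x) = ||x||^2 from x = 2 along the steepest descent direction
f = lambda v: float(v @ v)
grad = lambda v: 2 * v
x, d = np.array([2.0]), np.array([-4.0])
t = wolfe_line_search(f, grad, x, d)
```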
Bayesian network learning algorithm based on unconstrained optimization and ant colony optimization (Cited: 3)
3
Authors: Chunfeng Wang, Sanyang Liu, Mingmin Zhu. Journal of Systems Engineering and Electronics, SCIE, EI, CSCD, 2012, No. 5, pp. 784-790 (7 pages)
Structure learning of Bayesian networks is a well-researched but computationally hard task. This paper proposes an improved algorithm based on unconstrained optimization and ant colony optimization (U-ACO-B) to overcome the drawbacks of the ant colony optimization algorithm for Bayesian networks (ACO-B). In this algorithm, an unconstrained optimization problem is first solved to obtain an undirected skeleton, and the ACO algorithm is then used to orient the edges, yielding the final structure. In the experimental part of the paper, we compare the performance of the proposed algorithm with the ACO-B algorithm. The results show that our method is effective and converges much faster than ACO-B.
Keywords: Bayesian network; structure learning; ant colony optimization; unconstrained optimization
Global Convergence of an Extended Descent Algorithm without Line Search for Unconstrained Optimization (Cited: 1)
4
Authors: Cuiling Chen, Liling Luo, Caihong Han, Yu Chen. Journal of Applied Mathematics and Physics, 2018, No. 1, pp. 130-137 (8 pages)
In this paper, we extend a descent algorithm without line search for solving unconstrained optimization problems. Under mild conditions, its global convergence is established. Further, we generalize the search direction to a more general form and obtain the global convergence of the corresponding algorithm. Numerical results illustrate that the new algorithm is effective.
Keywords: unconstrained optimization; descent method; line search; global convergence
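The simplest descent scheme that converges without any line search fixes the step at 1/L, where L is a Lipschitz constant of the gradient. A minimal sketch with the plain negative-gradient direction (the paper's direction is more general; the test problem is hypothetical):

```python
import numpy as np

def descent_no_line_search(grad, x0, L, tol=1e-8, max_iter=10000):
    """Steepest descent with the constant step 1/L (L = Lipschitz constant
    of the gradient) -- convergent without any line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - g / L               # fixed step, no function evaluations
    return x

# f(x) = 0.5*x'Ax; the gradient's Lipschitz constant is A's largest eigenvalue
A = np.array([[4.0, 0.0], [0.0, 1.0]])
x_min = descent_no_line_search(lambda x: A @ x, np.array([3.0, -2.0]), L=4.0)
```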
A DERIVATIVE-FREE ALGORITHM FOR UNCONSTRAINED OPTIMIZATION (Cited: 1)
5
Authors: Peng Yehui, Liu Zhenhai. Applied Mathematics (A Journal of Chinese Universities), SCIE, CSCD, 2005, No. 4, pp. 491-498 (8 pages)
In this paper a hybrid algorithm combining the pattern search method and the genetic algorithm for unconstrained optimization is presented. The algorithm is a deterministic pattern search algorithm, but in the search step the trial points are produced in the manner of a genetic algorithm: at each iteration, a finite set of points is generated by replication, crossover and mutation. In theory, the algorithm is globally convergent. Most notably, the numerical results show that it can find the global minimizer for some problems on which other pattern search algorithms fail.
Keywords: unconstrained optimization; pattern search method; genetic algorithm; global minimizer
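The hybrid idea splits each iteration into a free-form *search step* (here, GA-style random trial points around the incumbent) and a deterministic *poll step* over compass directions, which alone guarantees convergence. A derivative-free sketch under those assumptions (not the paper's exact operators; the test function is hypothetical):

```python
import random

def ga_pattern_search(f, x0, step=1.0, tol=1e-6, pop=8, max_iter=2000):
    """Compass/pattern search whose search step tries GA-style random
    trial points; the poll step keeps the method provably convergent."""
    x, n = list(x0), len(x0)
    fx = f(x)
    for _ in range(max_iter):
        if step < tol:
            break
        # search step: GA-like random trial points near x (mutation-style)
        improved = False
        for _ in range(pop):
            trial = [xi + step * random.uniform(-1.0, 1.0) for xi in x]
            ft = f(trial)
            if ft < fx:
                x, fx, improved = trial, ft, True
        if improved:
            continue
        # poll step: deterministic compass directions +-step*e_i
        best = None
        for i in range(n):
            for s in (step, -step):
                y = list(x)
                y[i] += s
                fy = f(y)
                if fy < fx:
                    best, fx = y, fy
        if best is None:
            step *= 0.5                 # unsuccessful iteration: refine mesh
        else:
            x = best
    return x, fx

random.seed(0)                          # reproducible demo
xm, fm = ga_pattern_search(lambda v: (v[0] - 1)**2 + (v[1] + 2)**2, [5.0, 5.0])
```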
A Filled Function with Adjustable Parameters for Unconstrained Global Optimization (Cited: 1)
6
Authors: SHANG You-lin, LI Xiao-yan. Chinese Quarterly Journal of Mathematics, CSCD, 2004, No. 3, pp. 232-239 (8 pages)
A filled function with adjustable parameters is suggested in this paper for finding a global minimum point of a general class of nonlinear programming problems over a bounded and closed domain. The function has two adjustable parameters. We discuss the properties of the proposed filled function and give conditions on the function and on the parameter values under which it has the desired properties of a traditional filled function.
Keywords: filled function; global optimization; global minimizer; unconstrained problem; basin; hill
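A filled function is built at a known local minimizer so that descending on it leads out of the current basin into a lower one. A 1D sketch using the classical two-parameter form P(x) = exp(-(x - x_loc)²/ρ²)/(r + F(x)) (the paper's construction differs in detail; the test function F and its minimizer locations are hypothetical):

```python
import math

def make_filled(F, x_loc, r, rho):
    """Classical two-parameter filled function built at a known local
    minimizer x_loc of F (requires r + F(x) > 0 on the domain)."""
    return lambda x: math.exp(-((x - x_loc) ** 2) / rho ** 2) / (r + F(x))

def escape_local_min(F, x_loc, r=1.0, rho=1.0, h=0.05, bound=3.0):
    """Walk downhill on the filled function away from x_loc until a point
    with F(x) < F(x_loc) is found (a lower basin), then refine on F."""
    P = make_filled(F, x_loc, r, rho)
    for sign in (1.0, -1.0):                     # try both directions
        x = x_loc + sign * h
        while abs(x - x_loc) < bound and P(x + sign * h) < P(x):
            x += sign * h
            if F(x) < F(x_loc):                  # entered a lower basin
                s = h                            # crude local refinement on F
                while s > 1e-8:
                    if F(x + s) < F(x):
                        x += s
                    elif F(x - s) < F(x):
                        x -= s
                    else:
                        s *= 0.5
                return x
    return x_loc

# poor local minimum near x = -0.62, global minimum near x = 0.77
F = lambda x: x**4 - x**2 - 0.3 * x + 1.0
x_new = escape_local_min(F, x_loc=-0.615)
```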
Design and two-objective simultaneous optimization of torque controller for switched reluctance motor in electric vehicle (Cited: 1)
7
Authors: 朱曰莹, 赵桂范, 王大方, 刘仕强. Journal of Harbin Institute of Technology (New Series), EI, CAS, 2011, No. 6, pp. 79-85 (7 pages)
In order to reduce the torque ripple, increase the average torque and optimize the drive performance of the switched reluctance motor (SRM), a nonlinear dynamic model of the SRM is established in the MATLAB/Simulink environment. The effects of the turn-on and turn-off angles are investigated through simulations of the dynamic model, and a functional relationship is established among the rotor speed, turn-on angle and turn-off angle. To optimize the dynamic torque performance, a two-objective simultaneous optimization function with two weight factors is proposed, and the optimized turn-on and turn-off angles are developed as functions of rotor speed using the simultaneous optimization method. The optimized torque controller is then designed based on these angles. Simulation results show that the controller effectively reduces the torque ripple, increases the average torque, and improves the dynamic torque performance of the SRM.
Keywords: electric vehicle; switched reluctance motor; torque controller; two-objective simultaneous optimization
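Combining the two goals (minimize torque ripple, maximize average torque) with two weight factors is a weighted-sum scalarization. A minimal sketch of the idea; the weights, candidate angles and (ripple, torque) values below are entirely hypothetical:

```python
def weighted_sum_objective(ripple, avg_torque, w1, w2):
    """Scalarize two competing goals -- minimize torque ripple, maximize
    average torque -- into one cost via two weight factors (illustrative
    form only; the paper's objective function differs in detail)."""
    return w1 * ripple - w2 * avg_torque

# hypothetical candidates: turn-on angle (deg) -> (torque ripple, avg torque N*m)
candidates = {30.0: (0.40, 9.0), 32.5: (0.25, 8.6), 35.0: (0.15, 7.8)}
best = min(candidates,
           key=lambda a: weighted_sum_objective(*candidates[a], w1=10.0, w2=1.0))
```

With these weights the low-ripple candidate wins; shifting w2 upward would favor higher average torque instead, which is exactly the trade-off the weight factors expose.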
A New Nonlinear Conjugate Gradient Method for Unconstrained Optimization Problems (Cited: 1)
8
Authors: LIU Jin-kui, WANG Kai-rong, SONG Xiao-qian, DU Xiang-lin. Chinese Quarterly Journal of Mathematics, CSCD, 2010, No. 3, pp. 444-450 (7 pages)
In this paper, an efficient conjugate gradient method is given for solving general unconstrained optimization problems, which guarantees the sufficient descent property and global convergence under the strong Wolfe line search conditions. Numerical results show that the new method is efficient and stable in comparison with the PRP+ method, so it can be widely used in scientific computation.
Keywords: unconstrained optimization; conjugate gradient method; strong Wolfe line search; sufficient descent property; global convergence
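The strong Wolfe conditions the abstract invokes tighten the weak Wolfe curvature bound to an absolute value, which is what CG convergence proofs typically need. A checker sketch (the test function is hypothetical):

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, t, c1=1e-4, c2=0.1):
    """Check the strong Wolfe conditions for step t along direction d:
        f(x + t d) <= f(x) + c1 t g'd     (sufficient decrease)
        |grad(x + t d)'d| <= c2 |g'd|     (strong curvature)
    with 0 < c1 < c2 < 1; CG analyses typically take c2 < 1/2."""
    g0d = grad(x) @ d
    armijo = f(x + t * d) <= f(x) + c1 * t * g0d
    curvature = abs(grad(x + t * d) @ d) <= c2 * abs(g0d)
    return armijo and curvature

# f(x) = ||x||^2: the exact minimizer along -grad is at t = 0.5
f = lambda v: float(v @ v)
grad = lambda v: 2 * v
x = np.array([1.0, -2.0])
d = -grad(x)
ok = satisfies_strong_wolfe(f, grad, x, d, t=0.5)       # exact step
too_far = satisfies_strong_wolfe(f, grad, x, d, t=1.0)  # overshoots
```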
Subspace Minimization Conjugate Gradient Method Based on Cubic Regularization Model for Unconstrained Optimization (Cited: 1)
9
Authors: Ting Zhao, Hongwei Liu. Journal of Harbin Institute of Technology (New Series), CAS, 2021, No. 5, pp. 61-69 (9 pages)
Many methods have been put forward to solve unconstrained optimization problems, among which the conjugate gradient (CG) method is very important. With the increasing emergence of large-scale problems, subspace techniques have become particularly important and widely used in the field of optimization. In this study, a new CG method is put forward which combines a subspace technique with a cubic regularization model, and a special scaled norm in the cubic regularization model is analyzed. Under certain conditions, significant characteristics of the search direction are given and the convergence of the algorithm is established. Numerical comparisons on the 145 test functions from the CUTEr library show that the proposed method outperforms two classical CG methods and two recent subspace conjugate gradient methods.
Keywords: cubic regularization model; conjugate gradient method; subspace technique; unconstrained optimization
A Line Search Algorithm for Unconstrained Optimization (Cited: 1)
10
Authors: Gonglin Yuan, Sha Lu, Zengxin Wei. Journal of Software Engineering and Applications, 2010, No. 5, pp. 503-509 (7 pages)
It is well known that line search methods play a very important role in optimization. In this paper a new line search method is proposed for solving unconstrained optimization. Under weak conditions, this method possesses global convergence for nonconvex functions and R-linear convergence for convex functions. Moreover, the given search direction has the sufficient descent property and belongs to a trust region without carrying out any line search rule. Numerical results show that the new method is effective.
Keywords: line search; unconstrained optimization; global convergence; R-linear convergence
New type of conjugate gradient algorithms for unconstrained optimization problems
11
Authors: Caiying Wu, Guoqing Chen. Journal of Systems Engineering and Electronics, SCIE, EI, CSCD, 2010, No. 6, pp. 1000-1007 (8 pages)
Two new formulas for the main parameter βk of the conjugate gradient method are presented, which can be seen as modifications of the HS and PRP methods, respectively. In comparison with classic conjugate gradient methods, the new methods use both the available gradient and function value information. Furthermore, modifications of these methods are proposed. The methods are shown to be globally convergent under some assumptions. Numerical results are also reported.
Keywords: conjugate gradient; unconstrained optimization; global convergence; conjugacy condition
On the Global Convergence of the Perry-Shanno Method for Nonconvex Unconstrained Optimization Problems
12
Authors: Linghua Huang, Qingjun Wu, Gonglin Yuan. Applied Mathematics, 2011, No. 3, pp. 315-320 (6 pages)
In this paper, we prove the global convergence of Perry-Shanno's memoryless quasi-Newton (PSMQN) method with a new inexact line search when applied to nonconvex unconstrained minimization problems. Preliminary numerical results show that PSMQN with the particular line search conditions is very promising.
Keywords: unconstrained optimization; nonconvex optimization; global convergence
A Retrospective Filter Trust Region Algorithm for Unconstrained Optimization
13
Authors: Yue Lu, Zhongwen Chen. Applied Mathematics, 2010, No. 3, pp. 179-188 (10 pages)
In this paper, we propose a retrospective filter trust region algorithm for unconstrained optimization, which is based on the framework of the retrospective trust region method combined with the technique of the multi-dimensional filter. The new algorithm gives a good estimate of the trust region radius and relaxes the condition for accepting a trial step relative to the usual trust region methods. Under reasonable assumptions, we analyze the global convergence of the new method and report preliminary numerical results. Comparisons with the basic trust region algorithm, the filter trust region algorithm and the retrospective trust region algorithm show the effectiveness of the new algorithm.
Keywords: unconstrained optimization; retrospective trust region method; multi-dimensional filter technique
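The retrospective and filter ingredients both build on the basic trust region loop: minimize a quadratic model inside a ball, compare predicted with actual reduction, and adjust the radius. A sketch of that underlying framework with the simple Cauchy-point step (this omits both the retrospective update and the filter; the test problem is hypothetical):

```python
import numpy as np

def trust_region_cauchy(f, grad, hess, x0, delta0=1.0, tol=1e-6, max_iter=200):
    """Basic trust region loop with the Cauchy-point step -- the framework
    the retrospective/filter variants refine."""
    x, delta = np.asarray(x0, dtype=float), delta0
    for _ in range(max_iter):
        g, B = grad(x), hess(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        gBg = g @ B @ g
        # Cauchy point: minimize the quadratic model along -g inside the ball
        tau = 1.0 if gBg <= 0 else min(1.0, gnorm**3 / (delta * gBg))
        p = -tau * delta * g / gnorm
        pred = -(g @ p + 0.5 * p @ B @ p)          # model (predicted) reduction
        ared = f(x) - f(x + p)                     # actual reduction
        rho = ared / pred if pred > 0 else 0.0
        if rho > 0.75:
            delta = min(2 * delta, 10.0)           # good model: expand radius
        elif rho < 0.25:
            delta *= 0.25                          # poor model: shrink radius
        if rho > 0.1:
            x = x + p                              # accept the trial step
    return x

# hypothetical convex test: f(x) = (x0 - 1)^2 + 10*(x1 - 2)^2
f = lambda v: (v[0] - 1)**2 + 10 * (v[1] - 2)**2
grad = lambda v: np.array([2 * (v[0] - 1), 20 * (v[1] - 2)])
hess = lambda v: np.diag([2.0, 20.0])
x_tr = trust_region_cauchy(f, grad, hess, np.zeros(2))
```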
Modified LS Method for Unconstrained Optimization
14
Authors: Jinkui Liu, Li Zheng. Applied Mathematics, 2011, No. 6, pp. 779-782 (4 pages)
In this paper, a new conjugate gradient formula and its algorithm for solving unconstrained optimization problems are proposed. The given formula satisfies the descent condition. Under the Grippo-Lucidi line search, the global convergence of the method is proved. Numerical results show that the new method is efficient on the given test problems.
Keywords: unconstrained optimization; conjugate gradient method; Grippo-Lucidi line search; global convergence
A New Descent Nonlinear Conjugate Gradient Method for Unconstrained Optimization
15
Authors: Hao Fan, Zhibin Zhu, Anwa Zhou. Applied Mathematics, 2011, No. 9, pp. 1119-1123 (5 pages)
In this paper, a new nonlinear conjugate gradient method is proposed for large-scale unconstrained optimization. The sufficient descent property holds without any line search. We use a steplength technique which ensures that the Zoutendijk condition holds, and the method is proved to be globally convergent. Finally, we improve the method and carry out further analysis.
Keywords: large-scale unconstrained optimization; conjugate gradient method; sufficient descent property; global convergence
An Improved Quasi-Newton Method for Unconstrained Optimization
16
Authors: Fei Pusheng, Chen Zhong (Department of Mathematics, Wuhan University, Wuhan 430072, China). Wuhan University Journal of Natural Sciences, CAS, 1996, No. 1, pp. 35-37 (3 pages)
We present an improved quasi-Newton method. Assuming that the objective function is twice continuously differentiable and uniformly convex, we discuss the global and superlinear convergence of the improved method.
Keywords: quasi-Newton method; superlinear convergence; unconstrained optimization
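The classical baseline such quasi-Newton papers improve on is BFGS, which maintains a positive definite inverse-Hessian approximation updated from gradient differences. A textbook sketch with a backtracking Armijo search (not the paper's improved update; the uniformly convex test function is hypothetical):

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    """Textbook BFGS with a backtracking Armijo line search."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                        # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                       # H positive definite => descent
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                   # curvature condition keeps H pos. def.
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# uniformly convex quadratic, matching the paper's assumptions in spirit
f = lambda v: (v[0] - 1)**2 + 2 * (v[1] + 0.5)**2
grad = lambda v: np.array([2 * (v[0] - 1), 4 * (v[1] + 0.5)])
x_b = bfgs(f, grad, np.zeros(2))
```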
New Variants of Newton’s Method for Nonlinear Unconstrained Optimization Problems
17
Authors: V. KANWAR, Kapil K. SHARMA, Ramandeep BEHL. Intelligent Information Management, 2010, No. 1, pp. 40-45 (6 pages)
In this paper, we propose new variants of Newton's method based on quadrature formulas and power means for solving nonlinear unconstrained optimization problems. It is proved that the order of convergence of the proposed family is three. Numerical comparisons are made to show the performance of the presented methods. Furthermore, numerical experiments demonstrate that the logarithmic-mean Newton's method outperforms the classical Newton's method and other variants. MSC: 65H05.
Keywords: unconstrained optimization; Newton's method; order of convergence; power means; initial guess
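Mean-based Newton variants replace the single curvature f''(x_k) with a mean of curvatures at x_k and at the ordinary Newton predictor, lifting the order from two to three. A sketch of the arithmetic-mean member applied to the stationarity condition f'(x) = 0 (the paper's family also includes other power means, with the logarithmic mean reported best; the test function is hypothetical):

```python
def arithmetic_mean_newton(df, d2f, x0, tol=1e-12, max_iter=50):
    """Arithmetic-mean Newton variant for f'(x) = 0 (third-order for
    root finding); one member of the power-mean family."""
    x = x0
    for _ in range(max_iter):
        if abs(df(x)) < tol:
            break
        z = x - df(x) / d2f(x)                       # ordinary Newton predictor
        x = x - df(x) / (0.5 * (d2f(x) + d2f(z)))    # mean of two curvatures
    return x

# minimize f(x) = x^4 - 3x^2 + x via its stationarity condition f'(x) = 0
df = lambda x: 4 * x**3 - 6 * x + 1
d2f = lambda x: 12 * x**2 - 6
x_star = arithmetic_mean_newton(df, d2f, x0=1.5)
```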
A Non-Monotone Trust Region Method with Non-Monotone Wolfe-Type Line Search Strategy for Unconstrained Optimization
18
Authors: Changyuan Li, Qinghua Zhou, Xiao Wu. Journal of Applied Mathematics and Physics, 2015, No. 6, pp. 707-712 (6 pages)
In this paper, we propose and analyze a non-monotone trust region method with a non-monotone line search strategy for unconstrained optimization problems. Unlike the traditional non-monotone trust region method, our algorithm uses a non-monotone Wolfe line search to obtain the next iterate when a trial step is not accepted, which reduces the number of sub-problems to be solved. Theoretical analysis shows that the proposed method is globally convergent under some mild conditions.
Keywords: unconstrained optimization; non-monotone trust region method; non-monotone line search; global convergence
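The non-monotone acceptance idea compares the trial value against the maximum of a few recent function values instead of the current one, so f may rise temporarily. A Grippo-style Armijo sketch of just that test (the paper couples a Wolfe-type variant with a trust region loop; the test data are hypothetical):

```python
import numpy as np

def nonmonotone_armijo_step(f, grad, x, d, recent_f,
                            c1=1e-4, shrink=0.5, max_back=60):
    """Grippo-style non-monotone Armijo test: accept step t when
        f(x + t d) <= max(recent_f) + c1 * t * g'd,
    where recent_f holds the last few function values."""
    fmax = max(recent_f)
    g0d = grad(x) @ d
    t = 1.0
    for _ in range(max_back):
        if f(x + t * d) <= fmax + c1 * t * g0d:
            return t
        t *= shrink
    return t

# hypothetical usage: the reference window, not f(x) alone, gates acceptance
f = lambda v: float(v @ v)
grad = lambda v: 2 * v
x, d = np.array([2.0, 0.0]), np.array([-1.0, 0.0])
t = nonmonotone_armijo_step(f, grad, x, d, recent_f=[4.0, 5.5, 4.8])
```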
Global Convergence of Curve Search Methods for Unconstrained Optimization
19
Authors: Zhiwei Xu, Yongning Tang, Zhen-Jun Shi. Applied Mathematics, 2016, No. 7, pp. 721-735 (15 pages)
In this paper we propose a new family of curve search methods for unconstrained optimization problems, which search for a new iterate along a curve through the current iterate at each iteration, whereas line search methods find a new iterate on a line starting from the current iterate. The global convergence and linear convergence rate of these curve search methods are investigated under some mild conditions. Numerical results show that some curve search methods are stable and effective for solving some large-scale minimization problems.
Keywords: unconstrained optimization; curve search method; global convergence; convergence rate
A New Two-Parameter Family of Nonlinear Conjugate Gradient Method Without Line Search for Unconstrained Optimization Problem
20
Authors: ZHU Tiefeng. Wuhan University Journal of Natural Sciences, CAS, CSCD, 2024, No. 5, pp. 403-411 (9 pages)
This paper puts forward a two-parameter family of nonlinear conjugate gradient (CG) methods without line search for solving unconstrained optimization problems. The main feature of this family is that it does not rely on any line search and only requires a simple step size formula to always generate a sufficient descent direction. Under certain assumptions, the proposed method is proved to possess global convergence. Finally, our method is compared with other potential methods; a large number of numerical experiments show that it is more competitive and effective.
Keywords: unconstrained optimization; conjugate gradient method; without line search; global convergence