Abstract: As optimization problems continue to grow in complexity, the need for effective metaheuristic algorithms becomes increasingly evident. However, the challenge lies in identifying the right parameters and strategies for these algorithms. In this paper, we introduce the adaptive multi-strategy Rabbit Algorithm (RA). RA is inspired by the social interactions of rabbits, incorporating elements such as exploration, exploitation, and adaptation to address optimization challenges. It employs three distinct subgroups, comprising male, female, and child rabbits, to execute a multi-strategy search. Key parameters, including the distance factor, balance factor, and learning factor, strike a balance between precision and computational efficiency. We offer practical recommendations for fine-tuning five essential RA parameters, making them versatile and independent. RA is capable of autonomously selecting adaptive parameter settings and mutation strategies, enabling it to successfully tackle 17 CEC05 benchmark functions with dimensions scaling up to 5000. The results underscore RA's superior performance in large-scale optimization tasks, surpassing other state-of-the-art metaheuristics in convergence speed, computational precision, and scalability. Finally, RA has demonstrated its proficiency in solving complicated real-world engineering optimization problems by completing the 10 problems of the CEC2020 suite.
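For readers who want a concrete picture of the three-subgroup design, the following is a minimal Python sketch of a population search in this spirit. The subgroup update rules and the way distance_factor, balance_factor, and learning_factor enter the moves are illustrative assumptions, not the published RA formulas.

```python
import numpy as np

def rabbit_algorithm_sketch(f, dim, bounds, pop=30, iters=500, seed=0,
                            distance_factor=0.5, balance_factor=0.5,
                            learning_factor=0.1):
    """Three-subgroup population search in the spirit of RA.
    All update rules below are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (pop, dim))           # whole population
    fit = np.apply_along_axis(f, 1, X)
    n_m = pop // 3                                # "male" subgroup size
    n_f = pop // 3                                # "female" subgroup size
    best = X[fit.argmin()].copy()
    for t in range(iters):
        w = 1.0 - t / iters                       # anneal step sizes over time
        for i in range(pop):
            r = rng.standard_normal(dim)
            if i < n_m:                           # males: long-range exploration
                cand = X[i] + distance_factor * w * (hi - lo) * r
            elif i < n_m + n_f:                   # females: exploit near the best
                cand = X[i] + balance_factor * (best - X[i]) + 0.1 * w * r
            else:                                 # children: learn from a parent
                parent = X[rng.integers(0, n_m + n_f)]
                cand = X[i] + learning_factor * (parent - X[i]) + 0.1 * w * r
            cand = np.clip(cand, lo, hi)
            fc = f(cand)
            if fc < fit[i]:                       # greedy replacement
                X[i], fit[i] = cand, fc
        best = X[fit.argmin()].copy()
    return best, float(fit.min())

# Example: minimize the 30-D sphere function
x_best, f_best = rabbit_algorithm_sketch(lambda v: float(np.sum(v * v)),
                                         dim=30, bounds=(-100.0, 100.0))
```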
Funding: supported by the National Natural Science Foundation of China (12171106), the Natural Science Foundation of Guangxi Province (2020GXNSFDA238017 and 2018GXNSFFA281007), and the Shanghai Sailing Program (21YF1430300).
Abstract: This paper discusses the two-block large-scale nonconvex optimization problem with general linear constraints. Based on the ideas of splitting and sequential quadratic optimization (SQO), a new feasible descent method for the discussed problem is proposed. First, we consider the quadratic optimal (QO) approximation problem associated with the current feasible iteration point, and we split the QO into two small-scale QOs that can be solved in parallel. Second, a feasible descent direction for the problem is obtained and a new SQO-type method is proposed, namely, the splitting feasible SQO (SF-SQO) method. Moreover, under suitable conditions, we analyse the global convergence, strong convergence and rate of superlinear convergence of the SF-SQO method. Finally, preliminary numerical experiments regarding the economic dispatch of a power system are carried out, and these show that the SF-SQO method is promising.
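The abstract does not reproduce the problem statement; a generic two-block model with linear coupling constraints, assumed here purely for orientation (the paper's exact problem class may differ), reads:

```latex
% Assumed generic two-block form (illustration only, not quoted from the paper):
\min_{x \in \mathbb{R}^{n_1},\; y \in \mathbb{R}^{n_2}} \; f(x) + g(y)
\quad \text{s.t.} \quad Ax + By \le b,
```

with $f$ and $g$ smooth and possibly nonconvex. In a model of this shape the Hessian of the objective is block-diagonal across $x$ and $y$, so a quadratic approximation at a feasible iterate naturally decomposes into two smaller QOs, which is consistent with the parallel splitting step described above.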
Funding: supported by the Science and Technology Planning Project of Hunan Province (No. 2011TP4016-3) and the Construct Program of the Key Discipline (Technology of Computer Application) at Xiangnan University.
Abstract: A simplified group search optimizer algorithm, denoted "SGSO", for large-scale global optimization is presented in this paper, with the goal of obtaining a simple algorithm with superior performance on high-dimensional problems. SGSO adopts an improved sharing strategy which shares information from not only the best member but also the other good members, and it uses a simpler search method in place of the head-angle search. Furthermore, SGSO increases the percentage of scroungers to accelerate convergence. Compared with the genetic algorithm (GA), particle swarm optimizer (PSO) and group search optimizer (GSO), SGSO is tested on seven benchmark functions with dimensions 30, 100, 500 and 1000. The results show that SGSO has remarkably superior performance to GA, PSO and GSO for large-scale global optimization.
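To illustrate the two ideas called out above, sharing information from several good members and enlarging the scrounger share, here is a minimal Python sketch of a producer-scrounger loop in that spirit. The concrete update rules and the top_k and scrounger_ratio values are assumptions for illustration, not the published SGSO scheme.

```python
import numpy as np

def sgso_sketch(f, dim, bounds, pop=48, iters=300, top_k=5,
                scrounger_ratio=0.8, seed=0):
    """Producer-scrounger search with top-k information sharing.
    Update rules, top_k and scrounger_ratio are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (pop, dim))
    fit = np.apply_along_axis(f, 1, X)
    n_scr = int(scrounger_ratio * pop)        # enlarged scrounger share
    for t in range(iters):
        order = np.argsort(fit)               # best members first
        elites = X[order[:top_k]].copy()      # best plus other good members
        shrink = 1.0 - t / iters              # shrinking random step
        for rank in range(top_k, pop):
            i = order[rank]
            if rank < top_k + n_scr:          # scroungers: drift toward a
                target = elites[rng.integers(top_k)]  # randomly chosen elite
                cand = X[i] + rng.random(dim) * (target - X[i])
            else:                             # rangers: plain random walk
                cand = X[i] + 0.1 * shrink * (hi - lo) * rng.standard_normal(dim)
            cand = np.clip(cand, lo, hi)
            fc = f(cand)
            if fc < fit[i]:                   # keep the better point
                X[i], fit[i] = cand, fc
    b = fit.argmin()
    return X[b], float(fit[b])

# Example: 100-D sphere function
x_best, f_best = sgso_sketch(lambda v: float(np.sum(v * v)),
                             dim=100, bounds=(-100.0, 100.0))
```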
Funding: this work was mainly done while the first author was visiting the University of Bayreuth, and was supported by the Chinese Scholarship Council, the German Academic Exchange Service (DAAD) and the National Natural Science Foundation of China.
Abstract: In this paper, we describe a method for solving large-scale structural optimization problems by sequential convex programming (SCP). A predictor-corrector interior point method is applied to solve the strictly convex subproblems. The SCP algorithm and the topology optimization approach are introduced. In particular, different strategies for solving certain linear systems of equations are analyzed. Numerical results are presented to show the efficiency of the proposed method for solving topology optimization problems and to compare different variants.
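The abstract does not spell out the convex subproblems; a widely used strictly convex approximation in SCP for structural problems is the MMA-type approximation, shown below as an assumed illustration (the paper's exact subproblem may differ):

```latex
% MMA-type strictly convex approximation of f around the iterate x^{(k)}
% (a standard SCP choice, assumed here for illustration):
f^{(k)}(x) \;=\; r^{(k)} + \sum_{j=1}^{n}
  \left( \frac{p_j^{(k)}}{U_j^{(k)} - x_j} + \frac{q_j^{(k)}}{x_j - L_j^{(k)}} \right),
\qquad L_j^{(k)} < x_j < U_j^{(k)},
```

where the moving asymptotes $L^{(k)}, U^{(k)}$ and the nonnegative coefficients $p^{(k)}, q^{(k)}$ are chosen so that $f^{(k)}$ matches $f$ and its gradient at $x^{(k)}$. Each resulting subproblem is strictly convex, which is exactly the setting in which a predictor-corrector interior point solver can be applied.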
Abstract: Monotonic regression (MR) is a least-distance problem with monotonicity constraints induced by a partially ordered data set of observations. In our recent publication [In Ser. Nonconvex Optimization and Its Applications, Springer-Verlag, (2006) 83, pp. 25-33], the Pool-Adjacent-Violators algorithm (PAV) was generalized from completely to partially ordered data sets (posets). The new algorithm, called CPAV, is characterized by very low computational complexity, which is of second order in the number of observations. It treats the observations in a consecutive order, and it can follow any arbitrarily chosen topological order of the poset of observations. The CPAV algorithm produces a sufficiently accurate solution to the MR problem, but the accuracy depends on the chosen topological order. Here we prove that there exists a topological order for which the resulting CPAV solution is optimal. Furthermore, we present results of extensive numerical experiments, from which we draw conclusions about the most and the least preferable topological orders.
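For readers unfamiliar with PAV, the classic algorithm on a completely ordered data set is short enough to sketch in Python; CPAV generalizes this pooling idea to posets, so the sketch below covers only the totally ordered special case (least-squares distance assumed).

```python
def pav_isotonic(y, w=None):
    """Classic Pool-Adjacent-Violators for a totally ordered sequence
    (least-squares isotonic regression). CPAV extends this idea to
    partially ordered data; this sketch covers only the total order."""
    n = len(y)
    w = [1.0] * n if w is None else list(w)
    blocks = []  # each block: [weighted mean, total weight, block length]
    for i in range(n):
        blocks.append([y[i], w[i], 1])
        # Pool adjacent blocks while the monotonicity constraint is violated
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, c2 = blocks.pop()
            m1, w1, c1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(m1 * w1 + m2 * w2) / wt, wt, c1 + c2])
    fitted = []
    for mean, _, count in blocks:
        fitted.extend([mean] * count)  # expand each block to its members
    return fitted

print(pav_isotonic([1, 3, 2, 4, 3, 5]))  # -> [1, 2.5, 2.5, 3.5, 3.5, 5]
```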