Journal Articles
5 articles found
1. Adaptive Multi-strategy Rabbit Optimizer for Large-scale Optimization
Authors: Baowei Xiang, Yixin Xiang. Journal of Bionic Engineering, 2025, Issue 1, pp. 398-416.
As optimization problems continue to grow in complexity, the need for effective metaheuristic algorithms becomes increasingly evident. However, the challenge lies in identifying the right parameters and strategies for these algorithms. In this paper, we introduce the adaptive multi-strategy Rabbit Algorithm (RA). RA is inspired by the social interactions of rabbits, incorporating elements such as exploration, exploitation, and adaptation to address optimization challenges. It employs three distinct subgroups, comprising male, female, and child rabbits, to execute a multi-strategy search. Key parameters, including the distance factor, balance factor, and learning factor, strike a balance between precision and computational efficiency. We offer practical recommendations for fine-tuning five essential RA parameters, making them versatile and independent. RA is capable of autonomously selecting adaptive parameter settings and mutation strategies, enabling it to successfully tackle 17 CEC05 benchmark functions with dimensions scaling up to 5000. The results underscore RA's superior performance in large-scale optimization tasks, surpassing other state-of-the-art metaheuristics in convergence speed, computational precision, and scalability. Finally, RA has demonstrated its proficiency in solving complicated real-world engineering optimization problems by completing 10 problems from CEC2020.
Keywords: Adaptive parameter, Large-scale optimization, Rabbit algorithm, Swarm intelligence, Engineering optimization
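The abstract gives only a high-level view of RA's three-subgroup search, so here is a minimal, hedged Python sketch of that structure: "male" rabbits exploit around the best solution, "female" rabbits explore widely, and "child" rabbits learn from a parent. The update equations, the toy `sphere` objective, and all parameter values are illustrative assumptions, not the published RA rules.

```python
import numpy as np

def sphere(x):
    """Toy objective; the paper benchmarks on the CEC2005/CEC2020 suites instead."""
    return float(np.sum(x ** 2))

def rabbit_style_search(f, dim=30, pop=30, iters=200, seed=0,
                        distance_factor=0.5, balance_factor=0.8,
                        learning_factor=0.1, lower=-5.0, upper=5.0):
    """Three-subgroup population search loosely following the abstract.
    All update rules here are guesses for illustration only."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lower, upper, size=(pop, dim))
    fit = np.array([f(x) for x in X])
    best = X[fit.argmin()].copy()
    third = pop // 3
    for t in range(iters):
        shrink = balance_factor ** t          # shift from exploration to exploitation
        for i in range(pop):
            if i < third:                     # male: small step toward the best
                cand = (X[i] + distance_factor * shrink * (best - X[i])
                        + 0.01 * rng.standard_normal(dim))
            elif i < 2 * third:               # female: wide random exploration
                cand = (X[i] + distance_factor * shrink * (upper - lower)
                        * rng.uniform(-1, 1, dim))
            else:                             # child: blend a parent with the best
                parent = X[rng.integers(0, 2 * third)]
                cand = (parent + learning_factor * (best - parent)
                        + 0.01 * rng.standard_normal(dim))
            cand = np.clip(cand, lower, upper)
            c = f(cand)
            if c < fit[i]:                    # greedy replacement
                X[i], fit[i] = cand, c
        best = X[fit.argmin()].copy()
    return best, float(fit.min())

_, best_f = rabbit_style_search(sphere)
print(f"best sphere value found: {best_f:.3e}")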
2. A Superlinearly Convergent Splitting Feasible Sequential Quadratic Optimization Method for Two-Block Large-Scale Smooth Optimization
Authors: Jinbao Jian (简金宝), Chen Zhang (张晨), Pengjie Liu (刘鹏杰). Acta Mathematica Scientia (SCIE, CSCD), 2023, Issue 1, pp. 1-24.
This paper discusses the two-block large-scale nonconvex optimization problem with general linear constraints. Based on the ideas of splitting and sequential quadratic optimization (SQO), a new feasible descent method for the discussed problem is proposed. First, we consider the quadratic optimization (QO) approximation problem associated with the current feasible iteration point, and we split the QO into two small-scale QOs which can be solved in parallel. Second, a feasible descent direction for the problem is obtained and a new SQO-type method is proposed, namely, the splitting feasible SQO (SF-SQO) method. Moreover, under suitable conditions, we analyse the global convergence, strong convergence and rate of superlinear convergence of the SF-SQO method. Finally, preliminary numerical experiments regarding the economic dispatch of a power system are carried out, and these show that the SF-SQO method is promising.
Keywords: Large-scale optimization, Two-block smooth optimization, Splitting method, Feasible sequential quadratic optimization method, Superlinear convergence
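To make the splitting idea concrete, the sketch below minimizes a strictly convex quadratic model by splitting its variables into two blocks and solving the two small block subproblems independently, block-Jacobi style, so the solves could run in parallel. The matrices, the weak-coupling construction, and the plain fixed-point iteration are assumptions for illustration; none of SF-SQO's feasibility or superlinear-convergence machinery is reproduced.

```python
import numpy as np

def split_qp_step(H, g, x, n1):
    """One block-Jacobi splitting step for min 0.5*x^T H x + g^T x.
    Each block's small QP is solved with the other block frozen at its
    current value, so the two solves are independent (parallelizable)."""
    H11, H12 = H[:n1, :n1], H[:n1, n1:]
    H21, H22 = H[n1:, :n1], H[n1:, n1:]
    x1, x2 = x[:n1], x[n1:]
    # Block 1 subproblem: min_y 0.5*y^T H11 y + (g1 + H12 x2)^T y
    y1 = np.linalg.solve(H11, -(g[:n1] + H12 @ x2))
    # Block 2 subproblem, using the old x1 (independent of block 1's update)
    y2 = np.linalg.solve(H22, -(g[n1:] + H21 @ x1))
    return np.concatenate([y1, y2])

rng = np.random.default_rng(1)
B1, B2 = rng.standard_normal((4, 4)), rng.standard_normal((4, 4))
C = 0.1 * rng.standard_normal((4, 4))              # weak cross-block coupling
H = np.block([[B1 @ B1.T + 4 * np.eye(4), C],
              [C.T, B2 @ B2.T + 4 * np.eye(4)]])   # symmetric positive definite
g = rng.standard_normal(8)

x = np.zeros(8)
for _ in range(30):
    x = split_qp_step(H, g, x, n1=4)
print("gradient norm at splitting solution:", np.linalg.norm(H @ x + g))
```

With the coupling kept weak relative to the block diagonals, the iteration converges to the coupled minimizer; the final gradient norm should be near machine precision.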
3. Simplified Group Search Optimizer Algorithm for Large Scale Global Optimization (cited by 1)
Authors: Wenfen Zhang (张雯雰). Journal of Shanghai Jiaotong University (Science) (EI), 2015, Issue 1, pp. 38-43.
A simplified group search optimizer algorithm, denoted "SGSO", for large-scale global optimization is presented in this paper; the aim is a simple algorithm with superior performance on high-dimensional problems. The SGSO adopts an improved sharing strategy which shares information of not only the best member but also the other good members, and it uses a simpler search method instead of searching by the head angle. Furthermore, the SGSO increases the percentage of scroungers to accelerate convergence speed. Compared with the genetic algorithm (GA), particle swarm optimizer (PSO) and group search optimizer (GSO), SGSO is tested on seven benchmark functions with dimensions 30, 100, 500 and 1000. It can be concluded that the SGSO has a remarkably superior performance to GA, PSO and GSO for large-scale global optimization.
Keywords: Evolutionary algorithms, Swarm intelligence, Group search optimizer (GSO), Large-scale global optimization, Function optimization
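A minimal sketch of the three mechanisms the abstract names: scroungers move toward one of several shared good members rather than only the single best, rangers do a head-angle-free random walk, and the scrounger fraction is kept high. The update formulas and parameter values below are guesses for illustration, not the paper's.

```python
import numpy as np

def sgso_style(f, dim=100, pop=48, iters=300, scrounger_frac=0.8, seed=0,
               lower=-30.0, upper=30.0):
    """Illustrative simplified group-search loop; not the published SGSO."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lower, upper, (pop, dim))
    fit = np.array([f(x) for x in X])
    n_scroungers = int(scrounger_frac * pop)
    n_good = max(3, pop // 10)                 # pool of shared "good members"
    for _ in range(iters):
        good = X[np.argsort(fit)[:n_good]]     # snapshot of the best few members
        for rank, i in enumerate(np.argsort(fit)):
            if rank == 0:
                continue                       # producer (best member) stays put
            if rank <= n_scroungers:           # scrounger: drift toward a good member
                target = good[rng.integers(n_good)]
                cand = X[i] + rng.random(dim) * (target - X[i])
            else:                              # ranger: plain random walk
                cand = X[i] + 0.1 * (upper - lower) * rng.standard_normal(dim)
            cand = np.clip(cand, lower, upper)
            c = f(cand)
            if c < fit[i]:
                X[i], fit[i] = cand, c
    return float(fit.min())

print("SGSO-style best on sphere:", sgso_style(lambda x: float(np.sum(x**2))))
```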
4. Sequential Convex Programming Methods for Solving Large Topology Optimization Problems: Implementation and Computational Results
Authors: Qin Ni, Ch. Zillober, K. Schittkowski. Journal of Computational Mathematics (SCIE, EI, CSCD), 2005, Issue 5, pp. 491-502.
In this paper, we describe a method to solve large-scale structural optimization problems by sequential convex programming (SCP). A predictor-corrector interior point method is applied to solve the strictly convex subproblems. The SCP algorithm and the topology optimization approach are introduced. In particular, different strategies to solve certain linear systems of equations are analyzed. Numerical results are presented to show the efficiency of the proposed method for solving topology optimization problems and to compare different variants.
Keywords: Large-scale optimization, Topology optimization, Sequential convex programming method, Predictor-corrector interior point method, Method of moving asymptotes
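The outer SCP structure can be sketched compactly: repeatedly minimize a separable strictly convex model of the objective inside box move limits. The proximal-linear model below is a deliberately simplified stand-in for MMA-type approximations, and its closed-form subproblem solve replaces the predictor-corrector interior point method the paper actually uses; the toy objective is likewise an assumption.

```python
import numpy as np

def scp_minimize(f, grad, x0, lower, upper, iters=50, move=0.2, damping=1.0):
    """Bare-bones SCP loop: at the iterate xk, minimize the convex model
    f(xk) + g^T (x - xk) + (damping/2) * ||x - xk||^2 within move limits.
    Only the outer sequential-convex structure is kept from the paper."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad(x)
        step = np.clip(-g / damping, -move, move)  # closed-form model minimizer
        x_new = np.clip(x + step, lower, upper)
        if f(x_new) < f(x):
            x = x_new
        else:
            damping *= 2.0                         # model too loose; shorten steps
    return x

# Toy usage: a smooth nonconvex objective on a box
f = lambda x: float(np.sum(x**4 - 8 * x**2 + 3 * x))
grad = lambda x: 4 * x**3 - 16 * x + 3
x_star = scp_minimize(f, grad, x0=np.zeros(5), lower=-3, upper=3)
print("SCP sketch solution:", np.round(x_star, 3), "f =", f(x_star))
```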
5. Data Preordering in Generalized PAV Algorithm for Monotonic Regression
Authors: Oleg Burdakov, Anders Grimvall, Oleg Sysoev. Journal of Computational Mathematics (SCIE, CSCD), 2006, Issue 6, pp. 771-790.
Monotonic regression (MR) is a least distance problem with monotonicity constraints induced by a partially ordered data set of observations. In our recent publication [In Ser. Nonconvex Optimization and Its Applications, Springer-Verlag, (2006) 83, pp. 25-33], the Pool-Adjacent-Violators algorithm (PAV) was generalized from completely to partially ordered data sets (posets). The new algorithm, called CPAV, is characterized by very low computational complexity, which is of second order in the number of observations. It treats the observations in a consecutive order, and it can follow any arbitrarily chosen topological order of the poset of observations. The CPAV algorithm produces a sufficiently accurate solution to the MR problem, but the accuracy depends on the chosen topological order. Here we prove that there exists a topological order for which the resulting CPAV solution is optimal. Furthermore, we present results of extensive numerical experiments, from which we draw conclusions about the most and the least preferable topological orders.
Keywords: Quadratic programming, Large-scale optimization, Least distance problem, Monotonic regression, Partially ordered data set, Pool-adjacent-violators algorithm
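For reference, here is the classic Pool-Adjacent-Violators algorithm on a totally ordered sequence, the special case that CPAV generalizes to posets; the topological-ordering machinery that is the paper's actual subject is not shown.

```python
import numpy as np

def pav(y, w=None):
    """Weighted least-squares non-decreasing fit of a totally ordered
    sequence via Pool-Adjacent-Violators. Adjacent blocks whose means
    violate monotonicity are merged into a weighted-average block."""
    y = np.asarray(y, dtype=float)
    w = np.ones_like(y) if w is None else np.asarray(w, dtype=float)
    blocks = []  # each block: [weighted mean, total weight, observation count]
    for yi, wi in zip(y, w):
        blocks.append([yi, wi, 1])
        # Pool while the last two blocks violate monotonicity
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, n2 = blocks.pop()
            m1, w1, n1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(w1 * m1 + w2 * m2) / wt, wt, n1 + n2])
    return np.concatenate([np.full(n, m) for m, _, n in blocks])

print(pav([1.0, 3.0, 2.0, 4.0, 3.5]))  # -> [1.   2.5  2.5  3.75 3.75]
```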