Journal Articles
5,041 articles found
Decomposition for Large-Scale Optimization Problems:An Overview
1
Authors: Thai Doan CHUONG, Chen LIU, Xinghuo YU 《Artificial Intelligence Science and Engineering》 2025, No. 3, pp. 157-174 (18 pages)
Formalizing complex processes and phenomena of a real-world problem may require a large number of variables and constraints, resulting in what is termed a large-scale optimization problem. Nowadays, such large-scale optimization problems are solved using computing machines, which can require enormous computational time and may delay deriving timely solutions. Decomposition methods, which partition a large-scale optimization problem into lower-dimensional subproblems, represent a key approach to addressing these time-efficiency issues. There has been significant progress on this front in both applied mathematics and emerging artificial intelligence approaches. This work aims at providing an overview of decomposition methods from both the mathematics and computer science points of view. We also remark on state-of-the-art developments and recent applications of decomposition methods, and discuss future research and development perspectives.
Keywords: decomposition methods; nonlinear optimization; large-scale problems; computational intelligence
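To make the decomposition idea concrete, the sketch below shows the simplest such scheme, block-coordinate minimization: the variable vector is partitioned into low-dimensional blocks and each block is optimized in turn while the others are held fixed. The block partition, the inner solver, and the toy objective are illustrative assumptions, not a formulation taken from the survey.

```python
import numpy as np
from scipy.optimize import minimize

def block_coordinate_minimize(f, x0, block_size=10, sweeps=5):
    """Cyclically minimize f over low-dimensional blocks of variables."""
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    blocks = [np.arange(i, min(i + block_size, n)) for i in range(0, n, block_size)]
    for _ in range(sweeps):
        for idx in blocks:
            def sub(z, idx=idx):          # subproblem: only this block varies
                y = x.copy()
                y[idx] = z
                return f(y)
            res = minimize(sub, x[idx], method="L-BFGS-B")
            x[idx] = res.x
    return x

# Toy large-scale objective (separable plus weak coupling), 200 variables.
f = lambda x: np.sum((x - 1.0) ** 2) + 0.01 * np.sum(x) ** 2
x_star = block_coordinate_minimize(f, np.zeros(200))
print(round(f(x_star), 4))
```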
Variable Reconstruction for Evolutionary Expensive Large-Scale Multiobjective Optimization and Its Application on Aerodynamic Design
2
Authors: Jianqing Lin, Cheng He, Ye Tian, Linqiang Pan 《IEEE/CAA Journal of Automatica Sinica》 2025, No. 4, pp. 719-733 (15 pages)
Expensive multiobjective optimization problems (EMOPs) are complex optimization problems extracted from real-world applications, where each objective function evaluation (FE) involves expensive computations or physical experiments. Many surrogate-assisted evolutionary algorithms (SAEAs) have been designed to solve EMOPs. Nevertheless, EMOPs with large-scale decision variables remain challenging for existing SAEAs, leading to difficulties in maintaining convergence and diversity. To address this deficiency, we propose a variable reconstruction-based SAEA (VREA) to balance convergence enhancement and diversity maintenance. Generally, a cluster-based variable reconstruction strategy reconstructs the original large-scale decision variables into low-dimensional weight variables. Thus, the population can be rapidly pushed towards the Pareto set (PS) by optimizing low-dimensional weight variables with the assistance of surrogate models. Population diversity is also improved by the cluster-based variable reconstruction strategy. An adaptive search step size strategy is proposed to further balance exploration and exploitation. Experimental comparisons with four state-of-the-art SAEAs are conducted on benchmark EMOPs with up to 1000 decision variables and an aerodynamic design task. Experimental results demonstrate that VREA obtains well-converged and diverse solutions with limited real FEs.
Keywords: aerodynamic design; large-scale optimization; multiobjective evolutionary algorithm; surrogate model; variable reconstruction
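The reconstruction step can be pictured as follows: decision variables are clustered into a few groups, and a single weight per group scales a base solution, so the surrogate-assisted search only has to operate on the short weight vector. The index-based clustering and multiplicative weighting below are illustrative assumptions for the sketch, not VREA's actual operators.

```python
import numpy as np

def reconstruct(weights, x_base, groups, lower, upper):
    """Expand a low-dimensional weight vector into a full decision vector."""
    x = x_base.copy()
    for w, idx in zip(weights, groups):
        x[idx] = np.clip(w * x_base[idx], lower[idx], upper[idx])
    return x

n, k = 1000, 5                               # 1000 variables -> 5 weight variables
groups = np.array_split(np.arange(n), k)     # placeholder clustering by index
lower, upper = np.zeros(n), np.ones(n)
x_base = np.random.uniform(lower, upper)     # a reference solution from the population

w = np.random.uniform(0.0, 2.0, size=k)      # the low-dimensional search space
x_full = reconstruct(w, x_base, groups, lower, upper)
print(x_full.shape)                          # (1000,) -- passed to the expensive evaluation
```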
Optimization design of launch window for large-scale constellation using improved genetic algorithm
3
Authors: LIU Yue, HOU Xiangzhen, CAI Xi, LI Minghu, CHANG Xinya, WANG Miao 《先进小卫星技术(中英文)》 2025, No. 4, pp. 23-32 (10 pages)
This research on optimization methods for constellation launch deployment strategies focuses on the mission interval time constraints at the launch site. First, a dynamic model of the constellation deployment process was established, and the relationship between the deployment window and the phase difference of the orbit insertion point, as well as the cost of phase adjustment after orbit insertion, was derived. Then, the combination of the constellation deployment position sequence, together with the sequence of satellite deployment intervals, was treated as the optimization variables, simplifying a high-dimensional search problem over a wide range of dates to a finite-dimensional integer programming problem. An improved genetic algorithm with local search on deployment dates was introduced to optimize the launch deployment strategy. With the new description of the optimization variables, the total number of elements in the solution space was reduced by N orders of magnitude. Numerical simulation confirms that the proposed optimization method reduces the convergence time from hours to minutes.
Keywords: deployment strategy optimization; launching schedule constraints; improved genetic algorithm; large-scale constellation
Exploring Optimization Strategies for Island Power Grid Line Layout Oriented Towards Large-Scale Distributed Renewable Energy Integration
4
Authors: Zhenhuan Song, Wenxin Liu 《Proceedings of Business and Economic Studies》 2025, No. 4, pp. 495-502 (8 pages)
The construction of island power grids is a systematic engineering task. To ensure the safe operation of power grid systems, optimizing the line layout of island power grids is crucial, especially in the current context of large-scale distributed renewable energy integration, where conventional island power grid line layouts can no longer meet actual demands. It is necessary to combine the operational characteristics of island power systems with historical load data to perform load forecasting and thereby generate power grid line layout paths. This article focuses on large-scale distributed renewable energy integration, summarizes optimization strategies for island power grid line layouts, and provides a solid guarantee for the safe and stable operation of island power systems.
Keywords: island power grid; line layout; optimization strategy; distributed renewable energy; large-scale
An Adaptive Cubic Regularisation Algorithm Based on Affine Scaling Methods for Constrained Optimization
5
Authors: PEI Yonggang, WANG Jingyi 《应用数学》 北大核心 2026, No. 1, pp. 258-277 (20 pages)
In this paper, an adaptive cubic regularisation algorithm based on affine scaling methods (ARCBASM) is proposed for solving nonlinear equality constrained programming with nonnegativity constraints on the variables. From the optimality conditions of the problem, we introduce an appropriate affine matrix and construct an affine scaling ARC subproblem with linearized constraints. Composite step methods and reduced Hessian methods are applied to handle the linearized constraints. As a result, a standard unconstrained ARC subproblem is deduced, and its solution supplies a sufficient decrease. The fraction-to-the-boundary rule maintains the strict feasibility (with respect to the nonnegativity constraints) of every iteration point. Reflection techniques are employed to prevent the iterates from approaching zero too early. Under mild assumptions, global convergence of the algorithm is analysed. Preliminary numerical results are reported.
Keywords: constrained optimization; adaptive cubic regularisation; affine scaling; global convergence
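For context, the unconstrained adaptive cubic regularisation (ARC) step that the abstract reduces to is usually obtained by minimizing a cubic-regularised model of the objective; a standard form (quoted as general background, not taken from this paper) is

$$ s_k \in \arg\min_{s} \; m_k(s) = f(x_k) + \nabla f(x_k)^{\top} s + \tfrac{1}{2}\, s^{\top} B_k s + \tfrac{\sigma_k}{3}\, \|s\|^{3}, $$

where $B_k$ approximates the Hessian and the regularisation weight $\sigma_k$ is adapted from the ratio of actual to predicted reduction, playing the role that the trust-region radius plays in trust-region methods.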
A Novel Variable-Fidelity Kriging Surrogate Model Based on Global Optimization for Black-Box Problems
6
Authors: Yi Guan, Pengpeng Zhi, Zhonglai Wang 《Computer Modeling in Engineering & Sciences》 2025, No. 9, pp. 3343-3368 (26 pages)
Variable-fidelity (VF) surrogate models have received increasing attention in engineering design optimization as they can approximate expensive high-fidelity (HF) simulations at reduced computational cost. A key challenge in building a VF model is devising an adaptive model updating strategy that jointly selects additional low-fidelity (LF) and/or HF samples; the additional samples must enhance the model accuracy while maximizing computational efficiency. We propose ISMA-VFEEI, a global optimization framework that integrates an Improved Slime-Mould Algorithm (ISMA) and a Variable-Fidelity Expected Extension Improvement (VFEEI) learning function to construct a VF surrogate model efficiently. First, a cost-aware VFEEI function guides the adaptive LF/HF sampling by explicitly incorporating evaluation cost and proximity to existing samples. Second, ISMA is employed to solve the resulting non-convex optimization problem and identify globally optimal infill points for model enhancement. The efficacy of ISMA-VFEEI is demonstrated through six numerical benchmarks and one real-world engineering case study. In the engineering case study of a high-speed railway Electric Multiple Unit (EMU), the optimization objective of a sanding device attained a minimum value of 1.546 using only 20 HF evaluations, outperforming all the compared methods.
Keywords: global optimization; Kriging; variable-fidelity model; slime mould algorithm; expected improvement
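As background for the VFEEI learning function, the classical single-fidelity expected improvement criterion used with Kriging surrogates is, under the usual Gaussian predictor assumption,

$$ \mathrm{EI}(x) = \bigl(f_{\min} - \hat{\mu}(x)\bigr)\,\Phi(z) + \hat{\sigma}(x)\,\phi(z), \qquad z = \frac{f_{\min} - \hat{\mu}(x)}{\hat{\sigma}(x)}, $$

where $\hat{\mu}$ and $\hat{\sigma}$ are the Kriging mean and standard deviation, $f_{\min}$ is the best observed value, and $\Phi$, $\phi$ are the standard normal CDF and PDF. VFEEI, as described in the abstract, extends this idea with fidelity-dependent evaluation costs and sample-proximity terms.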
A Comprehensive Review of Sizing and Allocation of Distributed Power Generation:Optimization Techniques,Global Insights,and Smart Grid Implications
7
Authors: Abdullrahman A. Al-Shamma'a, Hassan M. Hussein Farh, Ridwan Taiwo, Al-Wesabi Ibrahim, Abdulrhman Alshaabani, Saad Mekhilef, Mohamed A. Mohamed 《Computer Modeling in Engineering & Sciences》 2025, No. 11, pp. 1303-1347 (45 pages)
Optimal sizing and allocation of distributed generators (DGs) have become essential computational challenges in improving the performance, efficiency, and reliability of electrical distribution networks. Despite extensive research, existing approaches often face algorithmic limitations such as slow convergence, premature stagnation in local minima, or suboptimal accuracy in determining optimal DG placement and capacity. This study presents a comprehensive scientometric and systematic review of global research focused on computer-based modelling and algorithmic optimization for renewable DG sizing and placement. It integrates both quantitative and qualitative analyses of the scholarly landscape, mapping influential research domains, co-authorship structures, citation networks, keyword clusters, and international collaboration patterns. Moreover, the study classifies and evaluates the most prominent objective functions, key computational models and optimization algorithms, DG technologies, and strategic approaches employed in the field. The findings reveal that advanced algorithmic frameworks substantially enhance network stability, minimize real power losses, and improve voltage profiles under various operational constraints. This review serves as a foundational resource for researchers and practitioners, highlighting emerging algorithmic trends, modelling innovations, and data-driven methodologies that can guide the future development of intelligent, optimization-based DG integration strategies in smart distribution systems.
Keywords: systematic and scientometric; global trends; distributed generation; sizing and allocation; multiobjectives; modelling and algorithmic optimization
Integrating Conjugate Gradients Into Evolutionary Algorithms for Large-Scale Continuous Multi-Objective Optimization (Cited by 7)
8
Authors: Ye Tian, Haowen Chen, Haiping Ma, Xingyi Zhang, Kay Chen Tan, Yaochu Jin 《IEEE/CAA Journal of Automatica Sinica》 SCIE EI CSCD 2022, No. 10, pp. 1801-1817 (17 pages)
Large-scale multi-objective optimization problems (LSMOPs) pose challenges to existing optimizers since a set of well-converged and diverse solutions must be found in huge search spaces. While evolutionary algorithms are good at solving small-scale multi-objective optimization problems, they are criticized for low efficiency in converging to the optima of LSMOPs. By contrast, mathematical programming methods offer fast convergence on large-scale single-objective optimization problems, but they have difficulty finding diverse solutions for LSMOPs. Currently, how to integrate evolutionary algorithms with mathematical programming methods to solve LSMOPs remains unexplored. In this paper, a hybrid algorithm is tailored for LSMOPs by coupling differential evolution and a conjugate gradient method. On the one hand, conjugate gradients and differential evolution are used to update different decision variables of a set of solutions, where the former drives the solutions to quickly converge towards the Pareto front and the latter promotes the diversity of the solutions to cover the whole Pareto front. On the other hand, the objective decomposition strategy of evolutionary multi-objective optimization is used to differentiate the conjugate gradients of solutions, and the line search strategy of mathematical programming is used to ensure that each offspring is of higher quality than its parent. In comparison with state-of-the-art evolutionary algorithms, mathematical programming methods, and hybrid algorithms, the proposed algorithm exhibits better convergence and diversity performance on a variety of benchmark and real-world LSMOPs.
Keywords: conjugate gradient; differential evolution; evolutionary computation; large-scale multi-objective optimization; mathematical programming
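The coupling described above can be illustrated with a deliberately simplified, single-objective sketch: a gradient-based step (standing in for the conjugate-gradient update) moves one subset of variables of each solution towards an optimum, while a differential-evolution mutation perturbs the remaining variables to preserve diversity. The variable split, the scalar objective, and the fixed step sizes are assumptions made for brevity; the paper's actual algorithm works on decomposed multi-objective subproblems with a line search.

```python
import numpy as np

def hybrid_step(pop, f, grad, cg_idx, de_idx, F=0.5, lr=1e-2):
    """One generation: gradient step on cg_idx, DE/rand/1 mutation on de_idx."""
    n_pop = len(pop)
    new_pop = pop.copy()
    for i in range(n_pop):
        x = pop[i].copy()
        # Gradient-based convergence move on the selected variables.
        g = grad(x)
        x[cg_idx] -= lr * g[cg_idx]
        # Differential-evolution diversity move on the remaining variables.
        r1, r2, r3 = np.random.choice(n_pop, 3, replace=False)
        x[de_idx] = pop[r1][de_idx] + F * (pop[r2][de_idx] - pop[r3][de_idx])
        if f(x) <= f(pop[i]):                      # greedy replacement
            new_pop[i] = x
    return new_pop

# Toy usage on a 100-dimensional sphere function.
f = lambda x: float(np.sum(x ** 2))
grad = lambda x: 2.0 * x
pop = np.random.uniform(-5, 5, size=(20, 100))
cg_idx, de_idx = np.arange(0, 50), np.arange(50, 100)
for _ in range(100):
    pop = hybrid_step(pop, f, grad, cg_idx, de_idx)
print(round(min(f(x) for x in pop), 4))
```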
Evolutionary Computation for Large-scale Multi-objective Optimization: A Decade of Progresses (Cited by 6)
9
Authors: Wen-Jing Hong, Peng Yang, Ke Tang 《International Journal of Automation and Computing》 EI CSCD 2021, No. 2, pp. 155-169 (15 pages)
Large-scale multi-objective optimization problems (MOPs) that involve a large number of decision variables have emerged from many real-world applications. While evolutionary algorithms (EAs) have been widely acknowledged as a mainstream method for MOPs, most research progress and successful applications of EAs have been restricted to MOPs with small-scale decision variables. More recently, it has been reported that traditional multi-objective EAs (MOEAs) suffer severe deterioration as the number of decision variables increases. As a result, and motivated by the emergence of real-world large-scale MOPs, investigation of MOEAs in this respect has attracted much more attention in the past decade. This paper reviews the progress of evolutionary computation for large-scale multi-objective optimization from two angles. From the key difficulties of large-scale MOPs, a scalability analysis is discussed, focusing on the performance of existing MOEAs and the challenges induced by the increase in the number of decision variables. From the perspective of methodology, large-scale MOEAs are categorized into three classes and introduced respectively: divide-and-conquer-based, dimensionality-reduction-based, and enhanced-search-based approaches. Several future research directions are also discussed.
Keywords: large-scale multi-objective optimization; high-dimensional search space; evolutionary computation; evolutionary algorithms; scalability
Modified Augmented Lagrange Multiplier Methods for Large-Scale Chemical Process Optimization (Cited by 6)
10
Authors: LIANG Ximing 《Chinese Journal of Chemical Engineering》 SCIE EI CAS CSCD 2001, No. 2, pp. 167-172 (6 pages)
Chemical process optimization can be described as large-scale nonlinear constrained minimization. The modified augmented Lagrange multiplier methods (MALMM) for large-scale nonlinear constrained minimization are studied in this paper. The Lagrange function contains penalty terms on the equality and inequality constraints, and the methods can be applied to solve a series of bound constrained sub-problems instead of a series of unconstrained sub-problems. The steps of the methods are examined in full detail. Numerical experiments are carried out for a variety of problems, from small to very large scale, which show the stability and effectiveness of the methods on large-scale problems.
Keywords: modified augmented Lagrange multiplier methods; chemical engineering optimization; large-scale nonlinear constrained minimization; numerical experiment
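For orientation, a common textbook form of the augmented Lagrangian with penalty terms on both the equality constraints h(x) = 0 and the inequality constraints g(x) ≤ 0 (not necessarily the exact MALMM function of this paper) is

$$ L_\rho(x,\lambda,\mu) = f(x) + \lambda^{\top} h(x) + \frac{\rho}{2}\,\|h(x)\|^{2} + \frac{1}{2\rho}\sum_i \Bigl( \max\{0,\ \mu_i + \rho\, g_i(x)\}^{2} - \mu_i^{2} \Bigr), $$

which is minimized over x (subject to the simple bounds) for fixed multipliers, after which the multipliers are updated, e.g. $\lambda \leftarrow \lambda + \rho\, h(x)$ and $\mu_i \leftarrow \max\{0,\ \mu_i + \rho\, g_i(x)\}$.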
A Two-Layer Encoding Learning Swarm Optimizer Based on Frequent Itemsets for Sparse Large-Scale Multi-Objective Optimization (Cited by 2)
11
Authors: Sheng Qi, Rui Wang, Tao Zhang, Xu Yang, Ruiqing Sun, Ling Wang 《IEEE/CAA Journal of Automatica Sinica》 SCIE EI CSCD 2024, No. 6, pp. 1342-1357 (16 pages)
Traditional large-scale multi-objective optimization algorithms (LSMOEAs) encounter difficulties when dealing with sparse large-scale multi-objective optimization problems (SLMOPs) where most decision variables are zero. As a result, many algorithms use a two-layer encoding approach to optimize the binary variable Mask and the real variable Dec separately. Nevertheless, existing optimizers often focus on locating non-zero variable positions when optimizing the binary Mask. However, approximating the sparse distribution of real Pareto optimal solutions does not necessarily mean that the objective function is optimized. In data mining, it is common to mine frequent itemsets appearing together in a dataset to reveal the correlation between data. Inspired by this, we propose a novel two-layer encoding learning swarm optimizer based on frequent itemsets (TELSO) to address these SLMOPs. TELSO mines the frequent items of multiple particles with better objective values to find Mask combinations that yield better objective values for fast convergence. Experimental results on five real-world problems and eight benchmark sets demonstrate that TELSO outperforms existing state-of-the-art sparse large-scale multi-objective evolutionary algorithms (SLMOEAs) in terms of performance and convergence speed.
Keywords: evolutionary algorithms; learning swarm optimization; sparse large-scale optimization; sparse large-scale multi-objective problems; two-layer encoding
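A minimal sketch of the two-layer encoding and of frequency-based mask mining is given below: each candidate is a binary Mask applied to a real Dec vector, and the nonzero positions that occur frequently among the better candidates seed new masks. The scalar toy objective, elite selection rule, and support threshold are illustrative assumptions; TELSO's actual mining and multi-objective selection are more elaborate.

```python
import numpy as np

def decode(mask, dec):
    """Two-layer encoding: real variables are active only where the mask is 1."""
    return mask * dec

def frequent_positions(masks, scores, top_k=10, min_support=0.2):
    """Mine decision-variable positions that are frequently nonzero among elites."""
    elite = masks[np.argsort(scores)[:top_k]]          # smaller score = better
    support = elite.mean(axis=0)                       # per-position frequency
    return np.where(support >= min_support)[0]

rng = np.random.default_rng(0)
n, pop = 1000, 50
masks = (rng.random((pop, n)) < 0.05).astype(int)      # sparse masks
decs = rng.uniform(-1.0, 1.0, size=(pop, n))
f = lambda x: float(np.sum((x - 0.5) ** 2))            # toy objective
scores = np.array([f(decode(m, d)) for m, d in zip(masks, decs)])

seed = frequent_positions(masks, scores)
new_mask = np.zeros(n, dtype=int)
new_mask[seed] = 1                                     # seed a new sparse candidate
print(len(seed))
```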
Applying Analytical Derivative and Sparse Matrix Techniques to Large-Scale Process Optimization Problems (Cited by 2)
12
Authors: ZHONG Weitao, SHAO Zhijiang, ZHANG Yuyue, QIAN Jixin 《Chinese Journal of Chemical Engineering》 SCIE EI CAS CSCD 2000, No. 3, pp. 212-217 (6 pages)
The performance of analytical derivative and sparse matrix techniques applied to a traditional dense sequential quadratic programming (SQP) method is studied, and the strategy utilizing those techniques is also presented. Computational results on two typical chemical optimization problems demonstrate a significant enhancement in efficiency, which shows that this strategy is promising and suitable for large-scale process optimization problems.
Keywords: large-scale optimization; open-equation; sequential quadratic programming; analytical derivative; sparse matrix technique
Enhanced Butterfly Optimization Algorithm for Large-Scale Optimization Problems (Cited by 1)
13
Authors: Yu Li, Xiaomei Yu, Jingsen Liu 《Journal of Bionic Engineering》 SCIE EI CSCD 2022, No. 2, pp. 554-570 (17 pages)
To solve large-scale optimization problems, a Fragrance coefficient and variant Particle Swarm local search Butterfly Optimization Algorithm (FPSBOA) is proposed. In the position update stage of the Butterfly Optimization Algorithm (BOA), the fragrance coefficient is designed to balance the exploration and exploitation of BOA. The variant particle swarm local search strategy is proposed to improve the local search ability of the current optimal butterfly and prevent the algorithm from falling into local optima. Nineteen 2000-dimensional functions and twenty 1000-dimensional CEC 2010 large-scale functions are used to verify FPSBOA on complex large-scale optimization problems. The experimental results are statistically analyzed by the Friedman test and the Wilcoxon rank-sum test. All attained results demonstrate that FPSBOA can better solve challenging scientific and industrial real-world problems with thousands of variables. Finally, four mechanical engineering problems and one ten-dimensional process synthesis and design problem are applied to FPSBOA, which shows that FPSBOA is feasible and effective in real-world application problems.
Keywords: butterfly optimization algorithm; fragrance coefficient; variant particle swarm local search; large-scale optimization problems; real-world application problems
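For reference, the standard BOA position update that the fragrance coefficient modifies looks roughly like the sketch below: a fragrance value derived from fitness scales either a move towards the global best (global search) or a move based on two random neighbours (local search). The constants, switch probability, and intensity transform follow common BOA descriptions and are assumptions here, not FPSBOA's tuned settings.

```python
import numpy as np

def boa_step(pop, fitness, best, c=0.01, a=0.1, p_switch=0.8, rng=None):
    """One standard BOA position update (minimization); fragrance f = c * I^a."""
    rng = rng or np.random.default_rng()
    n_pop, dim = pop.shape
    # Stimulus intensity: larger for better (lower) fitness values (assumed transform).
    intensity = 1.0 / (1.0 + fitness - fitness.min())
    fragrance = c * intensity ** a
    new_pop = pop.copy()
    for i in range(n_pop):
        r = rng.random()
        if rng.random() < p_switch:                    # global search towards the best butterfly
            new_pop[i] = pop[i] + (r ** 2 * best - pop[i]) * fragrance[i]
        else:                                          # local search among random neighbours
            j, k = rng.choice(n_pop, 2, replace=False)
            new_pop[i] = pop[i] + (r ** 2 * pop[j] - pop[k]) * fragrance[i]
    return new_pop

# Toy usage on a 50-dimensional sphere function.
f = lambda x: float(np.sum(x ** 2))
rng = np.random.default_rng(1)
pop = rng.uniform(-10, 10, size=(30, 50))
for _ in range(200):
    fit = np.array([f(x) for x in pop])
    best = pop[np.argmin(fit)]
    pop = boa_step(pop, fit, best, rng=rng)
print(round(min(f(x) for x in pop), 4))
```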
A Sparse Subspace Truncated Newton Method for Large-Scale Bound Constrained Nonlinear Optimization
14
Authors: NI Qin 《Numerical Mathematics: A Journal of Chinese Universities (English Series)》 SCIE 1997, No. 1, pp. 27-37 (11 pages)
In this paper we report a sparse truncated Newton algorithm for handling large-scale simple bound constrained nonlinear minimization problems. The truncated Newton method is used to update the variables with indices outside of the active set, while the projected gradient method is used to update the active variables. At each iteration, the search direction consists of three parts, one of which is a subspace truncated Newton direction; the other two are subspace gradient and modified gradient directions. The subspace truncated Newton direction is obtained by solving a sparse system of linear equations. The global convergence and quadratic convergence rate of the algorithm are proved, and some numerical tests are given.
Keywords: truncated Newton method; large-scale sparse problems; bound constrained nonlinear optimization
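The subspace idea can be sketched as follows: variables sitting at a bound with the gradient pushing outwards form the active set and are moved by a projected gradient step, while the free variables receive an inexact Newton direction computed by a few conjugate-gradient iterations on the reduced system. The quadratic test problem, the activity test, and the fixed CG iteration count are simplifying assumptions, not the paper's exact rules.

```python
import numpy as np

def cg_solve(H, b, iters=25):
    """A few CG iterations on H s = b (this is the 'truncated' part)."""
    s = np.zeros_like(b); r = b.copy(); p = r.copy()
    for _ in range(iters):
        if np.linalg.norm(r) < 1e-12:
            break
        Hp = H @ p
        alpha = (r @ r) / (p @ Hp)
        s += alpha * p
        r_new = r - alpha * Hp
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return s

def bound_newton_step(x, g, H, lo, hi, eps=1e-8, lr=0.1):
    """Split into active/free sets: projected gradient on active, truncated Newton on free."""
    active = ((x <= lo + eps) & (g > 0)) | ((x >= hi - eps) & (g < 0))
    free = ~active
    x_new = x.copy()
    x_new[active] = np.clip(x[active] - lr * g[active], lo[active], hi[active])
    if free.any():
        s = cg_solve(H[np.ix_(free, free)], -g[free])   # reduced (sparse) Newton system
        x_new[free] = np.clip(x[free] + s, lo[free], hi[free])
    return x_new

# Toy bound-constrained quadratic: min 0.5 x'Ax - b'x subject to 0 <= x <= 1.
rng = np.random.default_rng(2)
n = 200
A = np.diag(rng.uniform(1, 10, n)); b = rng.uniform(-5, 5, n)
lo, hi = np.zeros(n), np.ones(n)
x = np.full(n, 0.5)
for _ in range(30):
    g = A @ x - b
    x = bound_newton_step(x, g, A, lo, hi)
print(round(float(0.5 * x @ A @ x - b @ x), 4))
```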
Enhancing Evolutionary Algorithms With Pattern Mining for Sparse Large-Scale Multi-Objective Optimization Problems
15
Authors: Sheng Qi, Rui Wang, Tao Zhang, Weixiong Huang, Fan Yu, Ling Wang 《IEEE/CAA Journal of Automatica Sinica》 SCIE EI CSCD 2024, No. 8, pp. 1786-1801 (16 pages)
Sparse large-scale multi-objective optimization problems (SLMOPs) are common in science and engineering. However, the large-scale aspect implies a high-dimensional decision space, requiring algorithms to traverse a vast expanse with limited computational resources. Furthermore, in the sparse setting, most variables in Pareto optimal solutions are zero, making it difficult for algorithms to identify non-zero variables efficiently. This paper is dedicated to addressing the challenges posed by SLMOPs. To start, we introduce innovative objective functions customized to mine maximum and minimum candidate sets. This substantial enhancement dramatically improves the efficacy of frequent pattern mining. In this way, selecting candidate sets is no longer based on the quantity of nonzero variables they contain but on a higher proportion of nonzero variables within specific dimensions. Additionally, we unveil a novel approach to association rule mining, which delves into the intricate relationships between non-zero variables. This methodology aids in identifying sparse distributions that can potentially expedite reductions in the objective function value. We extensively tested our algorithm across eight benchmark problems and four real-world SLMOPs. The results demonstrate that our approach achieves competitive solutions across various challenges.
Keywords: evolutionary algorithms; pattern mining; sparse large-scale multi-objective problems (SLMOPs); sparse large-scale optimization
A Modified Three-Term Conjugate Gradient Algorithm for Large-Scale Nonsmooth Convex Optimization
16
Authors: Wujie Hu, Gonglin Yuan, Hongtruong Pham 《Computers, Materials & Continua》 SCIE EI 2020, No. 2, pp. 787-800 (14 pages)
It is well known that Newton and quasi-Newton algorithms are effective for small and medium scale smooth problems because they make full use of the gradient information of the objective function, but they fail to solve nonsmooth problems. The bundle algorithm, which stems from the concept of a "bundle", successfully addresses both smooth and nonsmooth complex problems, but regrettably it is only effective for small and medium optimization models since it needs to store and update the relevant information of the parameter bundle. The conjugate gradient algorithm is effective for both large-scale smooth and nonsmooth optimization models owing to its simplicity: it utilizes only the objective function's information, combined with the technique of Moreau-Yosida regularization. Thus, a modified three-term conjugate gradient algorithm is proposed; it has a sufficient descent property and a trust region character. At the same time, it possesses global convergence under mild assumptions, and numerical tests show that it is more efficient than similar optimization algorithms.
Keywords: conjugate gradient; large-scale; trust region; global convergence
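The Moreau-Yosida regularization mentioned in the abstract replaces a (possibly nonsmooth) convex objective f with a smooth surrogate; in its standard form (quoted here for context, with λ > 0 a regularization parameter),

$$ F_{\lambda}(x) = \min_{y}\Bigl\{\, f(y) + \frac{1}{2\lambda}\,\|y - x\|^{2} \Bigr\}, $$

which is continuously differentiable with gradient $\nabla F_{\lambda}(x) = \bigl(x - p_{\lambda}(x)\bigr)/\lambda$, where $p_{\lambda}(x)$ is the minimizer (the proximal point); conjugate-gradient-type methods can then be applied to $F_{\lambda}$.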
Large-Scale Multi-Objective Optimization Algorithm Based on Weighted Overlapping Grouping of Decision Variables
17
Authors: Liang Chen, Jingbo Zhang, Linjie Wu, Xingjuan Cai, Yubin Xu 《Computer Modeling in Engineering & Sciences》 SCIE EI 2024, No. 7, pp. 363-383 (21 pages)
The large-scale multi-objective optimization algorithm (LSMOA), based on the grouping of decision variables, is an advanced method for handling high-dimensional decision variables. However, in practical problems the interaction among decision variables is intricate, leading to large group sizes and suboptimal optimization effects; hence, a large-scale multi-objective optimization algorithm based on weighted overlapping grouping of decision variables (MOEAWOD) is proposed in this paper. Initially, the decision variables are perturbed and categorized into convergence and diversity variables; subsequently, the convergence variables are subdivided into groups based on the interactions among different decision variables. If the size of a group surpasses the set threshold, that group undergoes a process of weighting and overlapping grouping. Specifically, the interaction strength is evaluated based on the interaction frequency and the number of objectives among various decision variables. The decision variable with the highest interaction in the group is identified and disregarded, and the remaining variables are then reclassified into subgroups. Finally, the decision variable with the strongest interaction is added to each subgroup. MOEAWOD minimizes the interactivity between different groups and maximizes the interactivity of decision variables within groups, which guides the direction of convergence and diversity exploration for the different groups. MOEAWOD was tested on 18 benchmark large-scale optimization problems, and the experimental results demonstrate the effectiveness of our method. Compared with the other algorithms, our method retains an advantage.
Keywords: decision variable grouping; large-scale multi-objective optimization algorithms; weighted overlapping grouping; direction-guided evolution
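The perturbation-based grouping step can be illustrated with the classical pairwise interaction check used in differential-grouping-style decompositions (shown here as generic background, not MOEAWOD's weighting and overlapping rules): two variables are considered interacting when perturbing one changes the effect of perturbing the other.

```python
import numpy as np

def interacts(f, x, i, j, delta=1.0, eps=1e-6):
    """Differential-grouping style test: do variables i and j interact under f?"""
    ei = np.zeros_like(x); ei[i] = delta
    ej = np.zeros_like(x); ej[j] = delta
    d1 = f(x + ei) - f(x)              # effect of perturbing i alone
    d2 = f(x + ei + ej) - f(x + ej)    # effect of perturbing i after perturbing j
    return abs(d1 - d2) > eps

def group_variables(f, x):
    """Greedy grouping: put each variable into the first group it interacts with."""
    groups = []
    for i in range(x.size):
        for g in groups:
            if any(interacts(f, x, i, j) for j in g):
                g.append(i)
                break
        else:
            groups.append([i])
    return groups

# Toy objective: variables interact only inside blocks of 3.
f = lambda x: float(sum(np.prod(x[k:k + 3] + 1.0) for k in range(0, x.size, 3)))
x0 = np.zeros(12)
print(group_variables(f, x0))   # expected: [[0, 1, 2], [3, 4, 5], [6, 7, 8], [9, 10, 11]]
```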
A Fast and Efficient Global Router for Congestion Optimization (Cited by 2)
18
Authors: XU Jingyu, BAO Haiyun, HONG Xianlong, CAI Yici, JING Tong, GU Jun 《Journal of Semiconductors》 EI CAS CSCD 北大核心 2002, No. 2, pp. 136-142 (7 pages)
An efficient parallel global router using random optimization, independent of net ordering, is proposed. Parallel approaches are described and strategies guaranteeing routing quality are discussed. The wire length model is implemented on a multiprocessor, which keeps the algorithm feasible for large-scale problems. A timing-driven model on a multiprocessor and a wire length model on distributed processors are also presented. The parallel algorithm greatly reduces the run time of routing. The experimental results show good speedups with no degradation of routing quality.
Keywords: global routing; congestion optimization; global routing graph (GRG); parallel algorithm
Chaotic Annealing Neural Network for Global Optimization of Constrained Nonlinear Programming (Cited by 1)
19
Authors: ZHANG Guoping, WANG Zheng'ou, YUAN Guolin 《Transactions of Tianjin University》 EI CAS 2001, No. 3, pp. 141-146 (6 pages)
Chaotic neural networks have a global searching ability, but to date their applications have generally been confined to combinatorial optimization. By introducing a chaotic noise annealing process into a conventional Hopfield network, this paper proposes a new chaotic annealing neural network (CANN) for global optimization of continuous constrained nonlinear programming. It is easy to implement, conceptually simple, and generally applicable. Numerical experiments on severe test functions show that CANN is efficient and reliable in searching for the global optimum, and that it outperforms the existing genetic algorithm GAMAS for the same purpose.
Keywords: global optimization; neural network; chaotic noise annealing
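As a rough illustration of annealed chaotic noise (a generic sketch built only from the abstract's description, not the paper's network equations), one can add logistic-map noise with a decaying amplitude to a gradient flow on a penalized energy function; the noise helps the search escape local minima early on and vanishes as the annealing proceeds.

```python
import numpy as np

def chaotic_annealing_search(energy, grad, x0, steps=5000, lr=1e-3,
                             noise0=1.0, decay=0.999):
    """Gradient flow on a penalized energy plus annealed logistic-map noise."""
    x = np.asarray(x0, dtype=float).copy()
    z = np.linspace(0.21, 0.47, x.size)  # logistic-map states, one per variable
    amp = noise0
    best_x, best_e = x.copy(), energy(x)
    for _ in range(steps):
        z = 4.0 * z * (1.0 - z)          # fully chaotic logistic map
        x = x - lr * grad(x) + amp * (z - 0.5)
        amp *= decay                     # anneal the chaotic noise towards zero
        e = energy(x)
        if e < best_e:
            best_x, best_e = x.copy(), e
    return best_x, best_e

# Toy constrained problem: minimize a multimodal function subject to x1 + x2 = 1,
# handled here by folding a quadratic penalty into the energy (an assumed treatment).
rho = 50.0
f = lambda x: np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x) + 10)   # Rastrigin
h = lambda x: x[0] + x[1] - 1.0
energy = lambda x: f(x) + rho * h(x) ** 2
grad = lambda x: (2 * x + 20 * np.pi * np.sin(2 * np.pi * x)) + 2 * rho * h(x) * np.ones(2)
x_best, e_best = chaotic_annealing_search(energy, grad, np.array([2.0, -2.0]))
print(np.round(x_best, 3), round(float(e_best), 3))
```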
Finding the Minimum Spanning Tree of an Undirected Complete Graph Based on Global Optimization
20
Authors: YAO Kun, LIU Xiyu, LI Feifei 《山东科学》 CAS 2006, No. 2, pp. 48-50, 62 (4 pages)
The idea of global optimization is introduced into the problem of finding the minimum spanning tree of an undirected complete graph, and a global-optimization-based algorithm is proposed. Compared with Kruskal's algorithm and Prim's algorithm, this algorithm avoids checking whether a cycle appears in the spanning tree during the solution process and reduces the time complexity to some extent.
Keywords: global optimization algorithm; undirected complete graph; minimum spanning tree
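For comparison with the baselines named in the abstract, below is a compact Prim's algorithm on a dense adjacency matrix, the natural O(n²) formulation for a complete graph; it is shown only as the reference method the paper compares against, not as the paper's global-optimization algorithm.

```python
import numpy as np

def prim_mst(weights):
    """Prim's algorithm on a complete graph given as a symmetric weight matrix."""
    n = weights.shape[0]
    in_tree = np.zeros(n, dtype=bool)
    best_cost = np.full(n, np.inf)      # cheapest edge connecting each vertex to the tree
    best_from = np.full(n, -1)
    best_cost[0] = 0.0
    edges, total = [], 0.0
    for _ in range(n):
        u = int(np.argmin(np.where(in_tree, np.inf, best_cost)))
        in_tree[u] = True
        if best_from[u] >= 0:
            edges.append((int(best_from[u]), u))
            total += weights[best_from[u], u]
        closer = ~in_tree & (weights[u] < best_cost)
        best_cost[closer] = weights[u][closer]
        best_from[closer] = u
    return edges, total

# Toy complete graph on 6 vertices with random symmetric weights.
rng = np.random.default_rng(3)
W = rng.uniform(1, 10, size=(6, 6))
W = (W + W.T) / 2.0
np.fill_diagonal(W, 0.0)
edges, total = prim_mst(W)
print(edges, round(float(total), 3))
```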