Journal Articles
773 articles found
1. A Two-Layer Encoding Learning Swarm Optimizer Based on Frequent Itemsets for Sparse Large-Scale Multi-Objective Optimization (cited: 3)
Authors: Sheng Qi, Rui Wang, Tao Zhang, Xu Yang, Ruiqing Sun, Ling Wang. IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2024, Issue 6, pp. 1342-1357 (16 pages).
Traditional large-scale multi-objective optimization algorithms (LSMOEAs) encounter difficulties when dealing with sparse large-scale multi-objective optimization problems (SLMOPs), where most decision variables are zero. As a result, many algorithms use a two-layer encoding approach to optimize the binary variable Mask and the real variable Dec separately. Nevertheless, existing optimizers often focus on locating non-zero variable positions to optimize the binary variable Mask. However, approximating the sparse distribution of real Pareto optimal solutions does not necessarily mean that the objective function is optimized. In data mining, it is common to mine frequent itemsets appearing together in a dataset to reveal the correlation between data. Inspired by this, we propose a novel two-layer encoding learning swarm optimizer based on frequent itemsets (TELSO) to address these SLMOPs. TELSO mines the frequent items of multiple particles with better objective values to find mask combinations that obtain better objective values for fast convergence. Experimental results on five real-world problems and eight benchmark sets demonstrate that TELSO outperforms existing state-of-the-art sparse large-scale multi-objective evolutionary algorithms (SLMOEAs) in both performance and convergence speed.
Keywords: evolutionary algorithms; learning swarm optimization; sparse large-scale optimization; sparse large-scale multi-objective problems; two-layer encoding
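The frequent-itemset idea the abstract describes — finding nonzero mask positions that frequently co-occur in particles with good objective values — can be sketched in a few lines. This is an illustrative toy, not the authors' code; the example masks, the support threshold, and the helper name are ours:

```python
from collections import Counter
from itertools import combinations

def frequent_pairs(masks, min_support):
    """Count pairs of nonzero positions that co-occur across binary masks."""
    counts = Counter()
    for mask in masks:
        nonzero = [i for i, v in enumerate(mask) if v]
        for pair in combinations(nonzero, 2):
            counts[pair] += 1
    return {pair for pair, c in counts.items() if c >= min_support}

# Masks of four "elite" particles (better objective values), 5 variables each
elite = [(1, 1, 0, 0, 1), (1, 1, 0, 0, 0), (1, 1, 1, 0, 0), (0, 1, 0, 0, 1)]
print(frequent_pairs(elite, min_support=3))  # → {(0, 1)}
```

Positions 0 and 1 are active together in three of the four elite masks, so an optimizer in this family would bias offspring masks toward that combination.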
2. A Fast Clustering Based Evolutionary Algorithm for Super-Large-Scale Sparse Multi-Objective Optimization (cited: 8)
Authors: Ye Tian, Yuandong Feng, Xingyi Zhang, Changyin Sun. IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2023, Issue 4, pp. 1048-1063 (16 pages).
During the last three decades, evolutionary algorithms (EAs) have shown superiority in solving complex optimization problems, especially those with multiple objectives and non-differentiable landscapes. However, due to their stochastic search strategies, the performance of most EAs deteriorates drastically when handling a large number of decision variables. To tackle the curse of dimensionality, this work proposes an efficient EA for solving super-large-scale multi-objective optimization problems with sparse optimal solutions. The proposed algorithm estimates the sparse distribution of optimal solutions by optimizing a binary vector for each solution, and provides a fast clustering method to greatly reduce the dimensionality of the search space. More importantly, all operations related to the decision variables involve only a few matrix calculations, which can be directly accelerated by GPUs. While existing EAs are capable of handling fewer than 10,000 real variables, the proposed algorithm is verified to be effective in handling 1,000,000 real variables. Furthermore, since the proposed algorithm handles the large number of variables via accelerated matrix calculations, its runtime can be reduced to less than 10% of the runtime of existing EAs.
Keywords: evolutionary computation; fast clustering; sparse multi-objective optimization; super-large-scale optimization
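The binary-vector encoding and matrix-only variable handling the abstract mentions can be illustrated with NumPy. This is our own sketch under assumed dimensions and names, not the paper's implementation: each solution is a sparse binary mask row multiplied elementwise by a real-valued row, so decoding an entire population is a single matrix operation of the kind a GPU accelerates directly.

```python
import numpy as np

rng = np.random.default_rng(0)
n_solutions, n_vars = 8, 100_000

# Binary layer: which variables are active (very few, since solutions are sparse)
masks = (rng.random((n_solutions, n_vars)) < 0.001).astype(float)
# Real-valued layer: candidate values for every variable
dec = rng.standard_normal((n_solutions, n_vars))

# Decoding the whole population is one elementwise matrix product
population = masks * dec
nonzero_ratio = (population != 0).mean()
print(population.shape, nonzero_ratio)
```

Because every per-variable operation is expressed as array arithmetic rather than a Python loop over 100,000 variables, the same code runs unchanged (and much faster) on GPU array libraries with a NumPy-compatible API.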
3. Enhancing Evolutionary Algorithms With Pattern Mining for Sparse Large-Scale Multi-Objective Optimization Problems (cited: 1)
Authors: Sheng Qi, Rui Wang, Tao Zhang, Weixiong Huang, Fan Yu, Ling Wang. IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2024, Issue 8, pp. 1786-1801 (16 pages).
Sparse large-scale multi-objective optimization problems (SLMOPs) are common in science and engineering. However, the large-scale aspect means a high-dimensional decision space, requiring algorithms to traverse a vast expanse with limited computational resources. Furthermore, because of sparsity, most variables in Pareto optimal solutions are zero, making it difficult for algorithms to identify non-zero variables efficiently. This paper is dedicated to addressing the challenges posed by SLMOPs. To start, we introduce innovative objective functions customized to mine maximum and minimum candidate sets. This substantial enhancement dramatically improves the efficacy of frequent pattern mining. In this way, selecting candidate sets is no longer based on the quantity of nonzero variables they contain, but on a higher proportion of nonzero variables within specific dimensions. Additionally, we unveil a novel approach to association rule mining, which delves into the intricate relationships between non-zero variables. This methodology aids in identifying sparse distributions that can potentially expedite reductions in the objective function value. We extensively tested our algorithm across eight benchmark problems and four real-world SLMOPs. The results demonstrate that our approach achieves competitive solutions across various challenges.
Keywords: evolutionary algorithms; pattern mining; sparse large-scale multi-objective problems (SLMOPs); sparse large-scale optimization
4. Applying Analytical Derivative and Sparse Matrix Techniques to Large-Scale Process Optimization Problems (cited: 2)
Authors: 仲卫涛, 邵之江, 张余岳, 钱积新. Chinese Journal of Chemical Engineering (SCIE, EI, CAS, CSCD), 2000, Issue 3, pp. 212-217 (6 pages).
The performance of analytical derivative and sparse matrix techniques applied to a traditional dense sequential quadratic programming (SQP) method is studied, and a strategy utilizing those techniques is presented. Computational results on two typical chemical optimization problems demonstrate significant enhancement in efficiency, which shows that this strategy is promising and suitable for large-scale process optimization problems.
Keywords: large-scale optimization; open-equation; sequential quadratic programming; analytical derivative; sparse matrix technique
5. Federated Multi-Label Feature Selection via Dual-Layer Hybrid Breeding Cooperative Particle Swarm Optimization with Manifold and Sparsity Regularization
Authors: Songsong Zhang, Huazhong Jin, Zhiwei Ye, Jia Yang, Jixin Zhang, Dongfang Wu, Xiao Zheng, Dingfeng Song. Computers, Materials & Continua, 2026, Issue 1, pp. 1141-1159 (19 pages).
Multi-label feature selection (MFS) is a crucial dimensionality reduction technique aimed at identifying informative features associated with multiple labels. However, traditional centralized methods face significant challenges in privacy-sensitive and distributed settings, often neglecting label dependencies and suffering from low computational efficiency. To address these issues, we introduce a novel framework, Fed-MFSDHBCPSO: federated MFS via a dual-layer hybrid breeding cooperative particle swarm optimization algorithm with manifold and sparsity regularization (DHBCPSO-MSR). Leveraging the federated learning paradigm, Fed-MFSDHBCPSO allows clients to perform local feature selection (FS) using DHBCPSO-MSR. Locally selected feature subsets are encrypted with differential privacy (DP) and transmitted to a central server, where they are securely aggregated and refined through secure multi-party computation (SMPC) until global convergence is achieved. Within each client, DHBCPSO-MSR employs a dual-layer FS strategy. The inner layer constructs sample and label similarity graphs, generates Laplacian matrices to capture the manifold structure between samples and labels, and applies L2,1-norm regularization to sparsify the feature subset, yielding an optimized feature weight matrix. The outer layer uses a hybrid breeding cooperative particle swarm optimization algorithm to further refine the feature weight matrix and identify the optimal feature subset. The updated weight matrix is then fed back to the inner layer for further optimization. Comprehensive experiments on multiple real-world multi-label datasets demonstrate that Fed-MFSDHBCPSO consistently outperforms both centralized and federated baseline methods across several key evaluation metrics.
Keywords: multi-label feature selection; federated learning; manifold regularization; sparse constraints; hybrid breeding optimization algorithm; particle swarm optimization algorithm; privacy protection
6. A Sparse Subspace Truncated Newton Method for Large-Scale Bound Constrained Nonlinear Optimization
Authors: 倪勤. Numerical Mathematics: A Journal of Chinese Universities (English Series) (SCIE), 1997, Issue 1, pp. 27-37 (11 pages).
In this paper we report a sparse truncated Newton algorithm for handling large-scale simple-bound nonlinear constrained minimization problems. The truncated Newton method is used to update the variables with indices outside of the active set, while the projected gradient method is used to update the active variables. At each iteration, the search direction consists of three parts: a subspace truncated Newton direction, a subspace gradient direction, and a modified gradient direction. The subspace truncated Newton direction is obtained by solving a sparse system of linear equations. The global convergence and quadratic convergence rate of the algorithm are proved, and some numerical tests are given.
Keywords: truncated Newton method; large-scale sparse problems; bound constrained nonlinear optimization
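The projected-gradient step this abstract assigns to the active (bound-hitting) variables can be illustrated on a toy problem. This sketch is ours, not the paper's truncated Newton code; the objective, bounds, and step size are assumptions chosen so the optimum sits on the bounds:

```python
# Minimize f(x) = (x0 - 2)^2 + (x1 + 1)^2 subject to 0 <= x <= 1.
# The unconstrained minimum (2, -1) lies outside the box, so both
# variables end up "active", pinned to a bound by the projection.
def project(x, lo=0.0, hi=1.0):
    return [min(max(v, lo), hi) for v in x]

def grad(x):
    return [2 * (x[0] - 2), 2 * (x[1] + 1)]

x = [0.5, 0.5]
for _ in range(100):
    g = grad(x)
    # Gradient step followed by projection back onto the box
    x = project([xi - 0.1 * gi for xi, gi in zip(x, g)])
print(x)  # → [1.0, 0.0]
```

A full method in this family would detect which variables the projection has pinned (the active set) and apply the faster Newton-type update only to the remaining free variables.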
7. Decomposition for Large-Scale Optimization Problems: An Overview
Authors: Thai Doan CHUONG, Chen LIU, Xinghuo YU. Artificial Intelligence Science and Engineering, 2025, Issue 3, pp. 157-174 (18 pages).
Formalizing the complex processes and phenomena of a real-world problem may require a large number of variables and constraints, resulting in what is termed a large-scale optimization problem. Nowadays, such large-scale optimization problems are solved using computing machines, which requires enormous computational time and may delay deriving timely solutions. Decomposition methods, which partition a large-scale optimization problem into lower-dimensional subproblems, represent a key approach to addressing time-efficiency issues. There has been significant progress on this front in both applied mathematics and emerging artificial intelligence approaches. This work aims at providing an overview of decomposition methods from both the mathematics and computer science points of view. We also remark on state-of-the-art developments and recent applications of decomposition methods, and discuss future research and development perspectives.
Keywords: decomposition methods; nonlinear optimization; large-scale problems; computational intelligence
8. Variable Reconstruction for Evolutionary Expensive Large-Scale Multiobjective Optimization and Its Application on Aerodynamic Design
Authors: Jianqing Lin, Cheng He, Ye Tian, Linqiang Pan. IEEE/CAA Journal of Automatica Sinica, 2025, Issue 4, pp. 719-733 (15 pages).
Expensive multiobjective optimization problems (EMOPs) are complex optimization problems extracted from real-world applications, where each objective function evaluation (FE) involves expensive computations or physical experiments. Many surrogate-assisted evolutionary algorithms (SAEAs) have been designed to solve EMOPs. Nevertheless, EMOPs with large-scale decision variables remain challenging for existing SAEAs, leading to difficulties in maintaining convergence and diversity. To address this deficiency, we propose a variable reconstruction-based SAEA (VREA) to balance convergence enhancement and diversity maintenance. Generally, a cluster-based variable reconstruction strategy reconstructs the original large-scale decision variables into low-dimensional weight variables. Thus, the population can be rapidly pushed towards the Pareto set (PS) by optimizing low-dimensional weight variables with the assistance of surrogate models. Population diversity is improved due to the cluster-based variable reconstruction strategy. An adaptive search step size strategy is proposed to further balance exploration and exploitation. Experimental comparisons with four state-of-the-art SAEAs are conducted on benchmark EMOPs with up to 1000 decision variables and an aerodynamic design task. Experimental results demonstrate that VREA obtains well-converged and diverse solutions with limited real FEs.
Keywords: aerodynamic design; large-scale optimization; multiobjective evolutionary algorithm; surrogate model; variable reconstruction
9. Optimization design of launch window for large-scale constellation using improved genetic algorithm
Authors: LIU Yue, HOU Xiangzhen, CAI Xi, LI Minghu, CHANG Xinya, WANG Miao. 先进小卫星技术(中英文), 2025, Issue 4, pp. 23-32 (10 pages).
This research on optimization methods for constellation launch deployment strategies focuses on mission interval time constraints at the launch site. First, a dynamic model of the constellation deployment process was established, and the relationship between the deployment window and the phase difference of the orbit insertion point, as well as the cost of phase adjustment after orbit insertion, was derived. Then, the combination of the constellation deployment position sequence, together with the sequence of satellite deployment intervals, was treated as the optimization variables, simplifying a high-dimensional search problem over a wide range of dates to a finite-dimensional integer programming problem. An improved genetic algorithm with local search on deployment dates was introduced to optimize the launch deployment strategy. With the new description of the optimization variables, the total number of elements in the solution space was reduced by N orders of magnitude. Numerical simulation confirms that the proposed optimization method accelerates convergence from hours to minutes.
Keywords: deployment strategy optimization; launching schedule constraints; improved genetic algorithm; large-scale constellation
10. Sparse optimization of planar radio antenna arrays using a genetic algorithm
Authors: Jiarui Di, Liang Dong, Wei He. Astronomical Techniques and Instruments, 2025, Issue 2, pp. 100-110 (11 pages).
Radio antenna arrays have many advantages for astronomical observations, such as high resolution, high sensitivity, multi-target simultaneous observation, and flexible beam formation. Problems surrounding key indices, such as sensitivity enhancement, scanning range extension, and sidelobe level suppression, need to be solved urgently. Here, we propose a sparse optimization scheme based on a genetic algorithm for a 64-element planar radio antenna array. As optimization targets for the iterative process of the genetic algorithm, we use the maximum sidelobe levels and beamwidths of multiple cross-section patterns that pass through the main beam in three dimensions, together with the maximum sidelobe levels of the patterns at several different scanning angles. Element positions are adjusted across iterations to select the optimal array configuration. Following sparse layout optimization, the simulated 64-element planar radio antenna array shows that the maximum sidelobe level decreases by 1.79 dB and the beamwidth narrows by 3°. Within the scan range of ±30°, after sparse array optimization, all sidelobe levels decrease and all beamwidths narrow. This performance improvement can potentially enhance the sensitivity and spatial resolution of radio telescope systems.
Keywords: planar antenna array; sparse optimization; genetic algorithm; wide-angle scanning
11. Exploring Optimization Strategies for Island Power Grid Line Layout Oriented Towards Large-Scale Distributed Renewable Energy Integration
Authors: Zhenhuan Song, Wenxin Liu. Proceedings of Business and Economic Studies, 2025, Issue 4, pp. 495-502 (8 pages).
The construction of island power grids is a systematic engineering task. To ensure the safe operation of power grid systems, optimizing the line layout of island power grids is crucial. Especially in the current context of large-scale distributed renewable energy integration into the power grid, conventional island power grid line layouts can no longer meet actual demands. It is necessary to combine the operational characteristics of island power systems with historical load data to perform load forecasting, thereby generating power grid line layout paths. This article focuses on large-scale distributed renewable energy integration, summarizing optimization strategies for island power grid line layouts and providing a solid guarantee for the safe and stable operation of island power systems.
Keywords: island power grid; line layout; optimization strategy; distributed renewable energy; large-scale
12. Reinforcement Learning Assisted Autonomous Selection of Sparsity-Aware Genetic Operators for Sparse Large-Scale Multi-Objective Optimization
Authors: Panpan Zhang, Lintong Wang, Jing Rong, Shuai Shao, Xingyi Zhang, Ye Tian. Tsinghua Science and Technology, 2026, Issue 1, pp. 379-398 (20 pages).
Sparse large-scale multi-objective optimization problems (sparse LMOPs) widely exist in various optimization applications, such as neural network training, portfolio optimization, and feature selection for classification. Although numerous methods exist, automatically selecting efficient solving strategies for sparse LMOPs remains highly challenging. Given this, we propose a reinforcement learning assisted autonomous sparse multi-objective evolutionary algorithm, which aims to effectively utilize sparse knowledge for designing diversified genetic operators and to automatically select appropriate genetic operators for various problems, or for different situations within the same optimization process. Specifically, three sparsity-aware genetic operators are designed by utilizing sparsity statistics, sparsity clustering, and sparsity logic operations. They possess distinct advantages in terms of convergence speed, solution quality, and diversity. Furthermore, a deep Q-network enables the automatic selection of suitable operators for offspring reproduction based on the current sparse state of the population. The proposed algorithm is compared with five state-of-the-art algorithms on eight benchmark and three real-world problems. Experimental results demonstrate the superiority of the proposed algorithm and the effectiveness of the proposed sparse genetic operators for solving sparse LMOPs.
Keywords: large-scale multi-objective optimization; sparse; reinforcement learning; autonomous selection; sparsity-aware genetic operators
13. A Generative Adversarial Network Guided Evolutionary Algorithm for Large-scale Sparse Multiobjective Optimization
Authors: Zhuanlian Ding, Junzhe Liu, Dengdi Sun, Xingyi Zhang, Bin Luo. Machine Intelligence Research, 2026, Issue 1, pp. 263-280 (18 pages).
In recent decades, great progress has been made in learnable multiobjective evolutionary algorithms (MOEAs) in the field of evolutionary computation. However, existing learnable MOEAs are not equipped with powerful strategies for addressing the challenges associated with sparse large-scale multiobjective optimization problems (sparse LSMOPs), which include the curse of dimensionality and unknown sparsity characteristics. This work proposes a generative adversarial network (GAN)-guided evolutionary algorithm for solving sparse LSMOPs. GAN-aided offspring generation is adopted at each generation to produce high-quality sparse offspring solutions and improve search performance, owing to the GAN's powerful learning and generative capabilities. Specifically, random interpolation and discretization strategies are utilized to prevent mode collapse and falling into local optima, thereby generating promising sparse offspring solutions. The experimental results on both benchmark and real-world problems verify the superior performance of the proposed algorithm compared with state-of-the-art evolutionary algorithms.
Keywords: evolutionary algorithm; generative adversarial network (GAN); sparse large-scale multiobjective optimization
14. Periodical sparse-assisted decoupling method for local fault detection of spiral bevel gears
Authors: Keyuan LI, Yanan WANG, Baijie QIAO, Zhibin ZHAO, Xuefeng CHEN. Chinese Journal of Aeronautics, 2026, Issue 1, pp. 349-369 (21 pages).
Early fault detection for spiral bevel gears is crucial to ensure normal operation and prevent accidents. Harmonic components, excited by the time-varying mesh stiffness, always appear in the measured vibration signal. Extracting the periodical impulses that indicate a localized gear fault, buried in intensive noise and interfered with by harmonics, is a challenging task. In this paper, a novel Periodical Sparse-Assisted Decoupling (PSAD) method is formulated as an optimization problem to extract fault features from noisy vibration signals. The PSAD method decouples the impulsive fault feature and harmonic components based on the sparse representation method. The sparsity-within-and-across-groups property and the periodicity of the fault feature are incorporated into the regularizer as prior information. A nonconvex penalty is employed to highlight the sparsity of fault features. Meanwhile, a weight factor based on the ℓ2-norm of each group is constructed to strengthen the amplitude of the fault feature. An iterative algorithm based on Majorization-Minimization (MM) is derived to solve the optimization problem. A simulation study and experimental analysis confirm the performance of the proposed PSAD method in extracting and enhancing defect impulses from noisy signals, where it surpasses other comparative methods.
Keywords: fault detection; nonconvex optimization; sparse decoupling; sparsity within and across groups; spiral bevel gear
15. Integrating Conjugate Gradients Into Evolutionary Algorithms for Large-Scale Continuous Multi-Objective Optimization (cited: 7)
Authors: Ye Tian, Haowen Chen, Haiping Ma, Xingyi Zhang, Kay Chen Tan, Yaochu Jin. IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2022, Issue 10, pp. 1801-1817 (17 pages).
Large-scale multi-objective optimization problems (LSMOPs) pose challenges to existing optimizers, since a set of well-converged and diverse solutions must be found in huge search spaces. While evolutionary algorithms are good at solving small-scale multi-objective optimization problems, they are criticized for low efficiency in converging to the optima of LSMOPs. By contrast, mathematical programming methods offer fast convergence on large-scale single-objective optimization problems, but they have difficulty finding diverse solutions for LSMOPs. Currently, how to integrate evolutionary algorithms with mathematical programming methods to solve LSMOPs remains unexplored. In this paper, a hybrid algorithm is tailored for LSMOPs by coupling differential evolution with a conjugate gradient method. On the one hand, conjugate gradients and differential evolution are used to update different decision variables of a set of solutions, where the former drives the solutions to quickly converge towards the Pareto front and the latter promotes the diversity of the solutions to cover the whole Pareto front. On the other hand, the objective decomposition strategy of evolutionary multi-objective optimization is used to differentiate the conjugate gradients of solutions, and the line search strategy of mathematical programming is used to ensure that each offspring is of higher quality than its parent. In comparison with state-of-the-art evolutionary algorithms, mathematical programming methods, and hybrid algorithms, the proposed algorithm exhibits better convergence and diversity performance on a variety of benchmark and real-world LSMOPs.
Keywords: conjugate gradient; differential evolution; evolutionary computation; large-scale multi-objective optimization; mathematical programming
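The division of labor this abstract describes — gradient-based updates on some decision variables for convergence, differential-evolution mutation on the rest for diversity, with greedy selection — can be sketched on a single smooth objective. This toy is ours, not the paper's algorithm (which uses conjugate gradients, line search, and multiple objectives); the objective, step size, and DE parameters are assumptions:

```python
import numpy as np

def f(x):
    """Toy objective: sphere function, minimum at the origin."""
    return float(np.sum(x ** 2))

rng = np.random.default_rng(1)
pop = rng.uniform(-1.0, 1.0, size=(6, 10))   # 6 solutions, 10 variables
start_best = min(f(x) for x in pop)

for _ in range(100):
    for i in range(len(pop)):
        a, b, c = pop[rng.choice(len(pop), 3, replace=False)]
        child = pop[i].copy()
        grad_coords = rng.random(10) < 0.5
        # Gradient step on half the coordinates (gradient of x^2 is 2x)
        child[grad_coords] -= 0.1 * (2 * pop[i][grad_coords])
        # DE/rand/1 mutation on the remaining coordinates
        child[~grad_coords] = (a + 0.5 * (b - c))[~grad_coords]
        if f(child) < f(pop[i]):
            pop[i] = child                    # greedy selection

final_best = min(f(x) for x in pop)
print(final_best <= start_best)  # → True: selection never worsens a member
```

In the paper's setting the gradient half would be a conjugate-gradient direction differentiated per decomposed subproblem, and selection would compare offspring on multiple objectives rather than one.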
16. Evolutionary Computation for Large-scale Multi-objective Optimization: A Decade of Progresses (cited: 6)
Authors: Wen-Jing Hong, Peng Yang, Ke Tang. International Journal of Automation and Computing (EI, CSCD), 2021, Issue 2, pp. 155-169 (15 pages).
Large-scale multi-objective optimization problems (MOPs), which involve a large number of decision variables, have emerged from many real-world applications. While evolutionary algorithms (EAs) have been widely acknowledged as a mainstream method for MOPs, most research progress and successful applications of EAs have been restricted to MOPs with small-scale decision variables. More recently, it has been reported that traditional multi-objective EAs (MOEAs) suffer severe deterioration as the number of decision variables increases. As a result, and motivated by the emergence of real-world large-scale MOPs, investigation of MOEAs in this respect has attracted much more attention in the past decade. This paper reviews the progress of evolutionary computation for large-scale multi-objective optimization from two angles. Starting from the key difficulties of large-scale MOPs, scalability is discussed by focusing on the performance of existing MOEAs and the challenges induced by the increase in the number of decision variables. From the perspective of methodology, large-scale MOEAs are categorized into three classes and introduced respectively: divide-and-conquer-based, dimensionality-reduction-based, and enhanced-search-based approaches. Several future research directions are also discussed.
Keywords: large-scale multi-objective optimization; high-dimensional search space; evolutionary computation; evolutionary algorithms; scalability
17. Modified Augmented Lagrange Multiplier Methods for Large-Scale Chemical Process Optimization (cited: 6)
Authors: 梁昔明. Chinese Journal of Chemical Engineering (SCIE, EI, CAS, CSCD), 2001, Issue 2, pp. 167-172 (6 pages).
Chemical process optimization can be described as large-scale nonlinear constrained minimization. Modified augmented Lagrange multiplier methods (MALMM) for large-scale nonlinear constrained minimization are studied in this paper. The Lagrange function contains penalty terms on equality and inequality constraints, and the methods can be applied to solve a series of bound-constrained sub-problems instead of a series of unconstrained sub-problems. The steps of the methods are examined in full detail. Numerical experiments are made for a variety of problems, from small to very large scale, which show the stability and effectiveness of the methods on large-scale problems.
Keywords: modified augmented Lagrange multiplier methods; chemical engineering optimization; large-scale nonlinear constrained minimization; numerical experiment
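The basic augmented Lagrange multiplier mechanism underlying this class of methods — alternately minimizing the penalized Lagrangian and updating the multiplier with the constraint violation — can be shown on a toy problem. This is our own illustration, not the paper's MALMM: minimize x² + y² subject to x + y = 1, whose solution is x = y = 0.5.

```python
# Augmented Lagrangian: L(x, y; lam) = x^2 + y^2 + lam*(x+y-1) + (rho/2)*(x+y-1)^2
def solve(rho=10.0, iters=50):
    lam = 0.0
    x = y = 0.0
    for _ in range(iters):
        # Inner minimization has a closed form here: by symmetry x = y,
        # and dL/dx = 0 gives 2x + lam + rho*(2x - 1) = 0.
        x = y = (rho - lam) / (2 + 2 * rho)
        lam += rho * (x + y - 1)   # multiplier update with constraint violation
    return x, y

x, y = solve()
print(round(x, 4), round(y, 4))  # → 0.5 0.5
```

In the paper's setting the inner problem is a large bound-constrained sub-problem solved numerically rather than in closed form, but the outer multiplier update follows the same pattern.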
18. Performance Analysis of Sparse Array based Massive MIMO via Joint Convex Optimization (cited: 2)
Authors: Mengting Lou, Jing Jin, Hanning Wang, Dan Wu, Liang Xia, Qixing Wang, Yifei Yuan, Jiangzhou Wang. China Communications (SCIE, CSCD), 2022, Issue 3, pp. 88-100 (13 pages).
Massive multiple-input multiple-output (MIMO) technology enables higher data rate transmission in future mobile communications. However, exploiting a large number of antenna elements at the base station (BS) makes effective implementation of massive MIMO challenging, due to the size and weight limits of the massive MIMO arrays located on each BS. Therefore, in order to miniaturize massive MIMO, it is crucial to reduce the number of antenna elements via effective methods such as sparse array synthesis. In this paper, multiple-pattern synthesis is considered via convex optimization (CO). A joint convex optimization (JCO) based synthesis is proposed to construct a codebook for beamforming. Then, a criterion containing multiple constraints is developed, in which the sparse array is required to fulfill all constraints. Finally, extensive evaluations are performed under realistic simulation settings. The results show that, with the same number of antenna elements, a sparse array using the proposed JCO-based synthesis outperforms not only the uniform array but also a sparse array with the existing CO-based synthesis method. Furthermore, with half the number of antenna elements of the uniform array, the performance of the JCO-based sparse array approaches that of the uniform array.
Keywords: B5G; 6G; sparse array; joint convex optimization; massive MIMO; system-level simulation
19. Enhanced Butterfly Optimization Algorithm for Large-Scale Optimization Problems (cited: 1)
Authors: Yu Li, Xiaomei Yu, Jingsen Liu. Journal of Bionic Engineering (SCIE, EI, CSCD), 2022, Issue 2, pp. 554-570 (17 pages).
To solve large-scale optimization problems, a Fragrance coefficient and variant Particle Swarm local search Butterfly Optimization Algorithm (FPSBOA) is proposed. In the position update stage of the Butterfly Optimization Algorithm (BOA), the fragrance coefficient is designed to balance the exploration and exploitation of BOA. The variant particle swarm local search strategy is proposed to improve the local search ability of the current optimal butterfly and prevent the algorithm from falling into local optima. Nineteen 2000-dimensional functions and twenty 1000-dimensional CEC 2010 large-scale functions are used to verify FPSBOA on complex large-scale optimization problems. The experimental results are statistically analyzed by the Friedman test and the Wilcoxon rank-sum test. All obtained results demonstrate that FPSBOA can better solve challenging scientific and industrial real-world problems with thousands of variables. Finally, four mechanical engineering problems and one ten-dimensional process synthesis and design problem are applied to FPSBOA, which shows its feasibility and effectiveness in real-world application problems.
Keywords: butterfly optimization algorithm; fragrance coefficient; variant particle swarm local search; large-scale optimization problems; real-world application problems
20. Gregory Solid Construction for Polyhedral Volume Parameterization by Sparse Optimization
Authors: HU Chuan-feng, LIN Hong-wei. Applied Mathematics: A Journal of Chinese Universities (SCIE, CSCD), 2019, Issue 3, pp. 340-355 (16 pages).
In isogeometric analysis, it is frequently required to handle geometric models enclosed by four-sided or non-four-sided boundary patches, such as trimmed surfaces. In this paper, we develop a Gregory solid based method to parameterize those models. First, we extend the Gregory patch representation to a trivariate Gregory solid representation. Second, the trivariate Gregory solid representation is employed to interpolate the boundary patches of a geometric model, thus generating the polyhedral volume parametrization. To improve the regularity of the polyhedral volume parametrization, we formulate the construction of the trivariate Gregory solid as a sparse optimization problem, where the objective function is a linear combination of several terms, including a sparse term aiming to reduce the negative-Jacobian area of the Gregory solid. Then, the alternating direction method of multipliers (ADMM) is used to solve the sparse optimization problem. The experimental examples presented in this paper demonstrate the effectiveness and efficiency of the developed method.
Keywords: Gregory solid; polyhedral volume parametrization; sparse optimization; regularity; isogeometric analysis