Optimization is the key to obtaining efficient utilization of resources in structural design. Due to the complex nature of truss systems, this study presents a method based on metaheuristic modelling that minimises structural weight under stress and frequency constraints. Two new algorithms, the Red Kite Optimization Algorithm (ROA) and the Secretary Bird Optimization Algorithm (SBOA), are applied to five benchmark trusses with 10, 18, 37, 72, and 200 bars. Both algorithms are evaluated against benchmarks in the literature. The results indicate that SBOA consistently reaches a lighter optimal design, reducing structural weight by 0.02% to 0.15% compared to ROA, and by up to 6%–8% compared to conventional algorithms. In addition, SBOA achieves 15%–20% faster convergence and a 10%–18% reduction in computational time with a smaller standard deviation over independent runs, which demonstrates its robustness and reliability. The adaptive exploration mechanism of SBOA, especially its Levy flight–based search strategy, markedly improves optimization performance for both low- and high-dimensional trusses. The research promotes bio-inspired optimization techniques by demonstrating the viability of SBOA as a reliable model for large-scale structural design that provides significant enhancements in performance and convergence behavior.
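Levy flight–based search steps like the one credited to SBOA above are commonly generated with Mantegna's algorithm. The following sketch illustrates that general technique only; the step scale `alpha` and exponent `beta` are illustrative assumptions, not values reported by the study:

```python
import numpy as np
from math import gamma, sin, pi

def levy_step(dim, beta=1.5, rng=None):
    """Draw one heavy-tailed step vector using Mantegna's algorithm."""
    if rng is None:
        rng = np.random.default_rng()
    # Mantegna scale factor sigma_u for the numerator Gaussian
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2)
               / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def levy_move(position, best, alpha=0.01, rng=None):
    """Perturb a candidate design relative to the current best solution."""
    step = levy_step(len(position), rng=rng)
    return position + alpha * step * (position - best)

pos = np.array([1.0, 2.0, 3.0])
best = np.array([0.5, 1.5, 2.5])
new_pos = levy_move(pos, best)
```

The occasional long jumps of the Levy distribution are what let such a search escape local optima while mostly making small refining moves.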
Early and accurate detection of bone cancer and marrow cell abnormalities is critical for timely intervention and improved patient outcomes. This paper proposes a novel hybrid deep learning framework that integrates a Convolutional Neural Network (CNN) with a Bidirectional Long Short-Term Memory (BiLSTM) architecture, optimized using the Firefly Optimization algorithm (FO). The proposed CNN-BiLSTM-FO model is tailored for structured biomedical data, capturing both local patterns and sequential dependencies in diagnostic features, while the Firefly Algorithm fine-tunes key hyperparameters to maximize predictive performance. The approach is evaluated on two benchmark biomedical datasets: one comprising diagnostic data for bone cancer detection and another for identifying marrow cell abnormalities. Experimental results demonstrate that the proposed method significantly outperforms standard deep learning models, including CNN, LSTM, BiLSTM, and CNN-LSTM hybrids. The CNN-BiLSTM-FO model achieves an accuracy of 98.55% for bone cancer detection and 96.04% for marrow abnormality classification. The paper also presents a detailed complexity analysis of the proposed algorithm and compares its performance across multiple evaluation metrics such as precision, recall, F1-score, and AUC. The results confirm the effectiveness of the firefly-based optimization strategy in improving classification accuracy and model robustness. This work introduces a scalable and accurate diagnostic solution with strong potential for integration into intelligent clinical decision-support systems.
Heuristic optimization algorithms have been widely used in solving complex optimization problems in various fields such as engineering, economics, and computer science. These algorithms are designed to find high-quality solutions efficiently by balancing exploration of the search space and exploitation of promising solutions. While heuristic optimization algorithms vary in their specific details, they often exhibit common patterns that are essential to their effectiveness. This paper aims to analyze and explore common patterns in heuristic optimization algorithms. Through a comprehensive review of the literature, we identify the patterns that are commonly observed in these algorithms, including initialization, local search, diversity maintenance, adaptation, and stochasticity. For each pattern, we describe the motivation behind it, its implementation, and its impact on the search process. To demonstrate the utility of our analysis, we identify these patterns in multiple heuristic optimization algorithms. For each case study, we analyze how the patterns are implemented in the algorithm and how they contribute to its performance. Through these case studies, we show how our analysis can be used to understand the behavior of heuristic optimization algorithms and guide the design of new algorithms. Our analysis reveals that patterns in heuristic optimization algorithms are essential to their effectiveness. By understanding and incorporating these patterns into the design of new algorithms, researchers can develop more efficient and effective optimization algorithms.
Cardiovascular disease prediction is a significant area of research in healthcare management systems (HMS). Deaths can be reduced only if cardiac problems are anticipated in advance. The existing heart disease detection systems using machine learning have not yet produced sufficient results due to their reliance on available data. We present Clustered Butterfly Optimization Techniques (RoughK-means+BOA) as a new hybrid method for predicting heart disease. This method comprises two phases: clustering data using Rough k-means (RKM) and data analysis using the butterfly optimization algorithm (BOA). The benchmark dataset from the UCI repository is used for our experiments. The experiments are divided into three sets: the first set involves the RKM clustering technique, the next set evaluates the classification outcomes, and the last set validates the performance of the proposed hybrid model. The proposed RoughK-means+BOA achieves an accuracy of 97.03% and a minimal error rate of 2.97%. This result is comparatively better than other combinations of optimization techniques. In addition, this approach effectively enhances data segmentation, optimization, and classification performance.
Quantum computing is a promising technology that has the potential to revolutionize many areas of science and technology, including communication. In this review, we discuss the current state of quantum computing in communication and its potential applications in areas such as network optimization, signal processing, and machine learning for communication. First, the basic principles of quantum computing, quantum physical systems, and quantum algorithms are analyzed. Then, based on a classification of quantum algorithms, several important basic quantum algorithms, quantum optimization algorithms, and quantum machine learning algorithms are discussed in detail. Finally, the basic ideas and feasibility of introducing quantum algorithms into communications are analyzed, which provides a reference for addressing computational bottlenecks in communication networks.
A decentralized network made up of mobile nodes is termed a Mobile Ad-hoc Network (MANET). Mobility and a finite battery lifespan are the two main problems with MANETs. Advanced methods are essential for enhancing MANET security, network longevity, and energy efficiency. Hence, selecting an appropriate cluster head further boosts the network's energy effectiveness. A Hybrid Swallow Swarm Optimisation-Memetic Algorithm (SSO-MA) is therefore suggested to improve the energy efficiency of the MANET. Then, to secure the network, an Abnormality Detection System (ADS) is proposed. The MATLAB-2021a platform is used to implement the suggested technique and conduct the analysis. In terms of network performance, the suggested model outperforms the current Genetic Algorithm, Optimised Link State Routing protocol, and Particle Swarm Optimisation techniques. The model achieves a minimum delay of about 0.82 seconds and a Packet Delivery Ratio (PDR) of 99.82%. Hence, the validation shows that the Hybrid SSO-MA strategy is superior to the other approaches in terms of efficiency.
The potential applications of multimodal physiological signals in healthcare, pain monitoring, and clinical decision support systems have garnered significant attention in biomedical research. Subjective self-reporting is the foundation of conventional pain assessment methods, which may be unreliable. Deep learning is a promising alternative to resolve this limitation through automated pain classification. This paper proposes an ensemble deep-learning framework for pain assessment. The framework makes use of features collected from electromyography (EMG), skin conductance level (SCL), and electrocardiography (ECG) signals. We integrate Convolutional Neural Network (CNN), Long Short-Term Memory (LSTM), Bidirectional Gated Recurrent Unit (BiGRU), and Deep Neural Network (DNN) models, and aggregate their predictions using a weighted averaging ensemble technique to increase the robustness of the classification. To improve computing efficiency and remove redundant features, we use Particle Swarm Optimization (PSO) for feature selection, which reduces the dimensionality of the features without sacrificing classification accuracy. With improved accuracy, precision, recall, and F1-score across all pain levels, the experimental results show that the proposed ensemble model performs better than individual deep learning classifiers. In our experiments, the model achieved over 98% accuracy, suggesting promising automated pain assessment performance. However, due to differences in validation protocols, comparisons with previous studies are still limited. Combining deep learning and feature selection techniques significantly improves model generalization, reducing overfitting and enhancing classification performance. The evaluation was conducted using the BioVid Heat Pain Dataset, confirming the model's effectiveness in distinguishing between different pain intensity levels.
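PSO-based feature selection of the kind described above is often realized as a binary PSO, where continuous particle positions are mapped through a sigmoid threshold into feature masks. The sketch below illustrates that general technique; the fitness function, swarm size, and coefficients are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

def pso_feature_selection(fitness, n_features, n_particles=20, iters=50, seed=0):
    """Binary PSO: continuous positions, sigmoid-thresholded into boolean
    feature masks; `fitness(mask)` returns a score to MINIMIZE."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1, 1, (n_particles, n_features))   # particle positions
    V = np.zeros_like(X)                                # velocities

    def mask_of(x):
        return (1.0 / (1.0 + np.exp(-x))) > 0.5         # sigmoid threshold

    pbest = X.copy()
    pbest_f = np.array([fitness(mask_of(x)) for x in X])
    g = pbest[np.argmin(pbest_f)].copy()                # global best
    for _ in range(iters):
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        V = 0.7 * V + 1.5 * r1 * (pbest - X) + 1.5 * r2 * (g - X)
        X = np.clip(X + V, -6.0, 6.0)                   # keep sigmoid well-behaved
        f = np.array([fitness(mask_of(x)) for x in X])
        improved = f < pbest_f
        pbest[improved] = X[improved]
        pbest_f[improved] = f[improved]
        g = pbest[np.argmin(pbest_f)].copy()
    return mask_of(g)

# Toy fitness: pretend features 0-2 are informative; penalize any mismatch.
def toy_fitness(mask):
    target = np.zeros(10, dtype=bool)
    target[:3] = True
    return int(np.sum(mask != target))

best_mask = pso_feature_selection(toy_fitness, n_features=10)
```

In a real pipeline the fitness would wrap a classifier's validation error plus a penalty on subset size, which is what trades accuracy against dimensionality.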
Deep neural networks are increasingly exposed to attack threats, and at the same time, the need for privacy protection is growing. As a result, the challenge of developing neural networks that are both robust and capable of strong generalization while maintaining privacy becomes pressing. Training neural networks under privacy constraints is one way to minimize privacy leakage, for instance by adding noise to the data or the model. However, noise may cause gradient directions to deviate from the optimal trajectory during training, leading to unstable parameter updates, slow convergence, and reduced model generalization capability. To overcome these challenges, we propose an optimization algorithm based on double-integral coevolutionary neurodynamics (DICND), designed to accelerate convergence and improve generalization in noisy conditions. Theoretical analysis proves the global convergence of the DICND algorithm and demonstrates its ability to converge to near-global minima efficiently under noisy conditions. Numerical simulations and image classification experiments further confirm the DICND algorithm's significant advantages in enhancing generalization performance.
Accurately forecasting peak particle velocity (PPV) during blasting operations plays a crucial role in mitigating vibration-related hazards and preventing economic losses. This research introduces an approach to PPV prediction that combines conventional empirical equations with physics-informed neural networks (PINN) and optimizes the model parameters via the Particle Swarm Optimization (PSO) algorithm. The proposed PSO-PINN framework was rigorously benchmarked against seven established machine learning approaches: Multilayer Perceptron (MLP), Extreme Gradient Boosting (XGBoost), Random Forest (RF), Support Vector Regression (SVR), Gradient Boosting Decision Tree (GBDT), Adaptive Boosting (AdaBoost), and Gene Expression Programming (GEP). Comparative analysis showed that PSO-PINN outperformed these models, achieving RMSE reductions of 17.82%–37.63%, MSE reductions of 32.47%–61.10%, AR improvements of 2.97%–21.19%, and R^2 enhancements of 7.43%–29.21%, demonstrating superior accuracy and generalization. Furthermore, the study determines the impact of incorporating empirical formulas as physical constraints in neural networks and examines the effects of different empirical equations, particle swarm size, iteration count in PSO, regularization coefficient, and learning rate in PINN on model performance. Lastly, a predictive system for blast vibration PPV is designed and implemented. The research outcomes offer theoretical references and practical recommendations for blast vibration forecasting in similar engineering applications.
The optimization of reaction processes is crucial for the green, efficient, and sustainable development of the chemical industry. However, how to address the problems posed by multiple variables, nonlinearities, and uncertainties during optimization remains a formidable challenge. In this study, a strategy combining interpretable machine learning with metaheuristic optimization algorithms is employed to optimize the reaction process. First, experimental data from a biodiesel production process are collected to establish a database. These data are then used to construct a predictive model based on artificial neural network (ANN) models. Subsequently, interpretable machine learning techniques are applied for quantitative analysis and verification of the model. Finally, four metaheuristic optimization algorithms are coupled with the ANN model to achieve the desired optimization. The results show that the methanol:palm fatty acid distillate (PFAD) molar ratio contributes the most to the reaction outcome, accounting for 41%. The ANN-simulated annealing (SA) hybrid method is the most suitable for this optimization, and the optimal process parameters are a catalyst concentration of 3.00% (mass), a methanol:PFAD molar ratio of 8.67, and a reaction time of 30 min. This study provides deeper insights into reaction process optimization, which will facilitate future applications in various reaction optimization processes.
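The ANN-SA coupling described above can be sketched as a simulated annealing loop that repeatedly queries a predictive model. Below, a toy quadratic with its optimum at the reported parameters stands in for the trained ANN; the cooling schedule, step sizes, and bounds are illustrative assumptions:

```python
import math
import random

def simulated_annealing(objective, bounds, iters=2000, T0=1.0, seed=42):
    """Minimize `objective` over the box `bounds` with a basic SA loop.
    In an ANN-SA coupling, `objective` would be the trained surrogate's
    (negated) predicted yield; here any callable works."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for lo, hi in bounds]
    fx = objective(x)
    best, fbest = x[:], fx
    for k in range(iters):
        T = T0 * (1 - k / iters) + 1e-9          # linear cooling schedule
        cand = [min(max(xi + rng.gauss(0, 0.1 * (hi - lo)), lo), hi)
                for xi, (lo, hi) in zip(x, bounds)]
        fc = objective(cand)
        # Always accept improvements; accept worse moves with Boltzmann probability
        if fc < fx or rng.random() < math.exp(-(fc - fx) / T):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x[:], fx
    return best, fbest

# Toy stand-in for the ANN surrogate: optimum at (3.00, 8.67, 30)
def toy_objective(v):
    c, r, t = v
    return (c - 3.0) ** 2 + 0.1 * (r - 8.67) ** 2 + 0.001 * (t - 30.0) ** 2

bounds = [(1.0, 5.0), (4.0, 12.0), (10.0, 60.0)]
best, fbest = simulated_annealing(toy_objective, bounds)
```

Swapping `toy_objective` for a trained model's prediction is the only change needed to reproduce the surrogate-based search pattern.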
Power prediction is critical in large-scale wind power grid connections. However, traditional wind power prediction methods have long suffered from problems such as low prediction accuracy and poor reliability. For this purpose, a hybrid prediction model (VMD-LSTM-Attention) is proposed, which integrates variational modal decomposition (VMD), long short-term memory (LSTM), and an attention mechanism, and is optimized by an improved dung beetle optimization algorithm (IDBO). Firstly, the algorithm's performance is significantly enhanced through three key strategies: an elite group strategy based on the Logistic-Tent map, a nonlinear adjustment factor, and an adaptive T-distribution disturbance mechanism. Subsequently, IDBO is applied to optimize the important parameters of VMD (the number of decomposition layers and the penalty factor) to ensure the best signal decomposition. Furthermore, IDBO is deployed to optimize three key hyperparameters of the LSTM, thereby improving its learning capability. Finally, an attention mechanism is incorporated to adaptively weight temporal features, increasing the model's ability to focus on key information. Comprehensive simulation experiments demonstrate that the proposed model achieves higher prediction accuracy than VMD-LSTM, VMD-LSTM-Attention, and traditional prediction methods, and quantitative indexes verify the effectiveness of the algorithmic improvements as well as the accuracy of the model in wind power prediction.
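Chaotic-map initialization of the kind mentioned above draws the initial population from a Logistic-Tent sequence rather than a uniform generator, spreading individuals more evenly over the search space. One common formulation of the combined map is sketched below; the paper's exact variant and parameters may differ:

```python
import numpy as np

def logistic_tent(x, r=3.99):
    """One iteration of a combined Logistic-Tent chaotic map
    (a common published form; only an illustrative assumption here)."""
    if x < 0.5:
        return (r * x * (1 - x) + (4 - r) * x / 2) % 1.0
    return (r * x * (1 - x) + (4 - r) * (1 - x) / 2) % 1.0

def chaotic_population(n, dim, lo, hi, x0=0.7, r=3.99):
    """Initialize an n x dim optimizer population from the chaotic sequence,
    scaling each value from [0, 1) into the search bounds [lo, hi)."""
    pop = np.empty((n, dim))
    x = x0
    for i in range(n):
        for j in range(dim):
            x = logistic_tent(x, r)
            pop[i, j] = lo + x * (hi - lo)
    return pop

pop = chaotic_population(5, 3, -10.0, 10.0)
```

The elite-group variant would then keep only the best-scoring of these chaotic candidates as the starting swarm.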
The integration of IoT and Deep Learning (DL) has significantly advanced real-time health monitoring and predictive maintenance in prognostics and health management (PHM). Electrocardiograms (ECGs) are widely used for cardiovascular disease (CVD) diagnosis, but fluctuating signal patterns make classification challenging. Computer-assisted automated diagnostic tools that enhance ECG signal categorization using sophisticated algorithms and machine learning are helping healthcare practitioners manage larger patient populations. With this motivation, the study proposes a DL framework leveraging the PTB-XL ECG dataset to improve CVD diagnosis. Deep Transfer Learning (DTL) techniques extract features, followed by feature fusion to eliminate redundancy and retain the most informative features. Using the African Vulture Optimization Algorithm (AVOA) for feature selection is more effective than standard methods, as it offers a balance between exploration and exploitation that yields an optimal set of features, improving classification performance while reducing redundancy. Various machine learning classifiers, including Support Vector Machine (SVM), eXtreme Gradient Boosting (XGBoost), Adaptive Boosting (AdaBoost), and Extreme Learning Machine (ELM), are used for classification. Additionally, an ensemble model is developed to further improve accuracy. Experimental results demonstrate that the proposed model achieves the highest accuracy of 96.31%, highlighting its effectiveness in enhancing CVD diagnosis.
Efficient warehouse management is critical for modern supply chain systems, particularly in the era of e-commerce and automation. The Multi-Picker Robot Routing Problem (MPRRP) presents a complex challenge involving the optimization of routes for multiple robots assigned to retrieve items from distinct locations within a warehouse. This study introduces optimized metaheuristic strategies to address the MPRRP, with the aim of minimizing travel distance, energy consumption, and order fulfillment time while ensuring operational efficiency. Advanced algorithms, including an enhanced Particle Swarm Optimization (PSO-MPRRP) and a tailored Genetic Algorithm (GA-MPRRP), are designed with customized evolutionary operators to solve the MPRRP effectively. Comparative experiments evaluate the proposed strategies against benchmark approaches, demonstrating significant improvements in solution quality and computational efficiency. The findings contribute to the development of intelligent, scalable, and environmentally friendly warehouse systems, paving the way for future advances in robotics and automated logistics management.
To address the steering instability and hysteresis of agricultural robots during movement, a fusion PID control method combining particle swarm optimization (PSO) and a genetic algorithm (GA) was proposed. The fusion algorithm exploits the fast optimization ability of PSO to improve the population screening step of the GA. Simulink simulation results showed that convergence of the fusion algorithm's fitness function was accelerated, the system response adjustment time was reduced, and the overshoot was almost zero. The algorithm was then applied to steering tests of an agricultural robot in various scenarios. After modeling the robot's steering system, steering tests in the unloaded, suspended state showed that PID control based on the fusion algorithm reduced the rise time, response adjustment time, and overshoot of the system, and improved its response speed and stability, compared with manually tuned PID control and GA-based PID control. Actual road steering tests showed that the PID control based on the fusion algorithm had the shortest response rise time, about 4.43 s. When the target pulse number was set to 100, the actual mean value in the steady-state regulation stage was about 102.9, the closest to the target value among the three control methods, while the overshoot was also reduced. Steering tests across various scenarios showed that the PID control based on the proposed fusion algorithm had good anti-interference ability, as it adapted to changes in environment and load and improved control system performance. It was effective for the steering control of agricultural robots and can serve as a reference for the precise steering control of other robots.
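The cost that such PSO-GA gain tuning minimizes can be illustrated with a discrete PID loop on a simple plant. The first-order plant model, time step, and candidate gains below are illustrative assumptions, not the paper's test rig; the ITAE criterion is one common choice of fitness:

```python
def pid_step_cost(kp, ki, kd, tau=1.0, dt=0.01, t_end=5.0, setpoint=1.0):
    """Simulate a discrete PID loop on a first-order plant dy/dt = (u - y)/tau
    and return the ITAE cost (integral of t*|error|) that a PSO-GA search
    over (kp, ki, kd) would minimize."""
    y, integ, prev_err, cost, t = 0.0, 0.0, setpoint, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        err = setpoint - y
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv   # PID control law
        y += (u - y) / tau * dt                  # Euler step of the plant
        cost += t * abs(err) * dt                # ITAE accumulates late errors heavily
        prev_err, t = err, t + dt
    return cost

# A well-damped, fast controller should score lower than a sluggish one.
good = pid_step_cost(5.0, 2.0, 0.1)
slow = pid_step_cost(0.5, 0.1, 0.0)
```

Feeding `pid_step_cost` to any population-based optimizer as the fitness function reproduces the overall tuning pattern described above.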
Optimization problems are prevalent in various fields of science and engineering, with many real-world applications characterized by high dimensionality and complex search landscapes. The starfish optimization algorithm (SFOA) is a recently proposed swarm-intelligence optimizer that is effective for numerical optimization, but it may suffer premature convergence and become trapped in local optima on complex problems. To address these challenges, this paper proposes the multi-strategy enhanced crested porcupine-starfish optimization algorithm (MCPSFOA). The core innovation of MCPSFOA lies in a hybrid strategy that integrates the exploratory mechanisms of SFOA with the diverse search capacity of the Crested Porcupine Optimizer (CPO). This synergy enhances MCPSFOA's ability to navigate complex and multimodal search spaces. To further prevent premature convergence, MCPSFOA incorporates Lévy flight, leveraging its characteristic mix of long and short jumps to enable large-scale exploration and escape from local optima. Gaussian mutation is then applied for precise solution tuning, introducing controlled perturbations that enhance accuracy and mitigate the risk of insufficient exploitation. Notably, a population diversity enhancement mechanism periodically identifies and resets stagnant individuals, consistently revitalizing population variety throughout the optimization process. MCPSFOA is rigorously evaluated on 24 classical benchmark functions (including high-dimensional cases), the CEC2017 suite, and the CEC2022 suite, achieving superior overall performance with Friedman mean ranks of 2.208, 2.310, and 2.417 and outperforming 11 state-of-the-art algorithms. Furthermore, the practical applicability of MCPSFOA is confirmed through its successful application to five engineering optimization cases, where it also yields excellent results. In conclusion, MCPSFOA is not only a highly effective and reliable optimizer for benchmark functions but also a practical tool for solving real-world optimization problems.
Multi-label feature selection (MFS) is a crucial dimensionality reduction technique aimed at identifying informative features associated with multiple labels. However, traditional centralized methods face significant challenges in privacy-sensitive and distributed settings, often neglecting label dependencies and suffering from low computational efficiency. To address these issues, we introduce a novel framework, Fed-MFSDHBCPSO: federated MFS via a dual-layer hybrid breeding cooperative particle swarm optimization algorithm with manifold and sparsity regularization (DHBCPSO-MSR). Leveraging the federated learning paradigm, Fed-MFSDHBCPSO allows clients to perform local feature selection (FS) using DHBCPSO-MSR. Locally selected feature subsets are encrypted with differential privacy (DP) and transmitted to a central server, where they are securely aggregated and refined through secure multi-party computation (SMPC) until global convergence is achieved. Within each client, DHBCPSO-MSR employs a dual-layer FS strategy. The inner layer constructs sample and label similarity graphs, generates Laplacian matrices to capture the manifold structure between samples and labels, and applies L2,1-norm regularization to sparsify the feature subset, yielding an optimized feature weight matrix. The outer layer uses a hybrid breeding cooperative particle swarm optimization algorithm to further refine the feature weight matrix and identify the optimal feature subset. The updated weight matrix is then fed back to the inner layer for further optimization. Comprehensive experiments on multiple real-world multi-label datasets demonstrate that Fed-MFSDHBCPSO consistently outperforms both centralized and federated baseline methods across several key evaluation metrics.
Existing feature selection methods for intrusion detection systems (IDS) in the Industrial Internet of Things often suffer from local optimality and high computational complexity. These challenges hinder traditional IDS from effectively extracting features while maintaining detection accuracy. This paper proposes an Industrial Internet of Things intrusion detection feature selection algorithm based on an improved whale optimization algorithm (GSLDWOA). The aim is to address the problems that feature selection algorithms are prone to under high-dimensional data, such as local optimality, long detection times, and reduced accuracy. First, the initial population's diversity is increased using a Gaussian mutation mechanism. Then, a nonlinear shrinking factor balances global exploration and local development, avoiding premature convergence. Lastly, a variable-step Levy flight operator and a dynamic differential evolution strategy are introduced to improve the algorithm's search efficiency and convergence accuracy in high-dimensional feature space. Experiments on the NSL-KDD and WUSTL-IIoT-2021 datasets demonstrate that the feature subset selected by GSLDWOA significantly improves detection performance. Compared to the traditional WOA algorithm, the detection rate and F1-score increased by 3.68% and 4.12%, respectively. On the WUSTL-IIoT-2021 dataset, accuracy, recall, and F1-score all exceed 99.9%.
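A nonlinear shrinking factor of the kind mentioned above typically replaces WOA's linearly decreasing coefficient a (which shrinks from 2 to 0 over the run) with a curved schedule that holds a high for longer, prolonging exploration before exploitation takes over. The cosine form below is one common choice and purely an assumption; the paper's exact factor is not given here:

```python
import math

def nonlinear_shrink(t, t_max):
    """Nonlinear replacement for WOA's linear coefficient a = 2*(1 - t/t_max):
    a cosine schedule that decays slowly at first (exploration) and
    quickly near the end (exploitation)."""
    return 2.0 * math.cos(math.pi / 2 * t / t_max)

a_start = nonlinear_shrink(0, 100)    # 2.0: full exploration
a_mid = nonlinear_shrink(50, 100)     # above the linear schedule's midpoint of 1.0
a_end = nonlinear_shrink(100, 100)    # ~0.0: pure exploitation
```

Because a_mid stays above 1.0 (where the linear schedule would already be at 1.0), the whales spend more iterations in the |A| > 1 exploration regime before contracting around the best solution.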
With the increasing number of geosynchronous orbit satellites nearing the end of their lifetime, spacecraft refueling is crucial to enhancing the economic benefits of on-orbit services. Existing studies tend to assume a predetermined refueling duration; however, precise mission scheduling solutions are difficult to apply under the uncertain refueling durations caused by orbital transfer deviations and stochastic actuator faults during actual on-orbit service. Therefore, this paper proposes a robust mission scheduling strategy for geosynchronous orbit spacecraft on-orbit refueling missions with uncertain refueling duration. Firstly, a robust mission scheduling model is constructed by introducing a budget uncertainty set to describe the uncertain refueling duration. Secondly, a hybrid Harris hawks optimization algorithm is designed to explore the optimal mission allocation and refueling sequences, which combines cubic chaotic mapping to initialize the population and introduces the crossover operator from the genetic algorithm to enhance global convergence. Finally, typical simulation examples built on real-mission scenarios are analyzed in three aspects: performance comparisons with various algorithms; robustness analyses comparing different on-orbit refueling durations; and investigations into the impact of different initial population strategies on algorithm performance. These demonstrate the robustness and effectiveness of the proposed mission scheduling framework in comparison with exact mission scheduling.
Amidst the growing global emphasis on nuclear safety, the integrity of nuclear reactor systems has garnered attention in the aftermath of consequential events. Moreover, the rapid development of artificial intelligence technology has provided immense opportunities to enhance the safety and economy of nuclear energy. However, data-driven deep learning techniques often lack interpretability, which hinders their applicability in the nuclear energy sector. To address this problem, this study proposes a hybrid data-driven and knowledge-driven artificial intelligence model based on physics-informed neural networks to accurately compute the neutron flux distribution inside a nuclear reactor core. Innovative techniques, such as regional decomposition, intelligent k_eff (effective multiplication factor) search, and k_eff inversion, are introduced for the calculation. Furthermore, hyperparameters of the model are automatically optimized using a whale optimization algorithm. A series of computational examples validate the proposed model, demonstrating its applicability, generality, and high accuracy in calculating the neutron flux within the nuclear reactor. The model offers a dependable strategy for computing the neutron flux distribution in nuclear reactors for advanced simulation techniques in the future, including reactor digital twinning. The approach is data-light, requires little to no training data, and still delivers remarkably precise output.
To make up the poor quality defects of traditional control methods and meet the growing requirements of accuracy for strip crown,an optimized model based on support vector machine(SVM)is put forward firstly to enhance...To make up the poor quality defects of traditional control methods and meet the growing requirements of accuracy for strip crown,an optimized model based on support vector machine(SVM)is put forward firstly to enhance the quality of product in hot strip rolling.Meanwhile,for enriching data information and ensuring data quality,experimental data were collected from a hot-rolled plant to set up prediction models,as well as the prediction performance of models was evaluated by calculating multiple indicators.Furthermore,the traditional SVM model and the combined prediction models with particle swarm optimization(PSO)algorithm and the principal component analysis combined with cuckoo search(PCA-CS)optimization strategies are presented to make a comparison.Besides,the prediction performance comparisons of the three models are discussed.Finally,the experimental results revealed that the PCA-CS-SVM model has the highest prediction accuracy and the fastest convergence speed.Furthermore,the root mean squared error(RMSE)of PCA-CS-SVM model is 2.04μm,and 98.15%of prediction data have an absolute error of less than 4.5μm.Especially,the results also proved that PCA-CS-SVM model not only satisfies precision requirement but also has certain guiding significance for the actual production of hot strip rolling.展开更多
Abstract: Optimization is key to the efficient use of resources in structural design. Given the complex nature of truss systems, this study presents a metaheuristic modelling method that minimises structural weight under stress and frequency constraints. Two new algorithms, the Red Kite Optimization Algorithm (ROA) and the Secretary Bird Optimization Algorithm (SBOA), are applied to five benchmark trusses with 10, 18, 37, 72, and 200 bars. Both algorithms are evaluated against benchmarks in the literature. The results indicate that SBOA consistently reaches lighter optimal designs, reducing structural weight by 0.02%–0.15% compared with ROA and by up to 6%–8% compared with conventional algorithms. In addition, SBOA achieves 15%–20% faster convergence and a 10%–18% reduction in computational time with a smaller standard deviation over independent runs, demonstrating its robustness and reliability. The adaptive exploration mechanism of SBOA, especially its Lévy-flight-based search strategy, markedly improves optimization performance for both low- and high-dimensional trusses. The research promotes bio-inspired optimization techniques by demonstrating the viability of SBOA as a reliable model for large-scale structural design with significant gains in performance and convergence behavior.
Abstract: Early and accurate detection of bone cancer and marrow cell abnormalities is critical for timely intervention and improved patient outcomes. This paper proposes a novel hybrid deep learning framework that integrates a Convolutional Neural Network (CNN) with a Bidirectional Long Short-Term Memory (BiLSTM) architecture, optimized using the Firefly Optimization algorithm (FO). The proposed CNN-BiLSTM-FO model is tailored for structured biomedical data, capturing both local patterns and sequential dependencies in diagnostic features, while the Firefly Algorithm fine-tunes key hyperparameters to maximize predictive performance. The approach is evaluated on two benchmark biomedical datasets: one comprising diagnostic data for bone cancer detection and another for identifying marrow cell abnormalities. Experimental results demonstrate that the proposed method significantly outperforms standard deep learning models, including CNN, LSTM, BiLSTM, and CNN-LSTM hybrids. The CNN-BiLSTM-FO model achieves an accuracy of 98.55% for bone cancer detection and 96.04% for marrow abnormality classification. The paper also presents a detailed complexity analysis of the proposed algorithm and compares its performance across multiple evaluation metrics such as precision, recall, F1-score, and AUC. The results confirm the effectiveness of the firefly-based optimization strategy in improving classification accuracy and model robustness. This work introduces a scalable and accurate diagnostic solution with strong potential for integration into intelligent clinical decision-support systems.
Abstract: Heuristic optimization algorithms have been widely used to solve complex optimization problems in fields such as engineering, economics, and computer science. These algorithms are designed to find high-quality solutions efficiently by balancing exploration of the search space with exploitation of promising solutions. While heuristic optimization algorithms vary in their specific details, they often exhibit common patterns that are essential to their effectiveness. This paper analyzes and explores common patterns in heuristic optimization algorithms. Through a comprehensive review of the literature, we identify the patterns commonly observed in these algorithms, including initialization, local search, diversity maintenance, adaptation, and stochasticity. For each pattern, we describe the motivation behind it, its implementation, and its impact on the search process. To demonstrate the utility of our analysis, we identify these patterns in multiple heuristic optimization algorithms. For each case study, we analyze how the patterns are implemented in the algorithm and how they contribute to its performance. Through these case studies, we show how our analysis can be used to understand the behavior of heuristic optimization algorithms and guide the design of new algorithms. Our analysis reveals that patterns in heuristic optimization algorithms are essential to their effectiveness; by understanding and incorporating these patterns into the design of new algorithms, researchers can develop more efficient and effective optimizers.
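The five patterns named in this abstract (initialization, local search, diversity maintenance, adaptation, stochasticity) can be illustrated in one minimal search loop. This is a generic, hedged sketch rather than any specific algorithm from the paper: the sphere objective, the 0.99 step-decay rate, and the worst-individual reset rule are all illustrative choices.

```python
import random

def sphere(x):
    # Objective: sum of squares; global minimum 0 at the origin.
    return sum(v * v for v in x)

def heuristic_search(f, dim=5, pop=20, iters=300, seed=1):
    rng = random.Random(seed)
    # Pattern 1: initialization — random points in [-5, 5]^dim.
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    best = min(X, key=f)
    step = 1.0  # adapted over time (Pattern 4)
    for _ in range(iters):
        for i, x in enumerate(X):
            # Patterns 2 and 5: stochastic local search — Gaussian move, kept if better.
            cand = [v + rng.gauss(0, step) for v in x]
            if f(cand) < f(x):
                X[i] = cand
        # Pattern 3: diversity maintenance — reset the worst individual at random.
        worst = max(range(pop), key=lambda i: f(X[i]))
        X[worst] = [rng.uniform(-5, 5) for _ in range(dim)]
        cur = min(X, key=f)
        if f(cur) < f(best):
            best = cur
        step *= 0.99  # Pattern 4: adaptation — geometric decay of the step size
    return best

best = heuristic_search(sphere)
```

Swapping `sphere` for another objective, or the reset rule for a restart schedule, changes the algorithm's character without changing the pattern skeleton.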
Funding: Supported by the Research Incentive Grant 23200 of Zayed University, United Arab Emirates.
Abstract: Cardiovascular disease prediction is a significant area of research in healthcare management systems (HMS). Deaths can only be reduced if cardiac problems are anticipated in advance, and existing machine-learning-based heart disease detection systems have not yet produced sufficient results due to their reliance on available data. We present Clustered Butterfly Optimization Techniques (Rough K-means + BOA) as a new hybrid method for predicting heart disease. The method comprises two phases: clustering data using Rough K-means (RKM) and data analysis using the Butterfly Optimization Algorithm (BOA). The benchmark dataset from the UCI repository is used for our experiments, which are divided into three sets: the first involves the RKM clustering technique, the second evaluates the classification outcomes, and the last validates the performance of the proposed hybrid model. The proposed Rough K-means + BOA achieves a reasonable accuracy of 97.03% and a minimal error rate of 2.97%, comparatively better than other combinations of optimization techniques. In addition, this approach effectively enhances data segmentation, optimization, and classification performance.
Abstract: Quantum computing is a promising technology with the potential to revolutionize many areas of science and technology, including communication. In this review, we discuss the current state of quantum computing in communication and its potential applications in areas such as network optimization, signal processing, and machine learning for communication. First, the basic principles of quantum computing, quantum physical systems, and quantum algorithms are analyzed. Then, based on a classification of quantum algorithms, several important basic quantum algorithms, quantum optimization algorithms, and quantum machine learning algorithms are discussed in detail. Finally, the basic ideas and feasibility of introducing quantum algorithms into communications are analyzed, providing a reference for addressing computational bottlenecks in communication networks.
Abstract: A decentralized network made up of mobile nodes is termed a Mobile Ad-hoc Network (MANET). Mobility and a finite battery lifespan are the two main problems with MANETs, so advanced methods are essential for enhancing MANET security, network longevity, and energy efficiency; selecting an appropriate cluster head further boosts the network's energy effectiveness. A Hybrid Swallow Swarm Optimisation-Memetic Algorithm (SSO-MA) is therefore suggested to improve the energy efficiency of the MANET, and an Abnormality Detection System (ADS) is proposed to secure the network. The MATLAB-2021a platform is used to implement the suggested technique and conduct the analysis. In terms of network performance, the suggested model outperforms the current Genetic Algorithm, Optimised Link State Routing protocol, and Particle Swarm Optimisation techniques, with a minimum delay of about 0.82 seconds and a Packet Delivery Ratio (PDR) of 99.82%. The validation thus shows that the Hybrid SSO-MA strategy is superior to the other approaches in terms of efficiency.
Funding: Funded by the Deanship of Graduate Studies and Scientific Research at Jouf University under grant No. (DGSSR-2023-02-02341).
Abstract: The potential applications of multimodal physiological signals in healthcare, pain monitoring, and clinical decision support systems have garnered significant attention in biomedical research. Conventional pain assessment rests on subjective self-reporting, which may be unreliable; deep learning is a promising alternative that can resolve this limitation through automated pain classification. This paper proposes an ensemble deep-learning framework for pain assessment. The framework uses features collected from electromyography (EMG), skin conductance level (SCL), and electrocardiography (ECG) signals. We integrate Convolutional Neural Network (CNN), Long Short-Term Memory (LSTM), Bidirectional Gated Recurrent Unit (BiGRU), and Deep Neural Network (DNN) models, then aggregate their predictions using a weighted-averaging ensemble technique to increase the classification's robustness. To improve computing efficiency and remove redundant features, we use Particle Swarm Optimization (PSO) for feature selection, reducing the features' dimensionality without sacrificing classification accuracy. The experimental results show that the suggested ensemble model outperforms individual deep learning classifiers, with improved accuracy, precision, recall, and F1-score across all pain levels. In our experiments, the suggested model achieved over 98% accuracy, suggesting promising automated pain assessment performance; however, due to differences in validation protocols, comparisons with previous studies are still limited. Combining deep learning and feature selection techniques significantly improves model generalization, reducing overfitting and enhancing classification performance. The evaluation was conducted using the BioVid Heat Pain Dataset, confirming the model's effectiveness in distinguishing between different pain intensity levels.
Funding: Supported by the National Natural Science Foundation of China (62394340, 62394345, 62473383). This work was carried out in part using computing resources at the High Performance Computing Center of Central South University.
Abstract: Deep neural networks are increasingly exposed to attack threats, and at the same time the need for privacy protection is growing. As a result, the challenge of developing neural networks that are robust, capable of strong generalization, and privacy-preserving becomes pressing. Training neural networks under privacy constraints is one way to minimize privacy leakage, for instance by adding noise to the data or the model. However, noise may cause gradient directions to deviate from the optimal trajectory during training, leading to unstable parameter updates, slow convergence, and reduced model generalization capability. To overcome these challenges, we propose an optimization algorithm based on double-integral coevolutionary neurodynamics (DICND), designed to accelerate convergence and improve generalization in noisy conditions. Theoretical analysis proves the global convergence of the DICND algorithm and demonstrates its ability to converge efficiently to near-global minima under noisy conditions. Numerical simulations and image classification experiments further confirm the DICND algorithm's significant advantages in enhancing generalization performance.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 52409143), the Basic Scientific Research Fund of Changjiang River Scientific Research Institute for Central-level Public Welfare Research Institutes (Grant No. CKSF2025184/YT), and the Hubei Provincial Natural Science Foundation of China (Grant No. 2022CFB673).
Abstract: Accurately forecasting peak particle velocity (PPV) during blasting operations plays a crucial role in mitigating vibration-related hazards and preventing economic losses. This research predicts PPV by combining conventional empirical equations with physics-informed neural networks (PINN) and optimizing the model parameters via the Particle Swarm Optimization (PSO) algorithm. The proposed PSO-PINN framework was rigorously benchmarked against seven established machine learning approaches: Multilayer Perceptron (MLP), Extreme Gradient Boosting (XGBoost), Random Forest (RF), Support Vector Regression (SVR), Gradient Boosting Decision Tree (GBDT), Adaptive Boosting (AdaBoost), and Gene Expression Programming (GEP). Comparative analysis showed that PSO-PINN outperformed these models, achieving RMSE reductions of 17.82%–37.63%, MSE reductions of 32.47%–61.10%, AR improvements of 2.97%–21.19%, and R² enhancements of 7.43%–29.21%, demonstrating superior accuracy and generalization. Furthermore, the study determines the impact of incorporating empirical formulas as physical constraints in neural networks and examines the effects of different empirical equations, particle swarm size, iteration count in PSO, and the regularization coefficient and learning rate in PINN on model performance. Lastly, a predictive system for blast-vibration PPV is designed and implemented. The outcomes offer theoretical references and practical recommendations for blast vibration forecasting in similar engineering applications.
Funding: Supported by the National Natural Science Foundation of China (22408227, 22238005) and the Postdoctoral Research Foundation of China (GZC20231576).
Abstract: The optimization of reaction processes is crucial for the green, efficient, and sustainable development of the chemical industry. However, addressing the problems posed by multiple variables, nonlinearities, and uncertainties during optimization remains a formidable challenge. In this study, a strategy combining interpretable machine learning with metaheuristic optimization algorithms is employed to optimize the reaction process. First, experimental data from a biodiesel production process are collected to establish a database. These data are then used to construct a predictive model based on artificial neural network (ANN) models. Subsequently, interpretable machine learning techniques are applied for quantitative analysis and verification of the model. Finally, four metaheuristic optimization algorithms are coupled with the ANN model to achieve the desired optimization. The results show that the methanol:palm fatty acid distillate (PFAD) molar ratio contributes the most to the reaction outcome, accounting for 41%. The ANN-simulated annealing (SA) hybrid method is the most suitable for this optimization, and the optimal process parameters are a catalyst concentration of 3.00% (mass), a methanol:PFAD molar ratio of 8.67, and a reaction time of 30 min. This study provides deeper insights into reaction process optimization, which will facilitate future applications in various reaction optimization processes.
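How simulated annealing couples to a trained surrogate can be sketched roughly as below. The Gaussian-bump `surrogate_yield` is a hypothetical stand-in for the trained ANN (its peak location merely echoes the optimum reported above), and the linear cooling schedule and proposal scale are illustrative assumptions, not the paper's settings.

```python
import math
import random

def surrogate_yield(cat, ratio, time):
    # Hypothetical stand-in for the trained ANN surrogate: a smooth bump
    # peaking near catalyst 3.0 % (mass), molar ratio 8.7, and 30 min.
    return math.exp(-((cat - 3.0) ** 2 / 2
                      + (ratio - 8.7) ** 2 / 8
                      + (time - 30.0) ** 2 / 200))

def simulated_annealing(f, x0, bounds, iters=2000, t0=1.0, seed=7):
    rng = random.Random(seed)
    x, fx = list(x0), f(*x0)
    best, fbest = list(x), fx
    for k in range(iters):
        T = t0 * (1 - k / iters) + 1e-9  # linear cooling schedule
        # Propose a bounded Gaussian move around the current point.
        cand = [min(hi, max(lo, v + rng.gauss(0, 0.1 * (hi - lo))))
                for v, (lo, hi) in zip(x, bounds)]
        fc = f(*cand)
        # Metropolis rule: accept improvements; accept worse moves with
        # probability exp(Δ/T), which shrinks as T cools.
        if fc > fx or rng.random() < math.exp((fc - fx) / T):
            x, fx = cand, fc
            if fx > fbest:
                best, fbest = list(x), fx
    return best, fbest

bounds = [(1.0, 5.0), (4.0, 12.0), (10.0, 60.0)]  # catalyst %, ratio, minutes
best, fbest = simulated_annealing(surrogate_yield, [2.0, 6.0, 20.0], bounds)
```

In the paper's workflow the surrogate would be the fitted ANN evaluated at candidate process parameters; everything else in the loop stays the same.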
Funding: Supported by the Open Fund of Guangxi Key Laboratory of Building New Energy and Energy Saving (Project Number: Guike Energy 17-J-21-3).
Abstract: Power prediction is critical for large-scale wind power grid connection. However, traditional wind power prediction methods have long suffered from low prediction accuracy and poor reliability. To this end, a hybrid prediction model (VMD-LSTM-Attention) is proposed that integrates variational mode decomposition (VMD), long short-term memory (LSTM), and an attention mechanism, optimized by an improved dung beetle optimization algorithm (IDBO). Firstly, the algorithm's performance is significantly enhanced through three key strategies: an elite group strategy based on the Logistic-Tent map, a nonlinear adjustment factor, and an adaptive t-distribution disturbance mechanism. Subsequently, the IDBO is applied to optimize the important parameters of VMD (the number of decomposition layers and the penalty factor) to obtain the best signal decomposition, and is further deployed to optimize three key hyperparameters of the LSTM, improving its learning capability. Finally, an attention mechanism is incorporated to adaptively weight temporal features, increasing the model's ability to focus on key information. Comprehensive simulation experiments demonstrate that the proposed model achieves higher prediction accuracy than VMD-LSTM, VMD-LSTM-Attention, and traditional prediction methods, and quantitative indexes verify the effectiveness of the algorithmic improvements as well as the model's precision in wind power prediction.
Funding: Funded by Researchers Supporting Project Number (RSPD2025R947), King Saud University, Riyadh, Saudi Arabia.
Abstract: The integration of IoT and deep learning (DL) has significantly advanced real-time health monitoring and predictive maintenance in prognostics and health management (PHM). Electrocardiograms (ECGs) are widely used for cardiovascular disease (CVD) diagnosis, but fluctuating signal patterns make classification challenging. Computer-assisted automated diagnostic tools that enhance ECG signal categorization using sophisticated algorithms and machine learning are helping healthcare practitioners manage larger patient populations. With this motivation, the study proposes a DL framework leveraging the PTB-XL ECG dataset to improve CVD diagnosis. Deep transfer learning (DTL) techniques extract features, followed by feature fusion to eliminate redundancy and retain the most informative features. Using the African Vulture Optimization Algorithm (AVOA) for feature selection is more effective than standard methods, as it offers an ideal balance between exploration and exploitation, yielding an optimal feature set that improves classification performance while reducing redundancy. Various machine learning classifiers, including Support Vector Machine (SVM), eXtreme Gradient Boosting (XGBoost), Adaptive Boosting (AdaBoost), and Extreme Learning Machine (ELM), are used for classification, and an ensemble model is developed to further improve accuracy. Experimental results demonstrate that the proposed model achieves the highest accuracy of 96.31%, highlighting its effectiveness in enhancing CVD diagnosis.
Funding: Funded by Hanoi University of Industry, Hanoi, Vietnam, under contract number 25−2024−RD/HD−DHCN.
Abstract: Efficient warehouse management is critical for modern supply chain systems, particularly in the era of e-commerce and automation. The Multi-Picker Robot Routing Problem (MPRRP) presents a complex challenge involving the optimization of routes for multiple robots assigned to retrieve items from distinct locations within a warehouse. This study introduces optimized metaheuristic strategies to address the MPRRP, with the aim of minimizing travel distances, energy consumption, and order fulfillment time while ensuring operational efficiency. Advanced algorithms, including an enhanced Particle Swarm Optimization (PSO-MPRRP) and a tailored Genetic Algorithm (GA-MPRRP), are designed with customized evolutionary operators to solve the MPRRP effectively. Comparative experiments evaluating the proposed strategies against benchmark approaches demonstrate significant improvements in solution quality and computational efficiency. The findings contribute to the development of intelligent, scalable, and environmentally friendly warehouse systems, paving the way for future advances in robotics and automated logistics management.
Abstract: To address the steering instability and hysteresis of agricultural robots during movement, a fused PID control method combining particle swarm optimization (PSO) and a genetic algorithm (GA) was proposed. The fused algorithm exploits the fast optimization ability of PSO to improve the population-screening step of the GA. Simulink simulation results showed that convergence of the fitness function was accelerated, the system response adjustment time was reduced, and the overshoot was almost zero. The algorithm was then applied to steering tests of an agricultural robot in various scenarios. After modeling the robot's steering system, tests in the unloaded suspended state showed that PID control based on the fused algorithm reduced the rise time, response adjustment time, and overshoot of the system, improving response speed and stability compared with manually tuned PID control and GA-based PID control. Actual road steering tests showed that PID control based on the fused algorithm had the shortest response rise time, about 4.43 s. When the target pulse number was set to 100, the actual mean value in the steady-state regulation stage was about 102.9, the closest to the target among the three control methods, and overshoot was also reduced. Steering tests across scenarios showed that PID control based on the proposed fused algorithm has good anti-interference ability, adapts to changes in environment and load, and improves control system performance. It is effective for agricultural robot steering control and can serve as a reference for the precise steering control of other robots.
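The fused PSO-GA method cannot be reconstructed from the abstract alone, but the underlying idea of tuning PID gains with a swarm optimizer can be sketched as follows. The first-order plant, the squared-error cost, and all PSO coefficients below are illustrative assumptions, not the paper's model.

```python
import random

def pid_step_cost(kp, ki, kd, steps=200, dt=0.05):
    # Discrete simulation of a first-order plant y' = (-y + u) / tau under
    # PID control of a unit step setpoint; cost = integral of squared error.
    tau, y, integ, prev_e, cost = 0.5, 0.0, 0.0, 1.0, 0.0
    for _ in range(steps):
        e = 1.0 - y
        integ += e * dt
        deriv = (e - prev_e) / dt
        u = kp * e + ki * integ + kd * deriv
        prev_e = e
        y += dt * (-y + u) / tau
        if abs(y) > 1e6:
            return 1e9  # unstable gains: return a large penalty cost
        cost += e * e * dt
    return cost

def pso(f, bounds, pop=15, iters=60, seed=3):
    rng = random.Random(seed)
    dim = len(bounds)
    X = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    V = [[0.0] * dim for _ in range(pop)]
    P = [list(x) for x in X]                 # personal bests
    pf = [f(*x) for x in X]
    g = list(P[min(range(pop), key=lambda i: pf[i])])  # global best
    gf = min(pf)
    for _ in range(iters):
        for i in range(pop):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia 0.7, cognitive/social weights 1.5 (common defaults).
                V[i][d] = (0.7 * V[i][d] + 1.5 * r1 * (P[i][d] - X[i][d])
                           + 1.5 * r2 * (g[d] - X[i][d]))
                lo, hi = bounds[d]
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            fx = f(*X[i])
            if fx < pf[i]:
                P[i], pf[i] = list(X[i]), fx
                if fx < gf:
                    g, gf = list(X[i]), fx
    return g, gf

gains, cost = pso(pid_step_cost, [(0.0, 10.0), (0.0, 5.0), (0.0, 1.0)])
```

In the paper's scheme the GA layer would additionally recombine promising gain vectors; here only the PSO half of the fusion is shown.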
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 12402139 and 52368070), the Hainan Provincial Natural Science Foundation of China (Grant No. 524QN223), the Scientific Research Startup Foundation of Hainan University (Grant No. RZ2300002710), the State Key Laboratory of Structural Analysis, Optimization and CAE Software for Industrial Equipment, Dalian University of Technology (Grant No. GZ24107), the Horizontal Research Project (Grant No. HD-KYH-2024022), and Innovative Research Projects for Postgraduate Students in Hainan Province (Grant No. Hys2025-217).
Abstract: Optimization problems are prevalent in many fields of science and engineering, and several real-world applications are characterized by high dimensionality and complex search landscapes. The starfish optimization algorithm (SFOA) is a recent swarm-intelligence-inspired optimizer that is effective for numerical optimization but may suffer premature and local convergence on complex problems. To address these challenges, this paper proposes the multi-strategy enhanced crested porcupine-starfish optimization algorithm (MCPSFOA). The core innovation of MCPSFOA lies in a hybrid strategy that integrates the exploratory mechanisms of SFOA with the diverse search capacity of the Crested Porcupine Optimizer (CPO); this synergy enhances MCPSFOA's ability to navigate complex and multimodal search spaces. To further prevent premature convergence, MCPSFOA incorporates Lévy flight, leveraging its characteristic pattern of long and short jumps to enable large-scale exploration and escape from local optima. Gaussian mutation is then applied for precise solution tuning, introducing controlled perturbations that enhance accuracy and mitigate the risk of insufficient exploitation. Notably, a population diversity enhancement mechanism periodically identifies and resets stagnant individuals, consistently revitalizing population variety throughout the optimization process. MCPSFOA is rigorously evaluated on 24 classical benchmark functions (including high-dimensional cases), the CEC2017 suite, and the CEC2022 suite, achieving superior overall performance with Friedman mean ranks of 2.208, 2.310, and 2.417 and outperforming 11 state-of-the-art algorithms. Furthermore, the practical applicability of MCPSFOA is confirmed through its successful application to five engineering optimization cases, where it also yields excellent results. In conclusion, MCPSFOA is not only a highly effective and reliable optimizer for benchmark functions, but also a practical tool for solving real-world optimization problems.
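The Lévy-flight pattern of "mostly short moves with occasional long jumps" referred to in this abstract (and in several others on this page) is commonly generated with Mantegna's algorithm. The sketch below is a standard textbook form with the usual stability index β = 1.5, not MCPSFOA's specific operator.

```python
import math
import random

def levy_step(beta=1.5, rng=random):
    # Mantegna's algorithm: s = u / |v|^(1/beta) with u ~ N(0, sigma_u^2)
    # and v ~ N(0, 1) yields a heavy-tailed (Lévy-stable-like) step.
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma_u = (num / den) ** (1 / beta)
    u = rng.gauss(0, sigma_u)
    v = rng.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

rng = random.Random(42)
steps = [levy_step(rng=rng) for _ in range(20000)]
typical = sorted(abs(s) for s in steps)[len(steps) // 2]  # median |step|
longest = max(abs(s) for s in steps)
```

In an optimizer, each candidate coordinate would move by `step_scale * levy_step()`, so the search mostly refines locally while rare large jumps help it escape local optima.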
Abstract: Multi-label feature selection (MFS) is a crucial dimensionality reduction technique aimed at identifying informative features associated with multiple labels. However, traditional centralized methods face significant challenges in privacy-sensitive and distributed settings, often neglecting label dependencies and suffering from low computational efficiency. To address these issues, we introduce a novel framework, Fed-MFSDHBCPSO: federated MFS via a dual-layer hybrid breeding cooperative particle swarm optimization algorithm with manifold and sparsity regularization (DHBCPSO-MSR). Leveraging the federated learning paradigm, Fed-MFSDHBCPSO allows clients to perform local feature selection (FS) using DHBCPSO-MSR. Locally selected feature subsets are encrypted with differential privacy (DP) and transmitted to a central server, where they are securely aggregated and refined through secure multi-party computation (SMPC) until global convergence is achieved. Within each client, DHBCPSO-MSR employs a dual-layer FS strategy. The inner layer constructs sample and label similarity graphs, generates Laplacian matrices to capture the manifold structure between samples and labels, and applies L2,1-norm regularization to sparsify the feature subset, yielding an optimized feature weight matrix. The outer layer uses a hybrid breeding cooperative particle swarm optimization algorithm to further refine the feature weight matrix and identify the optimal feature subset; the updated weight matrix is then fed back to the inner layer for further optimization. Comprehensive experiments on multiple real-world multi-label datasets demonstrate that Fed-MFSDHBCPSO consistently outperforms both centralized and federated baseline methods across several key evaluation metrics.
Funding: Supported by the Major Science and Technology Programs in Henan Province (No. 241100210100), the Henan Provincial Science and Technology Research Project (Nos. 252102211085 and 252102211105), the Endogenous Security Cloud Network Convergence R&D Center (No. 602431011PQ1), the Special Project for Research and Development in Key Areas of Guangdong Province (No. 2021ZDZX1098), the Stabilization Support Program of the Science, Technology and Innovation Commission of Shenzhen Municipality (No. 20231128083944001), and the Key Scientific Research Projects of Henan Higher Education Institutions (No. 24A520042).
Abstract: Existing feature selection methods for intrusion detection systems (IDS) in the Industrial Internet of Things often suffer from local optimality and high computational complexity. These challenges hinder traditional IDS from effectively extracting features while maintaining detection accuracy. This paper proposes an Industrial Internet of Things intrusion detection feature selection algorithm based on an improved whale optimization algorithm (GSLDWOA), aiming to address the problems that feature selection algorithms are prone to under high-dimensional data, such as local optimality, long detection time, and reduced accuracy. First, the initial population's diversity is increased using a Gaussian mutation mechanism. Then, a nonlinear shrinking factor balances global exploration and local development, avoiding premature convergence. Lastly, a variable-step Lévy flight operator and a dynamic differential evolution strategy are introduced to improve the algorithm's search efficiency and convergence accuracy in high-dimensional feature space. Experiments on the NSL-KDD and WUSTL-IIoT-2021 datasets demonstrate that the feature subset selected by GSLDWOA significantly improves detection performance: compared with the traditional WOA, the detection rate and F1-score increased by 3.68% and 4.12%, and on the WUSTL-IIoT-2021 dataset, accuracy, recall, and F1-score all exceed 99.9%.
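The Gaussian mutation mechanism this abstract uses to diversify the population can be sketched generically as below. The mutation rate, noise scale, and bound-clipping rule are illustrative assumptions; GSLDWOA's actual operator is not specified in the abstract.

```python
import random

def gaussian_mutation(pop, lb, ub, rate=0.2, scale=0.1, rng=random):
    # Perturb each coordinate with probability `rate` by Gaussian noise
    # scaled to the search range, then clip back into [lb, ub].
    out = []
    for x in pop:
        out.append([min(ub, max(lb, v + rng.gauss(0, scale * (ub - lb))))
                    if rng.random() < rate else v
                    for v in x])
    return out

rng = random.Random(0)
pop = [[0.5] * 4 for _ in range(30)]  # a fully stagnant (identical) population
mutated = gaussian_mutation(pop, 0.0, 1.0, rng=rng)
changed = sum(any(a != b for a, b in zip(x, y)) for x, y in zip(pop, mutated))
```

Applied to a converged population like the one above, the operator re-scatters a fraction of individuals while leaving the rest in place, which is exactly the diversity-versus-exploitation trade-off the rate parameter controls.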
Funding: Co-supported by the National Natural Science Foundation of China (Nos. 62473110 and 62403166), the Fundamental Research Funds for the Central Universities, China (No. 2023FRFK02043), the Natural Science Foundation of Heilongjiang Province, China (No. LH2022F023), and the National Key Laboratory of Space Intelligent Control Foundation, China (No. 2023-JCJQ-LB-006-19).
Abstract: With the increasing number of geosynchronous orbit satellites approaching the end of their lifetimes, spacecraft refueling is crucial to enhancing the economic benefits of on-orbit services. Existing studies tend to assume a predetermined refueling duration; however, a precise mission scheduling solution is difficult to apply when the refueling duration is uncertain due to orbital transfer deviations and stochastic actuator faults during actual on-orbit service. Therefore, this paper proposes a robust mission scheduling strategy for geosynchronous orbit spacecraft on-orbit refueling missions with uncertain refueling duration. Firstly, a robust mission scheduling model is constructed by introducing a budget uncertainty set to describe the uncertain refueling duration. Secondly, a hybrid Harris hawks optimization algorithm is designed to explore the optimal mission allocation and refueling sequences; it combines cubic chaotic mapping to initialize the population and introduces the crossover operator of the genetic algorithm to enhance global convergence. Finally, typical simulation examples are constructed with realistic mission scenarios for three analyses: performance comparisons with various algorithms; robustness analyses via comparisons of different on-orbit refueling durations; and investigations into the impact of different initial population strategies on algorithm performance. Comparison with exact mission scheduling demonstrates the proposed framework's robustness and effectiveness.
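Chaotic-map population initialization, as used by the hybrid Harris hawks algorithm above, can be illustrated with a cubic map. The paper does not give its exact map parameters; the form and the coefficient 2.595 below are a common illustrative choice that keeps the orbit inside (0, 1), from which values are rescaled to the search bounds.

```python
def cubic_map_population(pop_size, dim, lb, ub, x0=0.3, rho=2.595):
    # Cubic chaotic map: x_{k+1} = rho * x_k * (1 - x_k^2), iterated on (0, 1).
    # Successive iterates fill the interval more evenly than independent
    # pseudo-random draws, which is why chaotic initialization is used.
    pts, x = [], x0
    for _ in range(pop_size):
        ind = []
        for _ in range(dim):
            x = rho * x * (1 - x * x)
            ind.append(lb + (ub - lb) * x)  # rescale from (0, 1) to [lb, ub]
        pts.append(ind)
    return pts

pop = cubic_map_population(20, 3, -5.0, 5.0)
```

A real optimizer would take these points as the initial swarm in place of `random.uniform` sampling; the rest of the algorithm is unchanged.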
Abstract: Amidst the growing global emphasis on nuclear safety, the integrity of nuclear reactor systems has garnered attention in the aftermath of consequential events. Moreover, the rapid development of artificial intelligence technology has provided immense opportunities to enhance the safety and economy of nuclear energy. However, data-driven deep learning techniques often lack interpretability, which hinders their applicability in the nuclear energy sector. To address this problem, this study proposes a hybrid data-driven and knowledge-driven artificial intelligence model based on physics-informed neural networks to accurately compute the neutron flux distribution inside a nuclear reactor core. Innovative techniques, such as regional decomposition, intelligent k_eff (effective multiplication factor) search, and k_eff inversion, have been introduced for the calculation. Furthermore, the model's hyperparameters are automatically optimized using a whale optimization algorithm. A series of computational examples validate the proposed model, demonstrating its applicability, generality, and high accuracy in calculating the neutron flux within the nuclear reactor. The model offers a dependable strategy for computing the neutron flux distribution in nuclear reactors for advanced simulation techniques in the future, including reactor digital twinning. The approach is data-light, requires little to no training data, and still delivers remarkably precise results.
Funding: Project 52005358 supported by the National Natural Science Foundation of China; Project 2018YFB1307902 supported by the National Key R&D Program of China; Project 201901D111243 supported by the Natural Science Foundation of Shanxi Province, China; Project 2019-KF-25-05 supported by the Natural Science Foundation of Liaoning Province, China.
Abstract: To compensate for the quality defects of traditional control methods and meet the growing accuracy requirements for strip crown, an optimized model based on the support vector machine (SVM) is first put forward to enhance product quality in hot strip rolling. To enrich data information and ensure data quality, experimental data were collected from a hot-rolling plant to set up the prediction models, and the prediction performance of the models was evaluated by calculating multiple indicators. Furthermore, the traditional SVM model and combined prediction models using the particle swarm optimization (PSO) algorithm and principal component analysis combined with cuckoo search (PCA-CS) are presented for comparison, and the prediction performance of the three models is discussed. The experimental results reveal that the PCA-CS-SVM model has the highest prediction accuracy and the fastest convergence speed: its root mean squared error (RMSE) is 2.04 μm, and 98.15% of predictions have an absolute error of less than 4.5 μm. The results also prove that the PCA-CS-SVM model not only satisfies the precision requirement but also has guiding significance for the actual production of hot strip rolling.