Cardiovascular disease prediction is a significant area of research in healthcare management systems (HMS). We will only be able to reduce the number of deaths if we anticipate cardiac problems in advance. Existing heart disease detection systems based on machine learning have not yet produced sufficient results because they rely heavily on the available data. We present the Clustered Butterfly Optimization Technique (Rough K-means + BOA), a new hybrid method for predicting heart disease. The method comprises two phases: clustering the data with Rough K-means (RKM) and analyzing the data with the Butterfly Optimization Algorithm (BOA). The benchmark dataset from the UCI repository is used for our experiments, which are divided into three sets: the first applies the RKM clustering technique, the second evaluates the classification outcomes, and the third validates the performance of the proposed hybrid model. The proposed Rough K-means + BOA achieves an accuracy of 97.03% with a minimal error rate of 2.97%, which compares favorably with other combinations of optimization techniques. In addition, the approach effectively enhances data segmentation, optimization, and classification performance.
Funding: Supported by the Research Incentive Grant 23200 of Zayed University, United Arab Emirates.
With the development of information technology, large volumes of product quality data are accumulated across the entire manufacturing process, but these data are not explored and used effectively. Traditional product quality prediction models have many disadvantages, such as high complexity and low accuracy. To overcome these problems, we propose an optimized data equalization method to pre-process the dataset and design a simple but effective product quality prediction model: a radial basis function model optimized by the firefly algorithm with a Levy flight mechanism (RBFFALM). First, the new data equalization method is introduced to pre-process the dataset, which reduces the dimensionality of the data, removes redundant features, and improves the data distribution. Then the RBFFALM is used to predict product quality. Comprehensive experiments conducted on real-world product quality datasets validate that the new RBFFALM model, combined with the new data pre-processing method, outperforms previous methods in predicting product quality.
Funding: Supported by the National Science and Technology Innovation 2030 Next-Generation Artificial Intelligence Major Project (2018AAA0101801) and the National Natural Science Foundation of China (72271188).
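To illustrate the Levy-flight mechanism the firefly-based optimizer relies on, the sketch below draws a Levy step with Mantegna's algorithm; the exponent beta = 1.5 and the Mantegna formulation are the usual defaults rather than details taken from the abstract.

```python
import numpy as np
from math import gamma, sin, pi

def levy_step(dim, beta=1.5):
    """Draw one Levy-flight step via Mantegna's algorithm, the standard way a
    firefly/cuckoo-style optimizer injects occasional long jumps.
    beta = 1.5 is the common default, not a value stated in the paper."""
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0, sigma, dim)   # heavy-tailed numerator
    v = np.random.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)    # step lengths follow a Levy distribution
```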
Under partial shading conditions (PSC) of photovoltaic (PV) modules in a PV hybrid system, the power output curve exhibits multiple peaks. This often causes traditional maximum power point tracking (MPPT) methods to fall into local optima and fail to find the global optimum. To address this issue, a composite MPPT algorithm is proposed that combines the improved Kepler optimization algorithm (IKOA) with an optimized variable-step perturb and observe method (OIP&O). The update probabilities, planetary velocity, and position step coefficients of IKOA are nonlinearly and adaptively optimized; this adaptation meets the varying needs of the initial and later stages of the iterative process and accelerates convergence. During stochastic exploration, the refined position update formulas enhance diversity and global search capability, and these improvements reduce the likelihood of falling into local optima. In the later stages, the OIP&O algorithm decreases oscillation and increases accuracy. Compared with cuckoo search (CS) and gray wolf optimization (GWO), simulation tests of the PV hybrid inverter demonstrate that the proposed IKOA-OIP&O algorithm achieves faster convergence and greater stability under static, local, and dynamic shading conditions. These results confirm the feasibility and effectiveness of the proposed MPPT algorithm for PV hybrid systems.
Funding: Supported by the Graduate Practice Innovation Program of Jiangsu University of Technology (XSJCX23_58), the Changzhou Science and Technology Support Project (CE20235045), and the Open Project of the Jiangsu Key Laboratory of Power Transmission & Distribution Equipment Technology (2021JSSPD12).
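The variable-step perturb-and-observe stage can be pictured with the minimal sketch below, which scales the perturbation step by |dP/dV| so that steps shrink near the maximum power point; the gain k and the step limits are illustrative assumptions, not the OIP&O settings from the paper.

```python
def variable_step_po(v_prev, p_prev, v_now, p_now, v_ref,
                     k=0.05, step_min=0.001, step_max=0.5):
    """One cycle of variable-step perturb & observe (P&O) MPPT.
    k, step_min and step_max are placeholder tuning values."""
    dp, dv = p_now - p_prev, v_now - v_prev
    if dv == 0:
        return v_ref                                   # no perturbation information this cycle
    step = min(max(k * abs(dp / dv), step_min), step_max)  # adaptive step size
    # Climb the P-V curve: move the reference in the direction that raised power.
    if dp > 0:
        v_ref += step if dv > 0 else -step
    else:
        v_ref -= step if dv > 0 else -step
    return v_ref
```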
The denoising of microseismic signals is a prerequisite for subsequent analysis and research. In this research, a new microseismic signal denoising algorithm, the Black Widow Optimization Algorithm (BWOA) optimized Variational Mode Decomposition (VMD) joint Wavelet Threshold Denoising (WTD) algorithm (BVW), is proposed. The BVW algorithm integrates VMD and WTD, both of which are optimized by BWOA. Specifically, the algorithm uses VMD to decompose the noisy microseismic signal into several Band-Limited Intrinsic Mode Functions (BLIMFs). The BLIMFs whose correlation coefficients with the noisy signal exceed a threshold are selected as effective mode functions, and these are denoised using WTD to filter out the residual low- and intermediate-frequency noise. Finally, the denoised microseismic signal is obtained through reconstruction. The ideal values of the VMD and WTD parameters are found by searching with BWOA, which achieves the best VMD decomposition performance and avoids the reliance on experience and the large workload involved in applying the WTD algorithm. Simulated experiments indicate that the algorithm achieves good denoising performance under noise of different intensities, significantly better than the commonly used VMD and Empirical Mode Decomposition (EMD) algorithms: the BVW algorithm filters noise more efficiently, the denoised waveform is smoother and its amplitude is closest to the original signal, and the signal-to-noise ratio (SNR) and root mean square error after denoising are more satisfactory. A case study based on the Fankou Lead-Zinc Mine shows that for on-site microseismic signals with different noise intensities, the BVW algorithm filters noise more efficiently than VMD and EMD and yields a higher SNR after denoising.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 51874350 and 52304127), the Fundamental Research Funds for the Central Universities of Central South University (Grant No. 2020zzts200), the Science Foundation of Fuzhou University (Grant No. 511229), and the Fuzhou University Testing Fund of Precious Apparatus (Grant No. 2024T040).
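A compact sketch of the mode-selection and wavelet-thresholding step is shown below; the correlation threshold, the db4 wavelet, the decomposition level, and the universal soft-threshold rule are illustrative choices standing in for the BWOA-optimized parameters.

```python
import numpy as np
import pywt

def select_and_denoise(blimfs, signal, corr_thresh=0.3, wavelet="db4", level=4):
    """Keep BLIMFs well correlated with the noisy signal, soft-threshold their
    wavelet coefficients, and sum the reconstructions. All parameter values here
    are placeholders, not the values the BWOA search would return."""
    denoised = np.zeros_like(signal, dtype=float)
    for mode in blimfs:
        if abs(np.corrcoef(mode, signal)[0, 1]) < corr_thresh:
            continue                                          # discard noise-dominated modes
        coeffs = pywt.wavedec(mode, wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate from finest scale
        thr = sigma * np.sqrt(2 * np.log(len(mode)))          # universal threshold
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        denoised += pywt.waverec(coeffs, wavelet)[: len(mode)]
    return denoised
```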
Phishing attacks present a persistent and evolving threat in the cybersecurity landscape, necessitating the development of more sophisticated detection methods. Traditional machine learning approaches to phishing detection have relied heavily on feature engineering and have often fallen short in adapting to the dynamically changing patterns of phishing Uniform Resource Locators (URLs). Addressing these challenges, we introduce a framework that integrates the sequential data processing strengths of a Recurrent Neural Network (RNN) with the hyperparameter optimization prowess of the Whale Optimization Algorithm (WOA). Our model capitalizes on an extensive Kaggle dataset featuring over 11,000 URLs, each delineated by 30 attributes. The WOA's hyperparameter optimization enhances the RNN's performance, as evidenced by a meticulous validation process. The results, encapsulated in precision, recall, and F1-score metrics, surpass baseline models, achieving an overall accuracy of 92%. This study not only demonstrates the RNN's proficiency in learning complex patterns but also underscores the WOA's effectiveness in refining machine learning models for the critical task of phishing detection.
Funding: Supported by Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2024R343), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia, and by the Deanship of Scientific Research at Northern Border University, Arar, Kingdom of Saudi Arabia, through project number NBU-FFR-2024-1092-02.
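The core WOA position-update loop used to search hyperparameters can be sketched as follows; the `fitness` callback (for example, the validation loss of a candidate RNN configuration) and the population and iteration sizes are hypothetical, and only the standard WOA update rules are shown.

```python
import numpy as np

def woa_search(fitness, dim, lower, upper, n_whales=10, iters=30, b=1.0):
    """Minimize `fitness` over a box-bounded search space with the standard
    Whale Optimization Algorithm updates (encircling, random search, spiral)."""
    X = np.random.uniform(lower, upper, (n_whales, dim))
    best = min(X, key=fitness).copy()
    best_f = fitness(best)
    for t in range(iters):
        a = 2 - 2 * t / iters                              # decreases linearly from 2 to 0
        for i in range(n_whales):
            r = np.random.rand(dim)
            A, C = 2 * a * r - a, 2 * np.random.rand(dim)
            if np.random.rand() < 0.5:
                if np.all(np.abs(A) < 1):                  # exploit: encircle the best whale
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                                      # explore: move toward a random whale
                    rand = X[np.random.randint(n_whales)]
                    X[i] = rand - A * np.abs(C * rand - X[i])
            else:                                          # bubble-net spiral around the best whale
                l = np.random.uniform(-1, 1)
                X[i] = np.abs(best - X[i]) * np.exp(b * l) * np.cos(2 * np.pi * l) + best
            X[i] = np.clip(X[i], lower, upper)
            f = fitness(X[i])
            if f < best_f:
                best, best_f = X[i].copy(), f
    return best, best_f
```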
Accurate short-term wind power forecasting plays a crucial role in maintaining the safety and economic efficiency of smart grids. Although numerous studies have employed various methods to forecast wind power, there remains a research gap in leveraging swarm intelligence algorithms to optimize the hyperparameters of the Transformer model for wind power prediction. To improve the accuracy of short-term wind power forecasts, this paper proposes a hybrid forecasting approach named STL-IAOA-iTransformer, based on seasonal and trend decomposition using LOESS (STL) and an iTransformer model optimized by an improved arithmetic optimization algorithm (IAOA). First, to fully extract the power data features, STL decomposes the original data into components with less redundant information. The extracted components, together with the weather data, are then input into iTransformer for short-term wind power forecasting, and the final predicted wind power curve is obtained by combining the predicted components. To improve model accuracy, IAOA is employed to optimize the hyperparameters of iTransformer. The proposed approach is validated using real generation data from different seasons and different power stations in Northwest China, and ablation experiments have been conducted. Furthermore, to validate its superiority under different wind characteristics, real power generation data from Southwest China are also used. Comparisons with six other state-of-the-art prediction models show that the proposed model fits the true generation series well and achieves high prediction accuracy.
Funding: Supported by the Yunnan Provincial Basic Research Project (202401AT070344, 202301AT070443), the National Natural Science Foundation of China (62263014, 52207105), the Yunnan Lancang-Mekong International Electric Power Technology Joint Laboratory (202203AP140001), and Major Science and Technology Projects in Yunnan Province (202402AG050006).
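The STL decomposition step can be reproduced with statsmodels as sketched below; the 15-minute daily period of 96 samples and the robust-fitting flag are assumptions, not settings reported in the abstract.

```python
import pandas as pd
from statsmodels.tsa.seasonal import STL

def decompose_wind_power(series: pd.Series, period: int = 96):
    """Split a wind-power series into trend, seasonal and residual components
    before feeding them to the forecaster. period=96 assumes 15-min sampling
    with a daily cycle; adjust to the actual data resolution."""
    result = STL(series, period=period, robust=True).fit()
    return result.trend, result.seasonal, result.resid
```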
Previous studies have shown that deep learning is very effective in detecting known attacks. However, when facing unknown attacks, models such as Deep Neural Networks (DNN) combined with Long Short-Term Memory (LSTM), or Convolutional Neural Networks (CNN) combined with LSTM, are built by simple stacking, which leads to feature loss, low efficiency, and low accuracy. Therefore, this paper proposes an autonomous detection model for Distributed Denial of Service attacks, the Multi-Scale Convolutional Neural Network-Bidirectional Gated Recurrent Units-Single Headed Attention (MSCNN-BiGRU-SHA) model, based on a Multi-strategy Integrated Zebra Optimization Algorithm (MI-ZOA). The model is trained and tested with the CICDDoS2019 dataset, and its performance is evaluated on a new GINKS2023 dataset. The hyperparameters for Conv_filter and GRU_unit are optimized using the MI-ZOA. The experimental results show that the test accuracy of the MSCNN-BiGRU-SHA model based on the MI-ZOA reaches 0.9971 on the CICDDoS2019 dataset, and the evaluation accuracy on the GINKS2023 dataset created in this paper is 0.9386. Compared to the MSCNN-BiGRU-SHA model based on the standard Zebra Optimization Algorithm (ZOA), the detection accuracy on the GINKS2023 dataset improves by 5.81%, precision increases by 1.35%, recall improves by 9%, and the F1 score increases by 5.55%. Compared to MSCNN-BiGRU-SHA models tuned with Grid Search, Random Search, and Bayesian Optimization, the MI-ZOA-optimized model exhibits better accuracy, precision, recall, and F1 score.
Funding: Supported by the Science and Technology Innovation Program for Postgraduate Students in IDP Subsidized by Fundamental Research Funds for the Central Universities (Project No. ZY20240335), the Research Project of the Key Technology of Malicious Code Detection Based on Data Mining in APT Attack (Project No. 2022IT173), and the Research Project of the Big Data Sensitive Information Supervision Technology Based on Convolutional Neural Network (Project No. 2022011033).
Uneven power distribution and transient voltage and frequency deviations are observed in the photovoltaic storage hybrid inverter during switching between grid-connected and island modes. In response to these issues, this paper proposes a grid-connected/island switching control strategy for photovoltaic storage hybrid inverters based on a modified chimpanzee optimization algorithm. The proposed strategy incorporates coupling compensation and power differentiation elements into traditional droop control and combines them with the angular frequency and voltage amplitude adjustments provided by a phase-locked-loop-free pre-synchronization control strategy. Precise pre-synchronization is achieved by regulating the virtual current to zero and aligning the photovoltaic storage hybrid inverter with the grid voltage. Additionally, two novel operators, learning and emotional behaviors, are introduced to enhance the optimization precision of the chimpanzee algorithm. These operators ensure high-precision, high-reliability optimization of the droop control parameters for photovoltaic storage hybrid inverters. A Simulink model was constructed for simulation analysis, which validated the optimized control strategy's ability to distribute power evenly under load transients. The strategy effectively mitigated transient voltage and current surges during mode transitions, so seamless and efficient switching between grid-connected and island modes was achieved for the photovoltaic storage hybrid inverter. The enhanced energy utilization efficiency, in turn, offers robust technical support for grid stability.
Funding: Supported by the Postgraduate Research & Practice Innovation Program of Jiangsu Province (SJCX23_1633), the 2023 University Student Innovation and Entrepreneurship Training Program (202311463009Z), the Changzhou Science and Technology Support Project (CE20235045), and the Open Project of the Jiangsu Key Laboratory of Power Transmission & Distribution Equipment Technology (2021JSSPD12).
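For reference, the conventional P-f / Q-V droop relations that the proposed strategy builds on can be written as in the sketch below; the droop gains and nominal values are placeholders rather than the parameters tuned by the modified chimpanzee algorithm.

```python
def droop_setpoints(P, Q, P0=0.0, Q0=0.0, f0=50.0, V0=311.0, m=1e-5, n=1e-3):
    """Conventional droop control for an inverter: frequency and voltage
    references fall as active/reactive power output rises. m, n, f0, V0
    are illustrative values only."""
    f_ref = f0 - m * (P - P0)   # active power - frequency droop
    V_ref = V0 - n * (Q - Q0)   # reactive power - voltage droop
    return f_ref, V_ref
```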
Heuristic optimization algorithms have been widely used to solve complex optimization problems in fields such as engineering, economics, and computer science. These algorithms are designed to find high-quality solutions efficiently by balancing exploration of the search space with exploitation of promising solutions. While heuristic optimization algorithms vary in their specific details, they often exhibit common patterns that are essential to their effectiveness. This paper analyzes and explores those common patterns. Through a comprehensive review of the literature, we identify the patterns commonly observed in these algorithms, including initialization, local search, diversity maintenance, adaptation, and stochasticity. For each pattern, we describe the motivation behind it, its implementation, and its impact on the search process. To demonstrate the utility of our analysis, we identify these patterns in multiple heuristic optimization algorithms; for each case study, we analyze how the patterns are implemented and how they contribute to the algorithm's performance. These case studies show how our analysis can be used to understand the behavior of heuristic optimization algorithms and guide the design of new ones. Our analysis reveals that patterns in heuristic optimization algorithms are essential to their effectiveness, and that by understanding and incorporating these patterns, researchers can develop more efficient and effective optimization algorithms.
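A minimal skeleton that makes the identified patterns concrete is given below; the specific operators (a random restart for diversity maintenance, a geometrically shrinking step for adaptation) are illustrative placeholders rather than constructs taken from the paper.

```python
import random

def generic_metaheuristic(objective, sample, perturb, pop_size=30, iters=200):
    """Skeleton showing the recurring patterns: initialization, stochastic local
    search, diversity maintenance, and adaptation. `sample()` draws a random
    solution and `perturb(x, step)` returns a neighbor (both user-supplied)."""
    pop = [sample() for _ in range(pop_size)]                 # initialization
    best = min(pop, key=objective)
    step = 1.0
    for _ in range(iters):
        for i, x in enumerate(pop):
            cand = perturb(x, step)                           # stochastic local search
            if objective(cand) < objective(x):
                pop[i] = cand
        if random.random() < 0.05:                            # diversity maintenance
            pop[random.randrange(pop_size)] = sample()
        step *= 0.99                                          # adaptation of step size
        best = min(pop + [best], key=objective)
    return best
```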
The uncertain nature of mapping user tasks to Virtual Machines (VMs) causes system failures or execution delays in cloud computing, so load balancing is needed to maximize cloud resource throughput and decrease user response time. Most swarm-intelligent dynamic load balancing solutions that use hybrid metaheuristic algorithms fail to balance exploitation and exploration, and most load balancing methods cannot handle the growing uncertainty in distributing jobs to VMs. The Hybrid Spotted Hyena and Whale Optimization Algorithm-based Dynamic Load Balancing Mechanism (HSHWOA) therefore partitions traffic among numerous VMs or servers to guarantee that user tasks are completed quickly. This load balancing approach improves performance by considering average network latency, dependability, and throughput. The hybridization of SHOA and WOA aims to improve the trade-off between exploration and exploitation, assign jobs to VMs with more solution diversity, and prevent the solution from converging to a local optimum. Pysim-based experimental verification and testing of the proposed HSHWOA showed a 12.38% improvement in minimized makespan, a 16.21% increase in mean throughput, and a 14.84% increase in network stability compared to baseline load balancing strategies such as the Fractional Improved Whale Social Optimization Based VM Migration Strategy (FIWSOA), HDWOA, and Binary Bird Swap.
Cloud computing has become an essential technology for managing and processing large datasets, offering scalability, high availability, and fault tolerance. However, optimizing data replication across multiple data centers poses a significant challenge, especially when balancing competing goals such as latency, storage costs, energy consumption, and network efficiency. This study introduces a novel dynamic optimization algorithm called Dynamic Multi-Objective Gannet Optimization (DMGO), designed to enhance data replication efficiency in cloud environments. Unlike traditional static replication systems, DMGO adapts dynamically to variations in network conditions, system demand, and resource availability. The approach uses multi-objective optimization to balance data access latency, storage efficiency, and operational costs. DMGO continually evaluates data center performance and adjusts the replication strategy in real time to maintain optimal system efficiency. Experimental evaluations conducted in a simulated cloud environment demonstrate that DMGO significantly outperforms conventional static algorithms, achieving faster data access, lower storage overhead, reduced energy consumption, and improved scalability. The proposed methodology offers a robust and adaptable solution for modern cloud systems, ensuring efficient resource utilization while maintaining high performance.
This research presents a novel nature-inspired metaheuristic optimization algorithm called the Narwhale Optimization Algorithm (NWOA). The algorithm draws inspiration from the foraging and prey-hunting strategies of narwhals, the "unicorns of the sea", particularly the use of their distinctive spiral tusks, which play significant roles in hunting, prey searching, navigation, echolocation, and complex social interaction. The NWOA imitates the foraging strategies and techniques of narwhals when hunting for prey, focusing mainly on the cooperative and exploratory behavior shown during group hunting and on the use of their tusks to sense and locate prey under the Arctic ice. These behaviors provide a strong basis for investigating the algorithm's ability to balance exploration and exploitation, its convergence speed, and its solution accuracy. The performance of the NWOA is evaluated on 30 benchmark test functions, and a comparison study against the Grey Wolf Optimizer (GWO), Whale Optimization Algorithm (WOA), Perfumer Optimization Algorithm (POA), Candle Flame Optimization (CFO) Algorithm, Particle Swarm Optimization (PSO) Algorithm, and Genetic Algorithm (GA) validates the results. As evidenced by the experimental results, NWOA yields competitive outcomes among these well-known optimizers and outperforms them in several instances. These results suggest that NWOA is an effective and robust optimization tool suitable for solving many different complex real-world optimization problems.
Quantum computing is a promising technology that has the potential to revolutionize many areas of science and technology, including communication. In this review, we discuss the current state of quantum computing in communication and its potential applications in areas such as network optimization, signal processing, and machine learning for communication. First, the basic principles of quantum computing, quantum physical systems, and quantum algorithms are analyzed. Then, based on a classification of quantum algorithms, several important basic quantum algorithms, quantum optimization algorithms, and quantum machine learning algorithms are discussed in detail. Finally, the basic ideas behind introducing quantum algorithms into communications, and their feasibility, are analyzed in depth, providing a reference for addressing computational bottlenecks in communication networks.
A decentralized network made up of mobile nodes is termed a Mobile Ad-hoc Network (MANET). Mobility and a finite battery lifespan are the two main problems with MANETs, so advanced methods are essential for enhancing MANET security, network longevity, and energy efficiency. Selecting an appropriate cluster head further boosts the network's energy effectiveness. Accordingly, a Hybrid Swallow Swarm Optimisation-Memetic Algorithm (SSO-MA) is suggested to improve the energy efficiency of the MANET, and an Abnormality Detection System (ADS) is proposed to secure the network. The MATLAB-2021a platform is used to implement the suggested technique and conduct the analysis. In terms of network performance, the suggested model outperforms the existing Genetic Algorithm, Optimised Link State Routing protocol, and Particle Swarm Optimisation techniques, achieving a minimum delay of 0.82 seconds and a Packet Delivery Ratio (PDR) of 99.82%. The validation therefore shows that the Hybrid SSO-MA strategy is more efficient than the other approaches.
Software defect prediction (SDP) aims to find reliable methods for predicting defects in specific software projects, helping software engineers allocate limited resources to release high-quality software products. Software defect prediction can be performed effectively using traditional features, but some of these features are redundant or irrelevant (their presence or absence has little effect on the prediction results). These problems can be addressed using feature selection. However, existing feature selection methods have shortcomings such as an insignificant dimensionality reduction effect and low classification accuracy of the selected optimal feature subset. To reduce the impact of these shortcomings, this paper proposes a new feature selection method, the Cubic TraverseMa Beluga whale optimization algorithm (CTMBWO), based on an improved Beluga whale optimization algorithm (BWO). The goal of this study is to determine how well the CTMBWO can extract the features that matter most for correctly predicting software defects, improve the accuracy of fault prediction, reduce the number of selected features, and mitigate the risk of overfitting, thereby achieving more efficient resource utilization and better distribution of the test workload. The CTMBWO comprises three main stages: preprocessing the dataset, selecting relevant features, and evaluating the classification performance of the model. The novel feature selection method can effectively improve the performance of SDP. Experiments on two software defect datasets (PROMISE, NASA) report the method's classification performance using five evaluation metrics: Accuracy, F1-score, MCC, AUC, and Recall. The results indicate that the approach achieves outstanding classification performance on both datasets and improves significantly over the baseline models.
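A wrapper-style fitness commonly minimized by binary whale-type feature selectors is sketched below; the KNN classifier, five-fold cross-validation, and alpha = 0.99 are illustrative assumptions, not the CTMBWO's actual configuration.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def fs_fitness(mask, X, y, alpha=0.99):
    """Fitness to minimize: alpha * classification error + (1 - alpha) * ratio of
    selected features. `mask` is a binary vector over the feature columns."""
    idx = np.flatnonzero(mask)
    if idx.size == 0:
        return 1.0                                   # an empty subset is the worst case
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, idx], y, cv=5).mean()
    return alpha * (1 - acc) + (1 - alpha) * idx.size / X.shape[1]
```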
To address the limitations of the standard Chimp Optimization Algorithm (ChOA), such as inadequate search ability and susceptibility to local optima in Unmanned Aerial Vehicle (UAV) path planning, this paper proposes a three-dimensional path planning method for UAVs based on an Improved Chimp Optimization Algorithm (IChOA). First, the terrain and obstacle environments are modeled spatially and the total UAV flight cost function is formulated according to the constraints, transforming the path planning problem into an optimization problem with multiple constraints. Second, the diversity of the chimpanzee population is enhanced by applying a Sine chaos mapping strategy, and a nonlinear convergence factor is introduced to improve the algorithm's search accuracy and convergence speed. Finally, a dynamic adjustment strategy for the number of chimpanzee advance echelons is proposed, which effectively balances global exploration and local exploitation and significantly improves the algorithm's search performance. To validate the effectiveness of the IChOA, experimental comparisons are conducted with eight other intelligent algorithms. The results demonstrate that the IChOA outperforms the selected comparison algorithms in terms of practicality and robustness in UAV 3D path planning: it finds the shortest path efficiently and ensures high stability during execution.
Funding: Supported by the Shaanxi Province Natural Science Basic Research Program Project (2024JC-YBMS-572), partially funded by the Yan'an University Graduate Education Innovation Program Project (YCX2023032, YCX2023033, YCX2024094, YCX2024097), and by the "14th Five Year Plan Medium and Long Term Major Scientific Research Project" (2021ZCQ015) of Yan'an University.
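The Sine chaos mapping strategy for population initialization can be sketched as below; the map form x_{k+1} = (mu/4)·sin(pi·x_k) with mu = 4 follows the common chaos-initialization convention and may differ from the paper's exact variant.

```python
import numpy as np

def sine_chaos_population(pop_size, dim, lower, upper, mu=4.0):
    """Initialize a population by iterating the sine chaotic map instead of
    drawing uniform random numbers, to spread individuals more evenly across
    the search space. The map form and mu = 4 are assumptions."""
    x = np.random.rand(dim)                      # seed vector in [0, 1)
    pop = np.empty((pop_size, dim))
    for i in range(pop_size):
        x = (mu / 4.0) * np.sin(np.pi * x)       # one iteration of the sine map
        pop[i] = lower + x * (upper - lower)     # scale into the search bounds
    return pop
```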
The potential applications of multimodal physiological signals in healthcare, pain monitoring, and clinical decision support systems have garnered significant attention in biomedical research. Conventional pain assessment methods rest on subjective self-reporting, which may be unreliable; deep learning is a promising alternative for resolving this limitation through automated pain classification. This paper proposes an ensemble deep-learning framework for pain assessment that uses features collected from electromyography (EMG), skin conductance level (SCL), and electrocardiography (ECG) signals. We integrate Convolutional Neural Network (CNN), Long Short-Term Memory (LSTM), Bidirectional Gated Recurrent Unit (BiGRU), and Deep Neural Network (DNN) models, and aggregate their predictions using a weighted averaging ensemble technique to increase the robustness of the classification. To improve computational efficiency and remove redundant features, we use Particle Swarm Optimization (PSO) for feature selection, which reduces the feature dimensionality without sacrificing classification accuracy. The experimental results show that the suggested ensemble model outperforms the individual deep learning classifiers, with improved accuracy, precision, recall, and F1-score across all pain levels. In our experiments, the model achieved over 98% accuracy, suggesting promising automated pain assessment performance, although comparisons with previous studies remain limited because of differences in validation protocols. Combining deep learning with feature selection significantly improves model generalization, reducing overfitting and enhancing classification performance. The evaluation was conducted on the BioVid Heat Pain Dataset, confirming the model's effectiveness in distinguishing between different pain intensity levels.
Funding: Funded by the Deanship of Graduate Studies and Scientific Research at Jouf University under grant No. DGSSR-2023-02-02341.
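The weighted-averaging fusion step can be expressed as in the sketch below, which averages per-model class-probability matrices; equal weights are assumed when none are supplied, whereas the paper's actual weights may be tuned differently.

```python
import numpy as np

def weighted_average_ensemble(prob_list, weights=None):
    """Fuse per-model class-probability matrices (each of shape
    [n_samples, n_classes]) by a weighted average and return hard labels."""
    probs = np.stack(prob_list)                      # (n_models, n_samples, n_classes)
    w = np.ones(len(prob_list)) if weights is None else np.asarray(weights, float)
    w = w / w.sum()                                  # normalize so the weights sum to 1
    fused = np.tensordot(w, probs, axes=1)           # weighted sum over the model axis
    return fused.argmax(axis=1)
```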
Multi-objective optimization is critical for problem-solving in engineering, economics, and AI. This study introduces the Multi-Objective Chef-Based Optimization Algorithm (MOCBOA), an upgraded version of the Chef-Based Optimization Algorithm (CBOA) that addresses multiple distinct objectives. Our approach is unique in systematically examining four dominance relations (Pareto, Epsilon, Cone-epsilon, and Strengthened dominance) to evaluate their influence on sustaining solution variety and driving convergence toward the Pareto front. Our comparative investigation, conducted on fifty test problems from the CEC 2021 benchmark and applied to areas such as chemical engineering, mechanical design, and power systems, reveals that the chosen dominance approach has a considerable impact on key optimization measures such as the hypervolume metric. This paper provides a solid foundation for determining the most effective dominance approach and offers significant insights for both theoretical research and practical applications in multi-objective optimization.
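The Pareto relation that the other three dominance variants relax or strengthen reduces to the standard test below, written here for minimization.

```python
def pareto_dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b` (all objectives minimized):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Example: pareto_dominates((1.0, 2.0), (1.5, 2.0)) -> True
```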
This study introduces a novel algorithm, the dung beetle optimization algorithm based on bounded reflection optimization and multi-strategy fusion (BFDBO), designed to tackle the complexities of multi-UAV collaborative trajectory planning in intricate battlefield environments. First, a collaborative planning cost function for the multi-UAV system is formulated, converting the trajectory planning challenge into an optimization problem. Building on the foundational dung beetle optimization (DBO) algorithm, BFDBO incorporates three significant innovations: a boundary reflection mechanism, an adaptive mixed exploration strategy, and a dynamic multi-scale mutation strategy. These enhancements are intended to balance exploration and exploitation, facilitating the discovery of globally optimal trajectories that minimize the cost function. Numerical simulations on the CEC2022 benchmark functions indicate that all three enhancements of BFDBO positively influence its performance, resulting in accelerated convergence and improved optimization accuracy relative to leading optimization algorithms. In two battlefield scenarios of varying complexity, BFDBO achieved at least a 39% reduction in total trajectory planning cost compared to DBO and three other high-performance variants, while also demonstrating superior average runtime. This evidence underscores the effectiveness and applicability of BFDBO in practical, real-world contexts.
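One common reading of a boundary reflection mechanism is to mirror out-of-range coordinates back across the violated bound instead of clamping them; the sketch below shows this generic rule, which may differ from BFDBO's exact formulation.

```python
import numpy as np

def reflect_into_bounds(x, lower, upper):
    """Reflect out-of-bounds coordinates back across the violated bound rather
    than clipping them, so candidates near the edges keep their search momentum.
    Generic rule, not necessarily BFDBO's exact formula."""
    x = np.asarray(x, dtype=float)
    lo = np.broadcast_to(lower, x.shape)
    hi = np.broadcast_to(upper, x.shape)
    x = np.where(x > hi, 2 * hi - x, x)    # reflect across the upper bound
    x = np.where(x < lo, 2 * lo - x, x)    # reflect across the lower bound
    return np.clip(x, lo, hi)              # guard against reflections that overshoot
```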
As vehicular networks grow increasingly complex due to high node mobility and dynamic traffic conditions, efficient clustering mechanisms are vital to ensure stable and scalable communication, and recent studies have emphasized the need for adaptive clustering strategies to improve performance in Intelligent Transportation Systems (ITS). This paper presents the Grasshopper Optimization Algorithm for Vehicular Network Clustering (GOA-VNET), an approach to optimal vehicular clustering in Vehicular Ad-Hoc Networks (VANETs) that leverages the Grasshopper Optimization Algorithm (GOA) to address the critical challenges of traffic congestion and communication inefficiency in ITS. GOA-VNET employs an iterative, interactive optimization mechanism to dynamically adjust node positions and cluster configurations, ensuring robust adaptability to varying vehicular densities and transmission ranges. Key features include the use of attraction-zone, repulsion-zone, and comfort-zone parameters, which collectively enhance clustering efficiency and minimize congestion within Regions of Interest (ROI). By managing cluster configurations and node densities effectively, GOA-VNET ensures balanced load distribution and seamless data transmission, even in scenarios with high vehicular densities and varying transmission ranges. Comparative evaluations against the Whale Optimization Algorithm (WOA) and Grey Wolf Optimization (GWO) show that GOA-VNET consistently outperforms these methods, achieving superior clustering efficiency, reducing the number of clusters by up to 10% in high-density scenarios, and improving data transmission reliability. Simulation results reveal that for transmission ranges of 100-600 m, GOA-VNET achieves an average reduction of 8%-15% in the number of clusters and maintains a 5%-10% improvement in packet delivery ratio (PDR) compared to the baseline algorithms. Additionally, the algorithm incorporates a heat-transfer-inspired load-balancing mechanism that ensures equitable distribution of nodes among cluster leaders (CLs) and maintains a stable network environment. These results validate GOA-VNET as a reliable and scalable solution for VANETs with significant potential to support next-generation ITS; future research could enhance the algorithm further by integrating multi-objective optimization techniques and exploring broader applications in complex traffic scenarios.
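The attraction, repulsion, and comfort zones mentioned above come from the GOA social-interaction function s(r) = f·e^(−r/l) − e^(−r); the sketch below uses the customary GOA defaults f = 0.5 and l = 1.5, which are not stated in the abstract.

```python
import numpy as np

def social_force(distance, f=0.5, l=1.5):
    """GOA social-interaction function s(r) = f*exp(-r/l) - exp(-r).
    Negative values mean repulsion (nodes too close), positive values mean
    attraction, and the zero crossing marks the comfort zone."""
    r = np.asarray(distance, dtype=float)
    return f * np.exp(-r / l) - np.exp(-r)

# With f=0.5 and l=1.5, distances below about 2.08 repel and larger distances attract.
```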
基金supported by the Research Incentive Grant 23200 of Zayed University,United Arab Emirates.
文摘Cardiovascular disease prediction is a significant area of research in healthcare management systems(HMS).We will only be able to reduce the number of deaths if we anticipate cardiac problems in advance.The existing heart disease detection systems using machine learning have not yet produced sufficient results due to the reliance on available data.We present Clustered Butterfly Optimization Techniques(RoughK-means+BOA)as a new hybrid method for predicting heart disease.This method comprises two phases:clustering data using Roughk-means(RKM)and data analysis using the butterfly optimization algorithm(BOA).The benchmark dataset from the UCI repository is used for our experiments.The experiments are divided into three sets:the first set involves the RKM clustering technique,the next set evaluates the classification outcomes,and the last set validates the performance of the proposed hybrid model.The proposed RoughK-means+BOA has achieved a reasonable accuracy of 97.03 and a minimal error rate of 2.97.This result is comparatively better than other combinations of optimization techniques.In addition,this approach effectively enhances data segmentation,optimization,and classification performance.
基金supported by the National Science and Technology Innovation 2030 Next-Generation Artifical Intelligence Major Project(2018AAA0101801)the National Natural Science Foundation of China(72271188)。
文摘With the development of information technology,a large number of product quality data in the entire manufacturing process is accumulated,but it is not explored and used effectively.The traditional product quality prediction models have many disadvantages,such as high complexity and low accuracy.To overcome the above problems,we propose an optimized data equalization method to pre-process dataset and design a simple but effective product quality prediction model:radial basis function model optimized by the firefly algorithm with Levy flight mechanism(RBFFALM).First,the new data equalization method is introduced to pre-process the dataset,which reduces the dimension of the data,removes redundant features,and improves the data distribution.Then the RBFFALFM is used to predict product quality.Comprehensive expe riments conducted on real-world product quality datasets validate that the new model RBFFALFM combining with the new data pre-processing method outperforms other previous me thods on predicting product quality.
基金funding from the Graduate Practice Innovation Program of Jiangsu University of Technology(XSJCX23_58)Changzhou Science and Technology Support Project(CE20235045)Open Project of Jiangsu Key Laboratory of Power Transmission&Distribution Equipment Technology(2021JSSPD12).
文摘Under the partial shading conditions(PSC)of Photovoltaic(PV)modules in a PV hybrid system,the power output curve exhibits multiple peaks.This often causes traditional maximum power point tracking(MPPT)methods to fall into local optima and fail to find the global optimum.To address this issue,a composite MPPT algorithm is proposed.It combines the improved kepler optimization algorithm(IKOA)with the optimized variable-step perturb and observe(OIP&O).The update probabilities,planetary velocity and position step coefficients of IKOA are nonlinearly and adaptively optimized.This adaptation meets the varying needs of the initial and later stages of the iterative process and accelerates convergence.During stochastic exploration,the refined position update formulas enhance diversity and global search capability.The improvements in the algorithmreduces the likelihood of falling into local optima.In the later stages,the OIP&O algorithm decreases oscillation and increases accuracy.compared with cuckoo search(CS)and gray wolf optimization(GWO),simulation tests of the PV hybrid inverter demonstrate that the proposed IKOA-OIP&O algorithm achieves faster convergence and greater stability under static,local and dynamic shading conditions.These results can confirm the feasibility and effectiveness of the proposed PV MPPT algorithm for PV hybrid systems.
基金funded by the National Natural Science Foundation of China(Grant No.51874350)the National Natural Science Foundation of China(Grant No.52304127)+2 种基金the Fundamental Research Funds for the Central Universities of Central South University(Grant No.2020zzts200)the Science Foundation of the Fuzhou University(Grant No.511229)Fuzhou University Testing Fund of Precious Apparatus(Grant No.2024T040).
文摘The denoising of microseismic signals is a prerequisite for subsequent analysis and research.In this research,a new microseismic signal denoising algorithm called the Black Widow Optimization Algorithm(BWOA)optimized VariationalMode Decomposition(VMD)jointWavelet Threshold Denoising(WTD)algorithm(BVW)is proposed.The BVW algorithm integrates VMD and WTD,both of which are optimized by BWOA.Specifically,this algorithm utilizes VMD to decompose the microseismic signal to be denoised into several Band-Limited IntrinsicMode Functions(BLIMFs).Subsequently,these BLIMFs whose correlation coefficients with the microseismic signal to be denoised are higher than a threshold are selected as the effective mode functions,and the effective mode functions are denoised using WTD to filter out the residual low-and intermediate-frequency noise.Finally,the denoised microseismic signal is obtained through reconstruction.The ideal values of VMD parameters and WTD parameters are acquired by searching with BWOA to achieve the best VMD decomposition performance and solve the problem of relying on experience and requiring a large workload in the application of the WTD algorithm.The outcomes of simulated experiments indicate that this algorithm is capable of achieving good denoising performance under noise of different intensities,and the denoising performance is significantly better than the commonly used VMD and Empirical Mode Decomposition(EMD)algorithms.The BVW algorithm is more efficient in filtering noise,the waveform after denoising is smoother,the amplitude of the waveform is the closest to the original signal,and the signal-to-noise ratio(SNR)and the root mean square error after denoising are more satisfying.The case based on Fankou Lead-Zinc Mine shows that for microseismic signals with different intensities of noise monitored on-site,compared with VMD and EMD,the BVW algorithm ismore efficient in filtering noise,and the SNR after denoising is higher.
基金Princess Nourah bint Abdulrahman University Researchers Supporting Project number(PNURSP2024R 343)PrincessNourah bint Abdulrahman University,Riyadh,Saudi ArabiaDeanship of Scientific Research at Northern Border University,Arar,Kingdom of Saudi Arabia,for funding this researchwork through the project number“NBU-FFR-2024-1092-02”.
文摘Phishing attacks present a persistent and evolving threat in the cybersecurity land-scape,necessitating the development of more sophisticated detection methods.Traditional machine learning approaches to phishing detection have relied heavily on feature engineering and have often fallen short in adapting to the dynamically changing patterns of phishingUniformResource Locator(URLs).Addressing these challenge,we introduce a framework that integrates the sequential data processing strengths of a Recurrent Neural Network(RNN)with the hyperparameter optimization prowess of theWhale Optimization Algorithm(WOA).Ourmodel capitalizes on an extensive Kaggle dataset,featuring over 11,000 URLs,each delineated by 30 attributes.The WOA’s hyperparameter optimization enhances the RNN’s performance,evidenced by a meticulous validation process.The results,encapsulated in precision,recall,and F1-score metrics,surpass baseline models,achieving an overall accuracy of 92%.This study not only demonstrates the RNN’s proficiency in learning complex patterns but also underscores the WOA’s effectiveness in refining machine learning models for the critical task of phishing detection.
基金supported by Yunnan Provincial Basic Research Project(202401AT070344,202301AT070443)National Natural Science Foundation of China(62263014,52207105)+1 种基金Yunnan Lancang-Mekong International Electric Power Technology Joint Laboratory(202203AP140001)Major Science and Technology Projects in Yunnan Province(202402AG050006).
文摘Accurate short-term wind power forecast technique plays a crucial role in maintaining the safety and economic efficiency of smart grids.Although numerous studies have employed various methods to forecast wind power,there remains a research gap in leveraging swarm intelligence algorithms to optimize the hyperparameters of the Transformer model for wind power prediction.To improve the accuracy of short-term wind power forecast,this paper proposes a hybrid short-term wind power forecast approach named STL-IAOA-iTransformer,which is based on seasonal and trend decomposition using LOESS(STL)and iTransformer model optimized by improved arithmetic optimization algorithm(IAOA).First,to fully extract the power data features,STL is used to decompose the original data into components with less redundant information.The extracted components as well as the weather data are then input into iTransformer for short-term wind power forecast.The final predicted short-term wind power curve is obtained by combining the predicted components.To improve the model accuracy,IAOA is employed to optimize the hyperparameters of iTransformer.The proposed approach is validated using real-generation data from different seasons and different power stations inNorthwest China,and ablation experiments have been conducted.Furthermore,to validate the superiority of the proposed approach under different wind characteristics,real power generation data fromsouthwestChina are utilized for experiments.Thecomparative results with the other six state-of-the-art prediction models in experiments show that the proposed model well fits the true value of generation series and achieves high prediction accuracy.
基金supported by Science and Technology Innovation Programfor Postgraduate Students in IDP Subsidized by Fundamental Research Funds for the Central Universities(Project No.ZY20240335)support of the Research Project of the Key Technology of Malicious Code Detection Based on Data Mining in APT Attack(Project No.2022IT173)the Research Project of the Big Data Sensitive Information Supervision Technology Based on Convolutional Neural Network(Project No.2022011033).
文摘Previous studies have shown that deep learning is very effective in detecting known attacks.However,when facing unknown attacks,models such as Deep Neural Networks(DNN)combined with Long Short-Term Memory(LSTM),Convolutional Neural Networks(CNN)combined with LSTM,and so on are built by simple stacking,which has the problems of feature loss,low efficiency,and low accuracy.Therefore,this paper proposes an autonomous detectionmodel for Distributed Denial of Service attacks,Multi-Scale Convolutional Neural Network-Bidirectional Gated Recurrent Units-Single Headed Attention(MSCNN-BiGRU-SHA),which is based on a Multistrategy Integrated Zebra Optimization Algorithm(MI-ZOA).The model undergoes training and testing with the CICDDoS2019 dataset,and its performance is evaluated on a new GINKS2023 dataset.The hyperparameters for Conv_filter and GRU_unit are optimized using the Multi-strategy Integrated Zebra Optimization Algorithm(MIZOA).The experimental results show that the test accuracy of the MSCNN-BiGRU-SHA model based on the MIZOA proposed in this paper is as high as 0.9971 in the CICDDoS 2019 dataset.The evaluation accuracy of the new dataset GINKS2023 created in this paper is 0.9386.Compared to the MSCNN-BiGRU-SHA model based on the Zebra Optimization Algorithm(ZOA),the detection accuracy on the GINKS2023 dataset has improved by 5.81%,precisionhas increasedby 1.35%,the recallhas improvedby 9%,and theF1scorehas increasedby 5.55%.Compared to the MSCNN-BiGRU-SHA models developed using Grid Search,Random Search,and Bayesian Optimization,the MSCNN-BiGRU-SHA model optimized with the MI-ZOA exhibits better performance in terms of accuracy,precision,recall,and F1 score.
基金received funding from the Postgraduate Research&Practice Innovation Program of Jiangsu Province(SJCX23_1633)2023 University Student Innovation and Entrepreneurship Training Program(202311463009Z)+1 种基金Changzhou Science and Technology Support Project(CE20235045)Open Project of Jiangsu Key Laboratory of Power Transmission&Distribution Equipment Technology(2021JSSPD12).
文摘Uneven power distribution,transient voltage,and frequency deviations are observed in the photovoltaic storage hybrid inverter during the switching between grid-connected and island modes.In response to these issues,this paper proposes a grid-connected/island switching control strategy for photovoltaic storage hybrid inverters based on the modified chimpanzee optimization algorithm.The proposed strategy incorporates coupling compensation and power differentiation elements based on the traditional droop control.Then,it combines the angular frequency and voltage amplitude adjustments provided by the phase-locked loop-free pre-synchronization control strategy.Precise pre-synchronization is achieved by regulating the virtual current to zero and aligning the photovoltaic storage hybrid inverter with the grid voltage.Additionally,two novel operators,learning and emotional behaviors are introduced to enhance the optimization precision of the chimpanzee algorithm.These operators ensure high-precision and high-reliability optimization of the droop control parameters for photovoltaic storage hybrid inverters.A Simulink model was constructed for simulation analysis,which validated the optimized control strategy’s ability to evenly distribute power under load transients.This strategy effectively mitigated transient voltage and current surges during mode transitions.Consequently,seamless and efficient switching between gridconnected and island modes was achieved for the photovoltaic storage hybrid inverter.The enhanced energy utilization efficiency,in turn,offers robust technical support for grid stability.
文摘Heuristic optimization algorithms have been widely used in solving complex optimization problems in various fields such as engineering,economics,and computer science.These algorithms are designed to find high-quality solutions efficiently by balancing exploration of the search space and exploitation of promising solutions.While heuristic optimization algorithms vary in their specific details,they often exhibit common patterns that are essential to their effectiveness.This paper aims to analyze and explore common patterns in heuristic optimization algorithms.Through a comprehensive review of the literature,we identify the patterns that are commonly observed in these algorithms,including initialization,local search,diversity maintenance,adaptation,and stochasticity.For each pattern,we describe the motivation behind it,its implementation,and its impact on the search process.To demonstrate the utility of our analysis,we identify these patterns in multiple heuristic optimization algorithms.For each case study,we analyze how the patterns are implemented in the algorithm and how they contribute to its performance.Through these case studies,we show how our analysis can be used to understand the behavior of heuristic optimization algorithms and guide the design of new algorithms.Our analysis reveals that patterns in heuristic optimization algorithms are essential to their effectiveness.By understanding and incorporating these patterns into the design of new algorithms,researchers can develop more efficient and effective optimization algorithms.
文摘The uncertain nature of mapping user tasks to Virtual Machines(VMs) causes system failure or execution delay in Cloud Computing.To maximize cloud resource throughput and decrease user response time,load balancing is needed.Possible load balancing is needed to overcome user task execution delay and system failure.Most swarm intelligent dynamic load balancing solutions that used hybrid metaheuristic algorithms failed to balance exploitation and exploration.Most load balancing methods were insufficient to handle the growing uncertainty in job distribution to VMs.Thus,the Hybrid Spotted Hyena and Whale Optimization Algorithm-based Dynamic Load Balancing Mechanism(HSHWOA) partitions traffic among numerous VMs or servers to guarantee user chores are completed quickly.This load balancing approach improved performance by considering average network latency,dependability,and throughput.This hybridization of SHOA and WOA aims to improve the trade-off between exploration and exploitation,assign jobs to VMs with more solution diversity,and prevent the solution from reaching a local optimality.Pysim-based experimental verification and testing for the proposed HSHWOA showed a 12.38% improvement in minimized makespan,16.21% increase in mean throughput,and 14.84% increase in network stability compared to baseline load balancing strategies like Fractional Improved Whale Social Optimization Based VM Migration Strategy FIWSOA,HDWOA,and Binary Bird Swap.
文摘Cloud computing has become an essential technology for the management and processing of large datasets,offering scalability,high availability,and fault tolerance.However,optimizing data replication across multiple data centers poses a significant challenge,especially when balancing opposing goals such as latency,storage costs,energy consumption,and network efficiency.This study introduces a novel Dynamic Optimization Algorithm called Dynamic Multi-Objective Gannet Optimization(DMGO),designed to enhance data replication efficiency in cloud environments.Unlike traditional static replication systems,DMGO adapts dynamically to variations in network conditions,system demand,and resource availability.The approach utilizes multi-objective optimization approaches to efficiently balance data access latency,storage efficiency,and operational costs.DMGO consistently evaluates data center performance and adjusts replication algorithms in real time to guarantee optimal system efficiency.Experimental evaluations conducted in a simulated cloud environment demonstrate that DMGO significantly outperforms conventional static algorithms,achieving faster data access,lower storage overhead,reduced energy consumption,and improved scalability.The proposed methodology offers a robust and adaptable solution for modern cloud systems,ensuring efficient resource consumption while maintaining high performance.
文摘This research presents a novel nature-inspired metaheuristic optimization algorithm,called theNarwhale Optimization Algorithm(NWOA).The algorithm draws inspiration from the foraging and prey-hunting strategies of narwhals,“unicorns of the sea”,particularly the use of their distinctive spiral tusks,which play significant roles in hunting,searching prey,navigation,echolocation,and complex social interaction.Particularly,the NWOA imitates the foraging strategies and techniques of narwhals when hunting for prey but focuses mainly on the cooperative and exploratory behavior shown during group hunting and in the use of their tusks in sensing and locating prey under the Arctic ice.These functions provide a strong assessment basis for investigating the algorithm’s prowess at balancing exploration and exploitation,convergence speed,and solution accuracy.The performance of the NWOA is evaluated on 30 benchmark test functions.A comparison study using the Grey Wolf Optimizer(GWO),Whale Optimization Algorithm(WOA),Perfumer Optimization Algorithm(POA),Candle Flame Optimization(CFO)Algorithm,Particle Swarm Optimization(PSO)Algorithm,and Genetic Algorithm(GA)validates the results.As evidenced in the experimental results,NWOA is capable of yielding competitive outcomes among these well-known optimizers,whereas in several instances.These results suggest thatNWOAhas proven to be an effective and robust optimization tool suitable for solving many different complex optimization problems from the real world.
文摘Quantum computing is a promising technology that has the potential to revolutionize many areas of science and technology,including communication.In this review,we discuss the current state of quantum computing in communication and its potential applications in various areas such as network optimization,signal processing,and machine learning for communication.First,the basic principle of quantum computing,quantum physics systems,and quantum algorithms are analyzed.Then,based on the classification of quantum algorithms,several important basic quantum algorithms,quantum optimization algorithms,and quantum machine learning algorithms are discussed in detail.Finally,the basic ideas and feasibility of introducing quantum algorithms into communications are emphatically analyzed,which provides a reference to address computational bottlenecks in communication networks.
文摘A decentralized network made up of mobile nodes is termed the Mobile Ad-hoc Network(MANET).Mobility and a finite battery lifespan are the two main problems with MANETs.Advanced methods are essential for enhancing MANET security,network longevity,and energy efficiency.Hence,selecting an appropriate cluster.The cluster’s head further boosts the network’s energy effectiveness.As a result,a Hybrid Swallow Swarm Optimisation-Memetic Algorithm(SSO-MA)is suggested to develop the energy efficiency&of the MANET network.Then,to secure the network Abnormality Detection System(ADS)is proposed.The MATLAB-2021a platform is used to implement the suggested technique and conduct the analysis.In terms of network performance,the suggested model outperforms the current Genetic Algorithm,Optimised Link State Routing protocol,and Particle Swarm Optimisation techniques.The performance of the model has a minimum delay in the range of 0.82 seconds and a Packet Delivery Ratio(PDR)of 99.82%.Hence,the validation shows that the Hybrid SSO-MA strategy is superior to the other approaches in terms of efficiency.
文摘Software defect prediction(SDP)aims to find a reliable method to predict defects in specific software projects and help software engineers allocate limited resources to release high-quality software products.Software defect prediction can be effectively performed using traditional features,but there are some redundant or irrelevant features in them(the presence or absence of this feature has little effect on the prediction results).These problems can be solved using feature selection.However,existing feature selection methods have shortcomings such as insignificant dimensionality reduction effect and low classification accuracy of the selected optimal feature subset.In order to reduce the impact of these shortcomings,this paper proposes a new feature selection method Cubic TraverseMa Beluga whale optimization algorithm(CTMBWO)based on the improved Beluga whale optimization algorithm(BWO).The goal of this study is to determine how well the CTMBWO can extract the features that are most important for correctly predicting software defects,improve the accuracy of fault prediction,reduce the number of the selected feature and mitigate the risk of overfitting,thereby achieving more efficient resource utilization and better distribution of test workload.The CTMBWO comprises three main stages:preprocessing the dataset,selecting relevant features,and evaluating the classification performance of the model.The novel feature selection method can effectively improve the performance of SDP.This study performs experiments on two software defect datasets(PROMISE,NASA)and shows the method’s classification performance using four detailed evaluation metrics,Accuracy,F1-score,MCC,AUC and Recall.The results indicate that the approach presented in this paper achieves outstanding classification performance on both datasets and has significant improvement over the baseline models.
Funding: Supported by the Shaanxi Province Natural Science Basic Research Program Project (2024JC-YBMS-572); partially funded by the Yan'an University Graduate Education Innovation Program Project (YCX2023032, YCX2023033, YCX2024094, YCX2024097) and the "14th Five Year Plan Medium and Long Term Major Scientific Research Project" (2021ZCQ015) of Yan'an University.
Abstract: Aiming to address the limitations of the standard Chimp Optimization Algorithm (ChOA), such as inadequate search ability and susceptibility to local optima in Unmanned Aerial Vehicle (UAV) path planning, this paper proposes a three-dimensional path planning method for UAVs based on the Improved Chimp Optimization Algorithm (IChOA). First, the paper models the terrain and obstacle environments spatially and formulates the total UAV flight cost function according to the constraints, transforming path planning into an optimization problem with multiple constraints. Second, it enhances the diversity of the chimpanzee population by applying a Sine chaos mapping strategy and introduces a nonlinear convergence factor to improve the algorithm's search accuracy and convergence speed. Finally, it proposes a dynamic adjustment strategy for the number of chimpanzee advance echelons, which effectively balances global exploration and local exploitation and significantly improves the algorithm's search performance. To validate the effectiveness of the IChOA, experimental comparisons are conducted with eight different intelligent algorithms. The results demonstrate that the IChOA outperforms the selected comparison algorithms in terms of practicality and robustness in UAV 3D path planning, finding short paths efficiently while maintaining high stability during execution.
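Two of the ingredients named above, sine chaos initialization and a nonlinear convergence factor, can be sketched in isolation as follows. The map coefficient, the cosine decay form, the bounds, and the parameter values are assumptions for illustration, not the exact formulas of the IChOA.

```python
import numpy as np

# Sine chaotic map used to spread the initial population more evenly over the
# search space; the coefficient, seed, and bounds are illustrative choices.
def sine_chaos_init(pop_size, dim, lower, upper, a_coef=4.0, seed=0.7):
    x = np.empty((pop_size, dim))
    val = seed
    for i in range(pop_size):
        for j in range(dim):
            val = a_coef / 4.0 * np.sin(np.pi * val)  # sine map stays in (0, 1)
            x[i, j] = lower + val * (upper - lower)
    return x

# Nonlinear convergence factor: decays slowly at first (exploration) and
# faster later (exploitation); the cosine form is one common choice.
def convergence_factor(t, t_max, f_init=2.5, f_final=0.0):
    return f_final + (f_init - f_final) * 0.5 * (1 + np.cos(np.pi * t / t_max))

pop = sine_chaos_init(pop_size=30, dim=3, lower=0.0, upper=100.0)
print("initial waypoint sample:", np.round(pop[0], 2))
print("f at start/mid/end:",
      [round(convergence_factor(t, 200), 3) for t in (0, 100, 200)])
```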
Funding: Funded by the Deanship of Graduate Studies and Scientific Research at Jouf University under grant No. (DGSSR-2023-02-02341).
Abstract: The potential applications of multimodal physiological signals in healthcare, pain monitoring, and clinical decision support systems have garnered significant attention in biomedical research. Conventional pain assessment rests on subjective self-reporting, which may be unreliable, and deep learning is a promising alternative for overcoming this limitation through automated pain classification. This paper proposes an ensemble deep-learning framework for pain assessment. The framework uses features extracted from electromyography (EMG), skin conductance level (SCL), and electrocardiography (ECG) signals. We integrate Convolutional Neural Network (CNN), Long Short-Term Memory (LSTM), Bidirectional Gated Recurrent Unit (BiGRU), and Deep Neural Network (DNN) models and aggregate their predictions using a weighted averaging ensemble technique to increase the robustness of the classification. To improve computational efficiency and remove redundant features, we use Particle Swarm Optimization (PSO) for feature selection, which reduces the dimensionality of the features without sacrificing classification accuracy. With improved accuracy, precision, recall, and F1-score across all pain levels, the experimental results show that the proposed ensemble model performs better than the individual deep learning classifiers. In our experiments, the model achieved over 98% accuracy, suggesting promising automated pain assessment performance; however, due to differences in validation protocols, comparisons with previous studies remain limited. Combining deep learning with feature selection significantly improves model generalization, reducing overfitting and enhancing classification performance. The evaluation was conducted on the BioVid Heat Pain Dataset, confirming the model's effectiveness in distinguishing between different pain intensity levels.
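The weighted averaging ensemble step can be illustrated independently of the underlying networks: given per-model class-probability outputs, the blended prediction is their weighted mean. The random probabilities and the specific weights below are placeholders, not values from the study.

```python
import numpy as np

# Hypothetical per-model class-probability outputs for a batch of signal windows;
# in the paper these would come from the CNN, LSTM, BiGRU, and DNN branches.
rng = np.random.default_rng(2)
n_samples, n_classes = 5, 3
model_probs = {name: rng.dirichlet(np.ones(n_classes), size=n_samples)
               for name in ("cnn", "lstm", "bigru", "dnn")}

# Weights would normally be tuned on a validation split; these are placeholders.
weights = {"cnn": 0.3, "lstm": 0.3, "bigru": 0.25, "dnn": 0.15}

def weighted_average_ensemble(probs_by_model, weights):
    """Combine per-model class probabilities into one prediction per sample."""
    total = sum(weights.values())
    blended = sum(weights[m] * p for m, p in probs_by_model.items()) / total
    return blended.argmax(axis=1), blended

labels, blended = weighted_average_ensemble(model_probs, weights)
print("ensemble pain-level predictions:", labels.tolist())
```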
Funding: Funded by the Researchers Supporting Program, number (RSPD2024R809), King Saud University, Riyadh, Saudi Arabia.
Abstract: Multi-objective optimization is critical for problem-solving in engineering, economics, and AI. This study introduces the Multi-Objective Chef-Based Optimization Algorithm (MOCBOA), an upgraded version of the Chef-Based Optimization Algorithm (CBOA) that addresses distinct objectives. Our approach is unique in systematically examining four dominance relations (Pareto, Epsilon, Cone-epsilon, and Strengthened dominance) to evaluate their influence on sustaining solution variety and driving convergence toward the Pareto front. Our comparative investigation, conducted on fifty test problems from the CEC 2021 benchmark and applied to areas such as chemical engineering, mechanical design, and power systems, reveals that the dominance approach used has a considerable impact on key optimization measures such as the hypervolume metric. This paper provides a solid foundation for determining the most effective dominance approach and significant insights for both theoretical research and practical applications in multi-objective optimization.
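For readers unfamiliar with the dominance relations compared above, the snippet below shows minimization-form Pareto dominance and one common additive form of epsilon dominance; the epsilon value and the sample objective vectors are illustrative, and the Cone-epsilon and Strengthened variants are not shown.

```python
import numpy as np

def pareto_dominates(f_a, f_b):
    """a dominates b (minimization): no worse in every objective, better in at least one."""
    f_a, f_b = np.asarray(f_a), np.asarray(f_b)
    return bool(np.all(f_a <= f_b) and np.any(f_a < f_b))

def epsilon_dominates(f_a, f_b, eps=0.05):
    """Relaxed additive form: a dominates b if it is within eps of b in every objective."""
    f_a, f_b = np.asarray(f_a), np.asarray(f_b)
    return bool(np.all(f_a - eps <= f_b) and np.any(f_a - eps < f_b))

a, b = [0.20, 0.31], [0.25, 0.30]
print("Pareto:", pareto_dominates(a, b))    # False: b is better on the second objective
print("epsilon:", epsilon_dominates(a, b))  # True under the 0.05 tolerance
```

Relaxed relations such as epsilon dominance accept more solutions as "non-dominated", which tends to preserve diversity in the archive at the cost of some convergence precision; that trade-off is exactly what the hypervolume comparison in the study probes.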
Funding: Funded by the National Defense Science and Technology Innovation Project, grant number ZZKY20223103; the Basic Frontier Innovation Project at the Engineering University of PAP, grant numbers WJY202429 and WJY202408; the Graduate Student Funding Priority Project, grant number JYWJ2024B006; and a Key Project of the National Social Science Foundation, grant number 2023-SKJJ-A-116.
Abstract: This study introduces a novel algorithm, the dung beetle optimization algorithm based on bounded reflection optimization and multi-strategy fusion (BFDBO), designed to tackle the complexities of multi-UAV collaborative trajectory planning in intricate battlefield environments. Initially, a collaborative planning cost function for the multi-UAV system is formulated, converting the trajectory planning challenge into an optimization problem. Building on the foundational dung beetle optimization (DBO) algorithm, BFDBO incorporates three significant innovations: a boundary reflection mechanism, an adaptive mixed exploration strategy, and a dynamic multi-scale mutation strategy. These enhancements are intended to balance global exploration and local exploitation, facilitating the discovery of globally optimal trajectories that minimize the cost function. Numerical simulations on the CEC2022 benchmark functions indicate that all three enhancements of BFDBO positively influence its performance, resulting in accelerated convergence and improved optimization accuracy relative to leading optimization algorithms. In two battlefield scenarios of varying complexity, BFDBO achieved at least a 39% reduction in total trajectory planning cost compared to DBO and three other high-performance variants, while also demonstrating superior average runtime. This evidence underscores the effectiveness and applicability of BFDBO in practical, real-world contexts.
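One plausible form of a boundary reflection mechanism, bouncing out-of-range trajectory coordinates back inside the search box instead of clamping them to the edge, is sketched below; the exact rule used in BFDBO may differ, and the bounds here are arbitrary.

```python
import numpy as np

def reflect_into_bounds(position, lower, upper):
    """Bounce a candidate back inside [lower, upper] instead of clamping it,
    which keeps search points from piling up on the box edges."""
    pos = np.asarray(position, dtype=float)
    over = pos > upper
    under = pos < lower
    pos[over] = 2 * upper - pos[over]
    pos[under] = 2 * lower - pos[under]
    # a final clip guards against reflections that overshoot the opposite bound
    return np.clip(pos, lower, upper)

print(reflect_into_bounds([120.0, -15.0, 40.0], lower=0.0, upper=100.0))
# -> [80. 15. 40.]
```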
Funding: Supported by an Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. RS-2024-00337489, Development of Data Drift Management Technology to Overcome Performance Degradation of AI Analysis Models).
Abstract: As vehicular networks grow increasingly complex due to high node mobility and dynamic traffic conditions, efficient clustering mechanisms are vital for stable and scalable communication. Recent studies have emphasized the need for adaptive clustering strategies to improve performance in Intelligent Transportation Systems (ITS). This paper presents the Grasshopper Optimization Algorithm for Vehicular Network Clustering (GOA-VNET), an innovative approach to optimal vehicular clustering in Vehicular Ad-Hoc Networks (VANETs) that leverages the Grasshopper Optimization Algorithm (GOA) to address the critical challenges of traffic congestion and communication inefficiency in ITS. GOA-VNET employs an iterative and interactive optimization mechanism to dynamically adjust node positions and cluster configurations, ensuring robust adaptability to varying vehicular densities and transmission ranges. Key features of GOA-VNET include the use of attraction-zone, repulsion-zone, and comfort-zone parameters, which collectively enhance clustering efficiency and minimize congestion within Regions of Interest (ROI). By managing cluster configurations and node densities effectively, GOA-VNET ensures balanced load distribution and seamless data transmission, even in scenarios with high vehicular density and varying transmission ranges. Comparative evaluations against the Whale Optimization Algorithm (WOA) and Grey Wolf Optimization (GWO) demonstrate that GOA-VNET consistently outperforms these methods, achieving superior clustering efficiency, reducing the number of clusters by up to 10% in high-density scenarios, and improving data transmission reliability. Simulation results reveal that for transmission ranges of 100-600 m, GOA-VNET achieves an average reduction of 8%-15% in the number of clusters and a 5%-10% improvement in packet delivery ratio (PDR) compared to the baseline algorithms. Additionally, the algorithm incorporates a heat-transfer-inspired load-balancing mechanism, ensuring equitable distribution of nodes among cluster leaders (CLs) and maintaining a stable network environment. These results validate GOA-VNET as a reliable and scalable solution for VANETs, with significant potential to support next-generation ITS. Future research could further enhance the algorithm by integrating multi-objective optimization techniques and exploring broader applications in complex traffic scenarios.
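The attraction, repulsion, and comfort zones mentioned above come from the social-interaction term of the underlying GOA, s(r) = f*exp(-r/l) - exp(-r). The snippet below evaluates it with the commonly used parameters f = 0.5 and l = 1.5 to show where each zone lies; the sampled distances and the zero-crossing threshold are chosen for illustration.

```python
import numpy as np

def social_force(distance, f=0.5, l=1.5):
    """Grasshopper social-interaction term s(r) = f*exp(-r/l) - exp(-r).
    Negative values repel, values near zero mark the comfort distance, positive values attract."""
    return f * np.exp(-distance / l) - np.exp(-distance)

# Sample one distance from each of the three zones the clustering scheme reasons about.
for r in (0.5, 2.079, 4.0):
    s = social_force(r)
    zone = "repulsion" if s < -1e-3 else "attraction" if s > 1e-3 else "comfort"
    print(f"distance {r:>5}: s = {s:+.4f} ({zone})")
```

Nearby vehicles are pushed apart (repulsion), distant ones are pulled together (attraction), and nodes settle around the comfort distance, which is the behaviour GOA-VNET exploits to form balanced clusters.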