Precisely estimating the state of health (SOH) of lithium-ion batteries is essential for battery management systems (BMS), as it plays a key role in ensuring the safe and reliable operation of battery systems. However, current SOH estimation methods often overlook the valuable temperature information that can effectively characterize battery aging during capacity degradation. Additionally, the Elman neural network, which is commonly employed for SOH estimation, exhibits several drawbacks, including slow training speed, a tendency to become trapped in local minima, and the initialization of weights and thresholds with pseudo-random numbers, leading to unstable model performance. To address these issues, this study proposes a method for estimating the SOH of lithium-ion batteries based on differential thermal voltammetry (DTV) and an SSA-Elman neural network. First, two health features (HFs) that account for temperature and battery voltage are extracted from the differential thermal voltammetry curves and incremental capacity curves. Next, the Sparrow Search Algorithm (SSA) is employed to optimize the initial weights and thresholds of the Elman neural network, forming the SSA-Elman neural network model. To validate the performance, various neural networks, including the proposed SSA-Elman network, are tested on the Oxford battery aging dataset. The experimental results demonstrate that the proposed method achieves superior accuracy and robustness, with a mean absolute error (MAE) of less than 0.9% and a root mean square error (RMSE) below 1.4%.
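To illustrate the kind of health feature referred to above, the sketch below extracts the height and position of an incremental capacity (dQ/dV) peak from a synthetic charge curve. The synthetic data, the smoothing window, and the choice of peak height and peak voltage as features are assumptions for illustration only, not the paper's exact HF definitions.

# Minimal sketch: extract an incremental-capacity (dQ/dV) peak as a health feature.
# The synthetic voltage/capacity curve and the peak-based feature definition are
# illustrative assumptions, not the HFs defined in the paper.
import numpy as np

voltage = np.linspace(3.0, 4.2, 500)                       # charge voltage (V)
capacity = 2.0 / (1.0 + np.exp(-(voltage - 3.7) / 0.05))   # synthetic capacity curve (Ah)

dq_dv = np.gradient(capacity, voltage)                     # incremental capacity curve
dq_dv_smooth = np.convolve(dq_dv, np.ones(9) / 9, mode="same")  # simple moving-average filter

peak_idx = int(np.argmax(dq_dv_smooth))
hf_peak_height = dq_dv_smooth[peak_idx]                    # candidate health feature 1
hf_peak_voltage = voltage[peak_idx]                        # candidate health feature 2
print(f"IC peak: {hf_peak_height:.3f} Ah/V at {hf_peak_voltage:.3f} V")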
Accurate short-term wind power forecasting plays a crucial role in maintaining the safety and economic efficiency of smart grids. Although numerous studies have employed various methods to forecast wind power, there remains a research gap in leveraging swarm intelligence algorithms to optimize the hyperparameters of the Transformer model for wind power prediction. To improve the accuracy of short-term wind power forecasts, this paper proposes a hybrid forecasting approach named STL-IAOA-iTransformer, which is based on seasonal and trend decomposition using LOESS (STL) and an iTransformer model optimized by an improved arithmetic optimization algorithm (IAOA). First, to fully extract the features of the power data, STL is used to decompose the original data into components with less redundant information. The extracted components, together with the weather data, are then input into iTransformer for short-term wind power forecasting. The final predicted wind power curve is obtained by combining the predicted components. To improve model accuracy, IAOA is employed to optimize the hyperparameters of iTransformer. The proposed approach is validated using real generation data from different seasons and different power stations in Northwest China, and ablation experiments are conducted. Furthermore, to validate the superiority of the proposed approach under different wind characteristics, real power generation data from Southwest China are also used. Comparative results against six other state-of-the-art prediction models show that the proposed model fits the true generation series well and achieves high prediction accuracy.
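The decompose-then-forecast pattern described above can be sketched with statsmodels' STL on a synthetic hourly series. The naive per-component forecasts below merely stand in for the IAOA-tuned iTransformer, and the 24-hour period (hourly data with a daily cycle) is an assumption.

# Sketch of the STL-decompose-then-forecast pattern on a synthetic hourly wind-power series.
# The per-component forecasts are naive placeholders for the IAOA-optimized iTransformer,
# and period=24 is an assumption.
import numpy as np
from statsmodels.tsa.seasonal import STL

rng = np.random.default_rng(0)
t = np.arange(24 * 60)
power = 50 + 10 * np.sin(2 * np.pi * t / 24) + 0.02 * t + rng.normal(0, 2, t.size)

res = STL(power, period=24).fit()             # trend + seasonal + residual components

horizon = 24
trend_fc = np.full(horizon, res.trend[-1])    # persistence forecast of the trend
seasonal_fc = np.asarray(res.seasonal[-24:])  # repeat the last daily seasonal cycle
resid_fc = np.zeros(horizon)                  # residual assumed zero-mean

forecast = trend_fc + seasonal_fc + resid_fc  # recombine the component forecasts
print(forecast[:5])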
Previous studies have shown that deep learning is very effective in detecting known attacks. However, when facing unknown attacks, models such as Deep Neural Networks (DNN) combined with Long Short-Term Memory (LSTM), or Convolutional Neural Networks (CNN) combined with LSTM, are built by simple stacking, which leads to feature loss, low efficiency, and low accuracy. Therefore, this paper proposes an autonomous detection model for Distributed Denial of Service attacks, the Multi-Scale Convolutional Neural Network-Bidirectional Gated Recurrent Units-Single Headed Attention (MSCNN-BiGRU-SHA) model, which is based on a Multi-strategy Integrated Zebra Optimization Algorithm (MI-ZOA). The model is trained and tested on the CICDDoS2019 dataset, and its performance is further evaluated on a new GINKS2023 dataset. The hyperparameters for Conv_filter and GRU_unit are optimized using the MI-ZOA. The experimental results show that the test accuracy of the MSCNN-BiGRU-SHA model based on the MI-ZOA reaches 0.9971 on the CICDDoS2019 dataset. The evaluation accuracy on the new GINKS2023 dataset created in this paper is 0.9386. Compared to the MSCNN-BiGRU-SHA model based on the standard Zebra Optimization Algorithm (ZOA), the detection accuracy on the GINKS2023 dataset improves by 5.81%, precision increases by 1.35%, recall improves by 9%, and the F1 score increases by 5.55%. Compared to MSCNN-BiGRU-SHA models tuned with Grid Search, Random Search, and Bayesian Optimization, the model optimized with the MI-ZOA exhibits better performance in terms of accuracy, precision, recall, and F1 score.
Open caissons are widely used in foundation engineering because of their load-bearing efficiency and adaptability to diverse soil conditions. However, accurately predicting their undrained bearing capacity in layered soils remains a complex challenge. This study presents a novel application of five ensemble machine learning (ML) algorithms - random forest (RF), gradient boosting machine (GBM), extreme gradient boosting (XGBoost), adaptive boosting (AdaBoost), and categorical boosting (CatBoost) - to predict the undrained bearing capacity factor (Nc) of circular open caissons embedded in two-layered clay on the basis of results from finite element limit analysis (FELA). The input dataset consists of 1188 numerical simulations using the Tresca failure criterion, with varying geometrical and soil parameters. The FELA was performed via OptumG2 software with adaptive meshing techniques and verified against existing benchmark studies. The ML models were trained on 70% of the dataset and tested on the remaining 30%. Their performance was evaluated using six statistical metrics: coefficient of determination (R²), mean absolute error (MAE), root mean squared error (RMSE), index of scatter (IOS), RMSE-to-standard deviation ratio (RSR), and variance explained factor (VAF). The results indicate that all the models achieved high accuracy, with R² values exceeding 97.6% and RMSE values below 0.02. Among them, AdaBoost and CatBoost consistently outperformed the other methods across both the training and testing datasets, demonstrating superior generalizability and robustness. The proposed ML framework offers an efficient, accurate, and data-driven alternative to traditional methods for estimating caisson capacity in stratified soils. This approach can help reduce computational costs while improving reliability in the early stages of foundation design.
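The train/test/metric workflow described above can be sketched with a single boosting regressor on synthetic stand-in data; the feature names, the target function, and the use of scikit-learn's GradientBoostingRegressor in place of the five ensembles are illustrative assumptions, not the FELA dataset or the paper's models.

# Sketch of the ensemble-regression workflow: a boosting model maps geometry/soil
# ratios to a bearing-capacity-like target on synthetic data (assumptions throughout).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_absolute_error, mean_squared_error

rng = np.random.default_rng(1)
n = 1188
X = rng.uniform([0.5, 0.25, 0.2], [3.0, 4.0, 1.0], size=(n, 3))  # e.g. depth ratio, strength ratio, roughness
y = 6.0 + 1.5 * X[:, 0] + 0.8 * np.log(X[:, 1]) + 0.5 * X[:, 2] + rng.normal(0, 0.05, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3)
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
print(f"R2={r2_score(y_te, pred):.4f}  MAE={mean_absolute_error(y_te, pred):.4f}  "
      f"RMSE={np.sqrt(mean_squared_error(y_te, pred)):.4f}")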
To improve the efficiency and accuracy of path planning for fan inspection tasks in thermal power plants, this paper proposes an intelligent inspection robot path planning scheme based on an improved A* algorithm. The inspection robot utilizes multiple sensors to monitor key parameters of the fans, such as vibration, noise, and bearing temperature, and uploads the data to the monitoring center. The robot's inspection path employs the improved A* algorithm, incorporating obstacle penalty terms, path reconstruction, and smoothing optimization techniques, thereby achieving optimal path planning for the inspection robot in complex environments. Simulation results demonstrate that the improved A* algorithm significantly outperforms the traditional A* algorithm in terms of total path distance, smoothness, and detour rate, effectively improving the execution efficiency of inspection tasks.
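A minimal grid-based A* with an obstacle-proximity penalty added to the step cost illustrates the "obstacle penalty term" idea mentioned above. The grid, the penalty weight, and the 8-connected moves are assumptions, and the path reconstruction and smoothing stages are omitted.

# Minimal A* on a grid with an obstacle-proximity penalty added to the step cost.
import heapq
import math

GRID = [
    "..........",
    "..####....",
    "..#.......",
    "..#..###..",
    "..........",
]
ROWS, COLS = len(GRID), len(GRID[0])
MOVES = [(-1, 0), (1, 0), (0, -1), (0, 1), (-1, -1), (-1, 1), (1, -1), (1, 1)]

def blocked(r, c):
    return not (0 <= r < ROWS and 0 <= c < COLS) or GRID[r][c] == "#"

def penalty(r, c, weight=2.0):
    # Penalize cells adjacent to an obstacle or the map boundary, pushing the path away from walls.
    return weight if any(blocked(r + dr, c + dc) for dr, dc in MOVES) else 0.0

def astar(start, goal):
    h = lambda p: math.hypot(p[0] - goal[0], p[1] - goal[1])   # Euclidean heuristic
    open_heap = [(h(start), 0.0, start)]
    parent, g_cost, closed = {start: None}, {start: 0.0}, set()
    while open_heap:
        _, g, node = heapq.heappop(open_heap)
        if node in closed:
            continue
        closed.add(node)
        if node == goal:                      # rebuild the path from the parent links
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for dr, dc in MOVES:
            nxt = (node[0] + dr, node[1] + dc)
            if blocked(*nxt) or nxt in closed:
                continue
            ng = g + math.hypot(dr, dc) + penalty(*nxt)
            if ng < g_cost.get(nxt, float("inf")):
                g_cost[nxt], parent[nxt] = ng, node
                heapq.heappush(open_heap, (ng + h(nxt), ng, nxt))
    return None

print(astar((0, 0), (4, 9)))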
BACKGROUND: Esophageal squamous cell carcinoma is a major histological subtype of esophageal cancer. Many molecular genetic changes are associated with its occurrence. Raman spectroscopy has become a new method for the early diagnosis of tumors because it can reflect the structures of substances and their changes at the molecular level. AIM: To detect alterations in Raman spectral information across different stages of esophageal neoplasia. METHODS: Different grades of esophageal lesions were collected, and a total of 360 groups of Raman spectrum data were acquired. A 1D-transformer network model was proposed to classify the spectral data of esophageal squamous cell carcinoma. In addition, a deep learning model was applied to visualize the Raman spectral data and interpret their molecular characteristics. RESULTS: A comparison of Raman spectral data with different pathological grades and a visual analysis revealed that the Raman peaks with significant differences were concentrated mainly at 1095 cm⁻¹ (DNA, symmetric PO, and stretching vibration), 1132 cm⁻¹ (cytochrome c), 1171 cm⁻¹ (acetoacetate), 1216 cm⁻¹ (amide III), and 1315 cm⁻¹ (glycerol). A comparison of the training results of different models revealed that the 1D-transformer network performed best, achieving 93.30% accuracy, 96.65% specificity, 93.30% sensitivity, and a 93.17% F1 score. CONCLUSION: Raman spectroscopy revealed significantly different waveforms for the different stages of esophageal neoplasia. The combination of Raman spectroscopy and deep learning methods can significantly improve classification accuracy.
In order to improve the efficiency of cloud-based web services, an improved plant growth simulation algorithm scheduling model is proposed. This model first uses mathematical methods to describe the relationships between cloud-based web services and the constraints of system resources. Then, a light-induced plant growth simulation algorithm is established. The performance of the algorithm was compared across several plant types, and the best plant model was selected as the setting for the system. Experimental results show that when the number of test cloud-based web services reaches 2048, the model is 2.14 times faster than PSO, 2.8 times faster than the ant colony algorithm, 2.9 times faster than the bee colony algorithm, and a remarkable 8.38 times faster than the genetic algorithm.
The problem of collision avoidance for non-cooperative targets has received significant attention from researchers in recent years. Non-cooperative targets exhibit uncertain states and unpredictable behaviors, making collision avoidance significantly more challenging than that for space debris. Much existing research focuses on the continuous thrust model, whereas the impulsive maneuver model is more appropriate for long-duration and long-distance avoidance missions. Additionally, it is important to minimize the impact on the original mission while avoiding non-cooperative targets. On the other hand, existing avoidance algorithms are computationally complex and time-consuming, especially given the limited computing capability of the on-board computer, posing challenges for practical engineering applications. To overcome these difficulties, this paper makes the following key contributions: (A) a turn-based (sequential decision-making) limited-area impulsive collision avoidance model that considers the time delay of precision orbit determination is established for the first time; (B) a novel Selection Probability Learning Adaptive Search-depth Search Tree (SPL-ASST) algorithm is proposed for non-cooperative target avoidance, which improves decision-making efficiency by introducing an adaptive-search-depth mechanism and a neural network into the traditional Monte Carlo Tree Search (MCTS). Numerical simulations confirm the effectiveness and efficiency of the proposed method.
An improved cross-correlation algorithm for the strain demodulation of Vernier-effect-based optical fiber sensors (VE-OFS) is proposed in this article. The algorithm identifies the spectrum most similar to the measured one from a database of collected spectra by employing a cross-correlation operation, and subsequently derives the predicted value via a weighted calculation. Because the algorithm uses the complete information in the measured raw spectrum, more accurate results and a larger measurement range can be obtained. Additionally, the improved cross-correlation algorithm has the potential to increase the measurement speed compared with current standards, since the spectra can be collected at a low sampling rate. This work presents an important algorithm towards a simpler, faster way to improve the demodulation performance of VE-OFS.
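The database-matching idea can be sketched as follows: a measured spectrum is correlated against a library of reference spectra with known strain labels, and a weighted estimate is formed from the best matches. The synthetic Gaussian-envelope spectra, the top-3 choice, and the score-proportional weighting are illustrative assumptions, not the paper's calibration data or exact weighting scheme.

# Sketch: match a measured spectrum against a database by normalized cross-correlation,
# then form a weighted strain estimate from the most similar references.
import numpy as np

wavelength = np.linspace(1540, 1560, 2000)                  # nm

def envelope(center):
    # Simplified spectral envelope whose center wavelength shifts with strain.
    return np.exp(-((wavelength - center) / 2.5) ** 2)

strains = np.linspace(0, 1000, 101)                         # microstrain labels of the database
database = np.array([envelope(1545 + 0.01 * s) for s in strains])
measured = envelope(1545 + 0.01 * 437.0) + np.random.default_rng(2).normal(0, 0.02, wavelength.size)

def ncc(a, b):
    # Normalized cross-correlation at zero lag between two spectra.
    a, b = a - a.mean(), b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

scores = np.array([ncc(measured, ref) for ref in database])
top = np.argsort(scores)[-3:]                               # three most similar references
weights = scores[top] / scores[top].sum()
print(f"estimated strain: {float(weights @ strains[top]):.1f} microstrain")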
Uneven power distribution, transient voltage, and frequency deviations are observed in the photovoltaic storage hybrid inverter during switching between grid-connected and island modes. In response to these issues, this paper proposes a grid-connected/island switching control strategy for photovoltaic storage hybrid inverters based on a modified chimpanzee optimization algorithm. The proposed strategy incorporates coupling compensation and power differentiation elements into the traditional droop control. It then combines the angular frequency and voltage amplitude adjustments provided by a phase-locked-loop-free pre-synchronization control strategy. Precise pre-synchronization is achieved by regulating the virtual current to zero and aligning the photovoltaic storage hybrid inverter with the grid voltage. Additionally, two novel operators, learning and emotional behaviors, are introduced to enhance the optimization precision of the chimpanzee algorithm. These operators ensure high-precision and high-reliability optimization of the droop control parameters for photovoltaic storage hybrid inverters. A Simulink model was constructed for simulation analysis, which validated the optimized control strategy's ability to distribute power evenly under load transients. The strategy effectively mitigated transient voltage and current surges during mode transitions. Consequently, seamless and efficient switching between grid-connected and island modes was achieved for the photovoltaic storage hybrid inverter. The enhanced energy utilization efficiency, in turn, offers robust technical support for grid stability.
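For reference, the conventional P-f / Q-V droop relations that the strategy above builds on can be written as a short function. The coupling compensation, power differentiation, and pre-synchronization terms are not reproduced here, and the gains and setpoints are illustrative assumptions.

# Conventional droop relations: active power shifts frequency, reactive power shifts
# voltage amplitude. Gains and setpoints are illustrative assumptions only.
def droop(p_kw, q_kvar, f_nom=50.0, v_nom=311.0,
          p_ref=10.0, q_ref=0.0, m=0.01, n=0.05):
    """Return (frequency reference in Hz, voltage amplitude reference in V)."""
    f_ref = f_nom - m * (p_kw - p_ref)     # P-f droop
    v_ref = v_nom - n * (q_kvar - q_ref)   # Q-V droop
    return f_ref, v_ref

print(droop(p_kw=12.0, q_kvar=3.0))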
Low Earth orbit (LEO) satellite networks exhibit distinct characteristics, e.g., limited resources at individual satellite nodes and a dynamic network topology, which bring many challenges for routing algorithms. To satisfy the quality of service (QoS) requirements of various users, it is critical to research efficient routing strategies that fully utilize satellite resources. This paper proposes a multi-QoS information optimized routing algorithm based on reinforcement learning for LEO satellite networks, which guarantees that services with high-level assurance demands are prioritized under limited satellite resources, while considering the load-balancing performance of the satellite network for services with low-level assurance demands to ensure the full and effective utilization of satellite resources. An auxiliary path search algorithm is proposed to accelerate the convergence of the satellite routing algorithm. Simulation results show that the generated routing strategy can promptly process and fully meet the QoS demands of high-assurance services while effectively improving the load-balancing performance of the links.
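A tabular Q-learning agent on a tiny static topology sketches how reinforcement learning can learn next-hop choices. The 6-node graph, the hop-based reward, and the single objective are illustrative assumptions, far simpler than the multi-QoS, dynamic-topology scheme described above.

# Tabular Q-learning for next-hop routing on a small static graph (illustrative only).
import random

random.seed(0)
GRAPH = {0: [1, 2], 1: [0, 3], 2: [0, 3, 4], 3: [1, 2, 5], 4: [2, 5], 5: [3, 4]}
SRC, DST = 0, 5
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.2

Q = {(n, nb): 0.0 for n, nbs in GRAPH.items() for nb in nbs}

for _ in range(2000):                        # training episodes
    node = SRC
    for _ in range(20):                      # cap episode length
        nbs = GRAPH[node]
        if random.random() < EPS:
            nxt = random.choice(nbs)         # explore
        else:
            nxt = max(nbs, key=lambda a: Q[(node, a)])     # exploit
        reward = 10.0 if nxt == DST else -1.0              # reach goal vs. per-hop cost
        best_next = 0.0 if nxt == DST else max(Q[(nxt, a)] for a in GRAPH[nxt])
        Q[(node, nxt)] += ALPHA * (reward + GAMMA * best_next - Q[(node, nxt)])
        node = nxt
        if node == DST:
            break

# Greedy route extraction from the learned Q-table.
route, node = [SRC], SRC
while node != DST and len(route) < 10:
    node = max(GRAPH[node], key=lambda a: Q[(node, a)])
    route.append(node)
print(route)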
The liquid cooling system (LCS) of fuel cells is challenged by significant time delays, model uncertainties, pump-fan coupling, and frequent disturbances, leading to overshoot and control oscillations that degrade temperature regulation performance. To address these challenges, we propose a composite control scheme combining fuzzy logic and a variable-gain generalized super-twisting algorithm (VG-GSTA). First, a one-dimensional (1D) fuzzy logic controller (FLC) for the pump ensures stable coolant flow, while a two-dimensional (2D) FLC for the fan regulates the stack temperature near the reference value. The VG-GSTA is then introduced to eliminate steady-state errors, offering resistance to disturbances and minimizing control oscillations. The equilibrium optimizer is used to fine-tune the VG-GSTA parameters. Co-simulation verifies the effectiveness of the proposed method, demonstrating its advantages in terms of disturbance immunity, overshoot suppression, tracking accuracy, and response speed.
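The sliding-mode idea behind the VG-GSTA can be sketched with a standard fixed-gain super-twisting controller on a first-order surrogate plant. The variable gains, the fuzzy loops, and the real LCS thermal model are not reproduced; the plant, gains, and disturbance below are assumptions.

# Standard super-twisting controller regulating a first-order surrogate plant
# x' = -a*x + b*u + d toward a reference (illustrative parameters only).
import math

dt, a, b = 0.001, 0.1, 1.0          # time step and surrogate plant coefficients
k1, k2 = 1.5, 1.1                   # super-twisting gains
x, v = 20.0, 0.0                    # initial state and integral term
x_ref = 60.0                        # reference value

for k in range(30000):              # 30 s simulation
    t = k * dt
    d = 0.5 * math.sin(0.5 * t)                        # bounded disturbance
    s = x - x_ref                                      # sliding variable
    u = -k1 * math.sqrt(abs(s)) * math.copysign(1.0, s) + v
    v += -k2 * math.copysign(1.0, s) * dt              # discontinuous integral term
    x += (-a * x + b * u + d) * dt                     # Euler step of the plant
print(f"steady-state error: {x - x_ref:+.4f}")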
For the deployment of wireless networks in two-dimensional outdoor campus spaces, and aiming at the problem of efficiently covering the monitoring area with a limited number of access points (APs), this paper proposes a multi-objective optimization deployment method based on a virtual-force fusion bat algorithm (VFBA), using the classical four-node regular distribution as an entry point. A Lévy flight strategy is introduced for bat position updating, which helps maintain population diversity, mitigates premature convergence, avoids over-aggregation of individuals in locally optimal regions, and enhances global search; the virtual force algorithm simulates attraction and repulsion between individuals, which enables individual bats to precisely locate the optimal solution within the search space. At the same time, the fusion effect of the virtual force prompts the bats to move faster toward potential optimal solutions. To validate the effectiveness of the fused algorithm, benchmark test functions are selected for simulation testing. Finally, the simulation results verify that the VFBA achieves superior coverage and effectively reduces node redundancy compared with three other regular layout methods. The VFBA also shows better coverage results when compared with other optimization algorithms.
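Lévy-flight position perturbations of the kind referred to above are commonly generated with Mantegna's algorithm, sketched below. The exponent beta = 1.5 and the 0.01 step scale are typical choices used here as assumptions, not the VFBA's exact parameters.

# Mantegna's algorithm for heavy-tailed Levy-flight steps, plus a sample position update.
import math
import numpy as np

def levy_step(dim, beta=1.5, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)        # heavy-tailed step vector

# Example: nudge a bat position relative to the current best with a Levy-scaled move.
rng = np.random.default_rng(0)
position = np.array([2.0, -1.0, 0.5])
best = np.zeros(3)
position = position + 0.01 * levy_step(3, rng=rng) * (position - best)
print(position)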
The traditional A* algorithm exhibits low efficiency in the path planning of unmanned surface vehicles (USVs). In addition, the planned path contains numerous redundant inflection waypoints, and its security is low, which is not conducive to the control of the USV and also affects navigation safety. In this paper, these problems were addressed through the following improvements. First, the path search angle and security were comprehensively considered, and a security expansion strategy for nodes based on a 5×5 neighborhood was proposed. The A* search neighborhood was expanded from 3×3 to 5×5, and safe nodes were screened out for extension via the node security expansion strategy. This algorithm can also optimize path search angles while improving path security. Second, the distance from the current node to the target node was introduced into the heuristic function, improving the efficiency of the A* algorithm, and the path was smoothed using the Floyd algorithm. To dynamically adjust the weights and improve the efficiency of the dynamic-window approach (DWA), the distance from the USV to the target point was introduced into the evaluation function of the DWA algorithm. Finally, combined with the local target point selection strategy, the optimized DWA algorithm was used for local path planning. The experimental results show that the fusion algorithm plans a smooth and safe path, successfully avoids dynamic obstacles, and is effective and feasible for USV path planning.
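The 5×5 neighborhood expansion with safe-node screening can be sketched as a successor generator: candidates are drawn from the full 5×5 block around the current cell and kept only if they are in bounds, obstacle-free, and not adjacent to any obstacle. The grid and the "no obstacle in the surrounding 3×3" safety rule are illustrative assumptions, not the paper's exact screening criterion.

# Sketch of 5x5 neighborhood expansion with safe-node screening for grid search.
GRID = [
    "........",
    "...##...",
    "...##...",
    "........",
    "........",
]
ROWS, COLS = len(GRID), len(GRID[0])

def is_obstacle(r, c):
    return not (0 <= r < ROWS and 0 <= c < COLS) or GRID[r][c] == "#"

def is_safe(r, c):
    # Security expansion: reject nodes with any obstacle (or boundary) in their 3x3 surroundings.
    return not any(is_obstacle(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1))

def safe_neighbors_5x5(r, c):
    for dr in range(-2, 3):
        for dc in range(-2, 3):
            if (dr, dc) == (0, 0):
                continue
            nr, nc = r + dr, c + dc
            if not is_obstacle(nr, nc) and is_safe(nr, nc):
                yield nr, nc

print(sorted(safe_neighbors_5x5(3, 2)))   # screened successors of cell (3, 2)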
This study evaluates the undrained uplift capacity of open-caisson anchors embedded in anisotropic clay using Finite Element Limit Analysis (FELA) and a hybrid machine learning framework. The FELA simulations investigate the influence of the radius ratio (R/B), anisotropic ratio (re), interface roughness factor (α), and inclination angle (β). Specifically, the results reveal that increasing β significantly enhances Nc, especially as soil behavior approaches isotropy. Higher α improves resistance at steeper inclinations by mobilizing greater interface shear. Nc increases with re, reflecting enhanced strength under isotropic conditions. To enhance predictive accuracy and generalization, a hybrid machine learning model was developed by integrating Extreme Gradient Boosting (XGBoost) with a Genetic Algorithm (GA) and a Mutation-Based Genetic Algorithm (MGA) for hyperparameter tuning. Among the models, MGA-XGBoost outperformed GA-XGBoost, achieving higher predictive accuracy (R² = 0.996 training, 0.993 testing). Furthermore, SHAP analysis consistently identified the anisotropic ratio (re) as the most influential factor in predicting uplift capacity, followed by the interface roughness factor (α), inclination angle (β), and radius ratio (R/B). The proposed framework serves as a scalable decision-support tool adaptable to various soil types and foundation geometries, offering a more efficient and data-driven approach to uplift-resistant design in anisotropic cohesive soils.
The Bat algorithm, a metaheuristic optimization technique inspired by the foraging behaviour of bats, has been employed to tackle optimization problems. Known for its ease of implementation, parameter tunability, and strong global search capabilities, this algorithm finds application across diverse optimization problem domains. However, in the face of increasingly complex optimization challenges, the Bat algorithm encounters certain limitations, such as slow convergence and sensitivity to initial solutions. In order to tackle these challenges, the present study incorporates a range of optimization components into the Bat algorithm, thereby proposing a variant called PKEBA. A projection screening strategy is implemented to mitigate its sensitivity to initial solutions, thereby enhancing the quality of the initial solution set. A kinetic adaptation strategy reforms exploration patterns, while an elite communication strategy enhances group interaction to keep the algorithm from being trapped in local optima. Subsequently, the effectiveness of the proposed PKEBA is rigorously evaluated. Testing encompasses 30 benchmark functions from IEEE CEC2014, featuring ablation experiments and comparative assessments against classical algorithms and their variants. Moreover, real-world engineering problems are employed as further validation. The results conclusively demonstrate that PKEBA exhibits superior convergence and precision compared to existing algorithms.
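For context, a textbook Bat algorithm minimizing the sphere function is sketched below; this is the baseline that PKEBA's projection screening, kinetic adaptation, and elite communication strategies would extend. Population size, loudness and pulse-rate schedules, and bounds are common defaults used here as assumptions.

# Textbook bat algorithm (Yang-style) minimizing the sphere function.
import numpy as np

rng = np.random.default_rng(3)
DIM, POP, ITERS = 10, 30, 500
LB, UB = -10.0, 10.0
F_MIN, F_MAX = 0.0, 2.0

sphere = lambda x: float(np.sum(x ** 2))

x = rng.uniform(LB, UB, (POP, DIM))          # positions
v = np.zeros((POP, DIM))                     # velocities
loud = np.full(POP, 1.0)                     # loudness A_i
rate = np.full(POP, 0.5)                     # pulse emission rate r_i
fit = np.array([sphere(xi) for xi in x])
best = x[np.argmin(fit)].copy()

for t in range(ITERS):
    for i in range(POP):
        freq = F_MIN + (F_MAX - F_MIN) * rng.random()
        v[i] += (x[i] - best) * freq
        cand = np.clip(x[i] + v[i], LB, UB)
        if rng.random() > rate[i]:           # local random walk around the best bat
            cand = np.clip(best + 0.01 * loud.mean() * rng.normal(size=DIM), LB, UB)
        f_cand = sphere(cand)
        if f_cand <= fit[i] and rng.random() < loud[i]:   # loudness-gated acceptance
            x[i], fit[i] = cand, f_cand
            loud[i] *= 0.9                                # loudness decreases
            rate[i] = 0.5 * (1 - np.exp(-0.9 * (t + 1)))  # pulse rate increases
        if f_cand < sphere(best):
            best = cand.copy()

print(f"best sphere value after {ITERS} iterations: {sphere(best):.3e}")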
The advent of microgrids in modern energy systems heralds a promising era of resilience, sustainability, and efficiency. Within the realm of grid-tied microgrids, the selection of an optimal optimization algorithm is critical for effective energy management, particularly in economic dispatching. This study compares the performance of Particle Swarm Optimization (PSO) and Genetic Algorithms (GA) in microgrid energy management systems, implemented using MATLAB tools. Through a comprehensive review of the literature and simulations conducted in MATLAB, the study analyzes performance metrics, convergence speed, and the overall efficacy of GA and PSO, with a focus on economic dispatching tasks. Notably, a significant distinction emerges between the cost curves generated by the two algorithms for microgrid operation, with the PSO algorithm consistently resulting in lower costs due to its effective economic dispatching capabilities. Specifically, the utilization of the PSO approach could potentially lead to substantial savings on the power bill, amounting to approximately $15.30 in this evaluation. The findings provide insights into the strengths and limitations of each algorithm within the complex dynamics of grid-tied microgrids, thereby assisting stakeholders and researchers in arriving at informed decisions. This study contributes to the discourse on sustainable energy management by offering actionable guidance for the advancement of grid-tied microgrid technologies through MATLAB-implemented optimization algorithms.
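A minimal PSO for a toy economic dispatch problem sketches the kind of task compared above: three generators with quadratic cost curves must meet a fixed demand, enforced here by a quadratic penalty. The cost coefficients, demand, limits, and PSO parameters are illustrative assumptions, and the sketch is in Python whereas the study uses MATLAB.

# Minimal PSO for a toy three-generator economic dispatch with a demand-balance penalty.
import numpy as np

rng = np.random.default_rng(4)
A = np.array([0.008, 0.010, 0.012])      # $/MW^2 h
B = np.array([7.0, 8.5, 9.0])            # $/MWh
C = np.array([200.0, 150.0, 120.0])      # $/h
P_MIN, P_MAX, DEMAND = 10.0, 125.0, 250.0

def cost(p):
    balance_penalty = 1e3 * (p.sum() - DEMAND) ** 2
    return float(np.sum(A * p ** 2 + B * p + C) + balance_penalty)

POP, ITERS, W, C1, C2 = 40, 300, 0.7, 1.5, 1.5
pos = rng.uniform(P_MIN, P_MAX, (POP, 3))
vel = np.zeros((POP, 3))
pbest = pos.copy()
pbest_val = np.array([cost(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(ITERS):
    r1, r2 = rng.random((POP, 3)), rng.random((POP, 3))
    vel = W * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, P_MIN, P_MAX)
    vals = np.array([cost(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print(f"dispatch (MW): {np.round(gbest, 1)}, hourly cost: ${cost(gbest):.2f}")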
Nowadays, abnormal traffic detection for Software-Defined Networking (SDN) faces the challenges of large data volume and high dimensionality. Since traditional machine learning-based detection methods suffer from data redundancy, a Metaheuristic Algorithm (MA) is introduced to select features before machine learning in order to reduce the dimensionality of the data. Since the Tyrannosaurus Optimization Algorithm (TROA) has the advantages of few parameters, simple implementation, and fast convergence, and shows good results in feature selection, TROA can be applied to abnormal traffic detection for SDN. However, TROA suffers from insufficient global search capability, is easily trapped in local optima, and has poor search accuracy. This paper therefore improves TROA, yielding the Improved Tyrannosaurus Optimization Algorithm (ITROA), and proposes a metaheuristic-driven abnormal traffic detection model for SDN based on ITROA. Finally, the validity of ITROA is verified on benchmark functions and the UCI dataset, and feature selection is performed on the InSDN dataset by ITROA and other MAs to obtain an optimized feature subset for SDN abnormal traffic detection. The experiments show that the proposed ITROA outperforms the compared MAs in the metaheuristic-driven model for SDN, achieving an accuracy of 99.37% on binary classification and 96.73% on multi-classification.
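The wrapper-style feature-selection idea can be sketched as follows: a binary mask selects a feature subset whose fitness is the cross-validated accuracy of a simple classifier, minus a small size penalty. The random-search loop below is only a stand-in for ITROA, and the dataset, classifier, penalty weight, and 200-trial budget are illustrative assumptions.

# Wrapper feature selection: score binary masks by cross-validated classifier accuracy.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)
rng = np.random.default_rng(5)

def fitness(mask):
    if not mask.any():
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(5), X[:, mask], y, cv=5).mean()
    return acc - 0.01 * mask.sum() / mask.size   # small penalty on subset size

full_mask = np.ones(X.shape[1], dtype=bool)
best_mask, best_fit = full_mask, fitness(full_mask)
for _ in range(200):                             # metaheuristic stand-in: random masks
    mask = rng.random(X.shape[1]) < 0.5
    f = fitness(mask)
    if f > best_fit:
        best_mask, best_fit = mask, f

print(f"selected {best_mask.sum()} of {best_mask.size} features, fitness {best_fit:.4f}")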
This paper presents an optimized strategy for multiple integrations of photovoltaic distributed generation (PV-DG) within radial distribution power systems. The proposed methodology focuses on identifying the optimal allocation and sizing of multiple PV-DG units to minimize power losses, using a probabilistic PV model and time-series power flow analysis. Addressing the uncertainties in PV output due to weather variability and diurnal cycles is critical: a probabilistic assessment offers a more robust analysis of DG integration's impact on the grid, potentially leading to more reliable system planning. The presented approach employs a genetic algorithm (GA) together with a deterministic PV output profile and a probabilistic PV generation profile based on experimental measurements of one year of solar radiation in Cairo, Egypt. The proposed algorithms are validated using a co-simulation framework that integrates MATLAB and OpenDSS, enabling analysis on a 33-bus test system. This framework can act as a guideline for creating other co-simulation algorithms to enhance computing platforms for contemporary distribution systems within the smart grid concept. The paper presents comparisons with previous research studies and reports various interesting findings, such as how the number of hours considered when developing the probabilistic model leads to different results.
Data clustering is an essential technique for analyzing complex datasets and continues to be a central research topic in data analysis. Traditional clustering algorithms, such as K-means, are widely used due to their simplicity and efficiency. This paper proposes a novel Spiral Mechanism-Optimized Phasmatodea Population Evolution Algorithm (SPPE) to improve clustering performance. The SPPE algorithm introduces several enhancements to the standard Phasmatodea Population Evolution (PPE) algorithm. Firstly, a Variable Neighborhood Search (VNS) factor is incorporated to strengthen the local search capability and foster population diversity. Secondly, a position update model, incorporating a spiral mechanism, is designed to improve the algorithm's global exploration and convergence speed. Finally, a dynamic balancing factor, guided by fitness values, adjusts the search process to balance exploration and exploitation effectively. The performance of SPPE is first validated on the CEC2013 benchmark functions, where it demonstrates excellent convergence speed and superior optimization results compared to several state-of-the-art metaheuristic algorithms. To further verify its practical applicability, SPPE is combined with the K-means algorithm for data clustering and tested on seven datasets. Experimental results show that SPPE-K-means improves clustering accuracy, reduces dependency on initialization, and outperforms other clustering approaches. This study highlights SPPE's robustness and efficiency in solving both optimization and clustering challenges, making it a promising tool for complex data analysis tasks.
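The metaheuristic-plus-K-means idea rests on the fact that K-means quality depends strongly on its initial centroids. The sketch below scores candidate centroid sets (here just randomly sampled points, standing in for SPPE individuals) by K-means inertia and keeps the best initialization; the iris data, k = 3, and the 20-candidate budget are illustrative assumptions.

# Sketch: pick the best K-means initialization among candidate centroid sets.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.cluster import KMeans

X, _ = load_iris(return_X_y=True)
rng = np.random.default_rng(6)
k = 3

best_inertia, best_model = np.inf, None
for _ in range(20):                               # candidate initializations
    init = X[rng.choice(len(X), size=k, replace=False)]
    model = KMeans(n_clusters=k, init=init, n_init=1, max_iter=300).fit(X)
    if model.inertia_ < best_inertia:             # lower within-cluster variance is better
        best_inertia, best_model = model.inertia_, model

print(f"best inertia over 20 candidate initializations: {best_inertia:.2f}")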