The flow shop scheduling problem is important for the manufacturing industry. Effective flow shop scheduling can bring great benefits to the industry. However, there is little research on Distributed Hybrid Flow Shop Problems (DHFSP) solved by learning-assisted meta-heuristics. This work addresses a DHFSP with the objective of minimizing the maximum completion time (makespan). First, a mathematical model is developed for the concerned DHFSP. Second, four Q-learning-assisted meta-heuristics are proposed, namely the genetic algorithm (GA), artificial bee colony algorithm (ABC), particle swarm optimization (PSO), and differential evolution (DE). According to the nature of DHFSP, six local search operations are designed for finding high-quality solutions in the local space. Instead of random selection, Q-learning assists the meta-heuristics in choosing appropriate local search operations during iterations. Finally, comprehensive numerical experiments based on 60 cases are conducted to assess the effectiveness of the proposed algorithms. The experimental results and discussions show that using Q-learning to select appropriate local search operations is more effective than the random strategy. To verify the competitiveness of the Q-learning-assisted meta-heuristics, they are compared with the improved iterated greedy algorithm (IIG), which was also designed for solving DHFSP. The Friedman test is applied to the results of the five algorithms. It is concluded that all four Q-learning-assisted meta-heuristics outperform IIG, and that the Q-learning-assisted PSO is the most competitive.
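The operator-selection loop described above can be sketched with a single-state Q-table. This is a minimal illustration under stated assumptions: the state space is collapsed to one state, the policy is epsilon-greedy, and the reward is a placeholder in which one operator is always the most useful; the paper's state, action, and reward design are richer.

```python
import random

# Minimal sketch: epsilon-greedy Q-learning over a set of local-search
# operators. The reward signal here is illustrative; in the full algorithm
# it would be the improvement (e.g., in makespan) the operator achieves.
N_OPS = 6                      # six local-search operators, as in the paper
q = [0.0] * N_OPS              # Q-value per operator (single collapsed state)
alpha, gamma, eps = 0.1, 0.9, 0.3

def select_operator():
    """Epsilon-greedy choice: explore randomly, else pick the best Q-value."""
    if random.random() < eps:
        return random.randrange(N_OPS)
    return max(range(N_OPS), key=lambda i: q[i])

def update_q(op, reward):
    """Single-state Q-update: Q(op) += alpha * (r + gamma * max Q - Q(op))."""
    q[op] += alpha * (reward + gamma * max(q) - q[op])

# Toy loop: pretend operator 2 always yields the improvement.
random.seed(0)
for _ in range(500):
    op = select_operator()
    reward = 1.0 if op == 2 else 0.0
    update_q(op, reward)
```

After enough iterations the table concentrates on the operator that actually pays off, which is exactly the advantage over uniform random selection that the experiments measure.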
Cloud computing infrastructure has been evolving as a cost-effective platform for providing computational resources in the form of high-performance computing as a service (HPCaaS) to users for executing HPC applications. However, the broader use of Cloud services and the rapid increase in the size and capacity of Cloud data centers bring a remarkable rise in energy consumption, leading to a significant rise in system provider expenses and carbon emissions into the environment. Besides this, users have become more demanding in terms of Quality-of-Service (QoS) expectations regarding execution time, budget cost, utilization, and makespan. This situation calls for the design of a task scheduling policy that ensures efficient task sequencing and allocation of computing resources to tasks to meet the trade-off between QoS promises and service provider requirements. Moreover, task scheduling in the Cloud is a prevalent NP-hard problem. Motivated by these concerns, this paper introduces and implements a QoS-aware energy-efficient scheduling policy, called CSPSO, for scheduling tasks in Cloud systems to reduce the energy consumption of cloud resources and minimize the makespan of the workload. The proposed multi-objective CSPSO policy hybridizes the search qualities of two robust metaheuristics, viz. cuckoo search (CS) and particle swarm optimization (PSO), to overcome the slow convergence and lack of diversity of the standard CS algorithm. A fitness-aware resource allocation (FARA) heuristic was developed and used by the proposed policy to allocate resources to tasks efficiently. A velocity update mechanism for cuckoo individuals is designed and incorporated in the proposed CSPSO policy. Further, the proposed scheduling policy has been implemented in the CloudSim simulator and tested with real supercomputing workload traces. The comparative analysis validated that the proposed scheduling policy can produce efficient schedules with better performance than other well-known heuristic and meta-heuristic scheduling policies.
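The idea of giving cuckoo individuals a velocity can be illustrated with a toy hybrid. This is a sketch under stated assumptions, not the paper's formulation: the objective is a stand-in sphere function rather than a scheduling model, the PSO coefficients are conventional defaults, and the Lévy-flight weight is arbitrary.

```python
import math, random

# Sketch of a CS/PSO hybrid: each individual keeps a velocity updated
# PSO-style toward personal and global bests, plus a small Levy-flight
# term that preserves cuckoo-search-like exploratory jumps.
def sphere(x):
    return sum(v * v for v in x)

def levy_step(beta=1.5):
    """Mantegna's algorithm for a Levy-distributed step length."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    return random.gauss(0, sigma) / abs(random.gauss(0, 1)) ** (1 / beta)

random.seed(1)
dim, n, w, c1, c2 = 2, 10, 0.7, 1.5, 1.5
pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
vel = [[0.0] * dim for _ in range(n)]
pbest = [p[:] for p in pos]
gbest = min(pos, key=sphere)[:]
init_best = sphere(gbest)

for _ in range(100):
    for i in range(n):
        for d in range(dim):
            vel[i][d] = (w * vel[i][d]
                         + c1 * random.random() * (pbest[i][d] - pos[i][d])
                         + c2 * random.random() * (gbest[d] - pos[i][d])
                         + 0.01 * levy_step())      # CS-style jump
            pos[i][d] += vel[i][d]
        if sphere(pos[i]) < sphere(pbest[i]):
            pbest[i] = pos[i][:]
            if sphere(pbest[i]) < sphere(gbest):
                gbest = pbest[i][:]
```

The PSO-style memory pulls the population together quickly, while the heavy-tailed Lévy term counters the premature convergence attributed to standard CS.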
One of the most common kinds of cancer is breast cancer. Its early detection may help lower its overall mortality rate. In this paper, we propose a novel approach for detecting and classifying breast cancer regions in thermal images. The proposed approach starts by preprocessing the input images and segmenting the significant regions of interest. In addition, to properly train the machine learning models, data augmentation is applied to increase the number of segmented regions using various scaling ratios. On the other hand, to extract the relevant features from the breast cancer cases, a set of deep neural networks (VGGNet, ResNet-50, AlexNet, and GoogLeNet) is employed. The resulting set of features is processed using the binary dipper throated algorithm to select the most effective features that can realize high classification accuracy. The selected features are used to train a neural network to finally classify the thermal images of breast cancer. To achieve accurate classification, the parameters of the employed neural network are optimized using the continuous dipper throated optimization algorithm. Experimental results show the effectiveness of the proposed approach in classifying breast cancer cases when compared to other recent approaches in the literature. Moreover, several experiments were conducted to compare the performance of the proposed approach with the other approaches, and their results emphasized its superiority.
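Wrapper-style binary feature selection of the kind performed by the binary dipper throated algorithm can be sketched with a bit mask per feature and a fitness that trades accuracy against feature count. The optimizer below is a plain bit-flip hill climb and the accuracy function is a toy stand-in; neither reproduces the paper's algorithm.

```python
import random

# Sketch of wrapper-style binary feature selection: one bit per feature,
# fitness = weighted accuracy minus a penalty on the selected-set size.
# The "informative" set and the accuracy function are invented for
# illustration; a real wrapper would score masks with a trained classifier.
N_FEATURES = 8
INFORMATIVE = {1, 3, 6}          # hypothetical useful features

def toy_accuracy(mask):
    hit = len(INFORMATIVE & {i for i, b in enumerate(mask) if b})
    return hit / len(INFORMATIVE)

def fitness(mask, w=0.9):
    # Reward accuracy, lightly penalize the number of selected features.
    return w * toy_accuracy(mask) - (1 - w) * sum(mask) / N_FEATURES

random.seed(3)
best = [random.randint(0, 1) for _ in range(N_FEATURES)]
for _ in range(300):
    cand = best[:]
    cand[random.randrange(N_FEATURES)] ^= 1   # flip one bit
    if fitness(cand) > fitness(best):
        best = cand
```

On this toy landscape the search settles on exactly the informative features: adding a useful bit or dropping a useless one always raises the fitness.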
Cloud computing has emerged as a new style of computing in distributed environments. Efficient and dependable workflow scheduling is crucial for achieving high performance and integrating with enterprise systems. As an effective security-services aggregation methodology, Trust Workflow Technology (TWT) has been used to construct composite services. However, in the cloud environment, the existing closed network services are maintained and operated by third-party organizations or enterprises. Therefore, service-oriented trust strategies must be considered in workflow scheduling. Such trust-based workflow scheduling algorithms combine trust policies and strategies with heuristic workflow scheduling to overcome threats to the application. As the main contribution of this work, trust-based meta-heuristic workflow scheduling (TMWS) is proposed. The TMWS algorithm improves the efficiency and reliability of operation in the cloud system, and the results show that the TMWS approach is effective and feasible.
Blasting is well known as an effective method for fragmenting or moving rock in open-pit mines. To evaluate the quality of blasting, the size of the rock distribution is used as a critical criterion in blasting operations. A high percentage of oversized rocks generated by blasting operations can lead to economic and environmental damage. Therefore, this study proposed four novel intelligent models to predict the size of the rock distribution in mine blasting in order to optimize the blasting parameters, as well as the efficiency of blasting operations in open mines. Accordingly, a nature-inspired algorithm (the firefly algorithm, FFA) and different machine learning algorithms (gradient boosting machine (GBM), support vector machine (SVM), Gaussian process (GP), and artificial neural network (ANN)) were combined for this aim, abbreviated as FFA-GBM, FFA-SVM, FFA-GP, and FFA-ANN, respectively. Subsequently, the predictions of these models were compared with each other using three statistical indicators (mean absolute error, root-mean-squared error, and correlation coefficient) and the color intensity method. For developing and simulating the size of rock in blasting operations, 136 blasting events with their images were collected and analyzed with the Split-Desktop software. Of these, 111 events were randomly selected for the development and optimization of the models, and the remaining 25 blasting events were used to confirm the accuracy of the proposed models. Herein, blast design parameters were regarded as input variables to predict the size of rock in blasting operations. Finally, the obtained results revealed that the FFA is a robust optimization algorithm for estimating rock fragmentation in bench blasting. Among the models developed in this study, FFA-GBM provided the highest accuracy in predicting the size of fragmented rocks, while the other techniques (FFA-SVM, FFA-GP, and FFA-ANN) yielded lower computational stability and efficiency. Hence, the FFA-GBM model can be used as a powerful and precise soft computing tool in practical engineering cases aiming to improve the quality of blasting and rock fragmentation.
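The FFA's core attraction move is compact enough to sketch. The objective below is a stand-in for the model error the paper minimizes, and the parameter values are conventional defaults rather than the paper's settings.

```python
import math, random

# Sketch of the firefly algorithm's attraction move: each firefly drifts
# toward every brighter (lower-error) one, with attractiveness decaying
# with squared distance, plus a cooled random walk.
def objective(x):                # stand-in for e.g. model prediction error
    return sum(v * v for v in x)

def firefly_step(pop, beta0=1.0, gamma=1.0, alpha=0.1):
    fit = [objective(x) for x in pop]
    new = [x[:] for x in pop]
    for i in range(len(pop)):
        for j in range(len(pop)):
            if fit[j] < fit[i]:                       # j is brighter
                r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                beta = beta0 * math.exp(-gamma * r2)  # attractiveness
                for d in range(len(pop[i])):
                    new[i][d] += (beta * (pop[j][d] - new[i][d])
                                  + alpha * (random.random() - 0.5))
    return new

random.seed(2)
pop = [[random.uniform(-3, 3) for _ in range(2)] for _ in range(15)]
init_best = min(objective(x) for x in pop)
for t in range(60):
    pop = firefly_step(pop, alpha=0.1 * 0.95 ** t)    # cool the random walk
best = min(pop, key=objective)
```

In the FFA-GBM setting, each "position" would encode GBM hyperparameters and the brightness would come from validation error, so the expensive model is trained once per firefly per generation.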
With the increasing demand for surgeries in hospitals, surgery scheduling problems have attracted extensive attention. This study focuses on solving a surgery scheduling problem with setup times. First, a mathematical model is created to simultaneously minimize the maximum completion time (makespan) of all surgeries and the patient waiting time. The extra time caused by the fatigue effect of doctors' long working hours is included in the surgery time. Second, four meta-heuristics are improved to address the problem. Three novel strategies are designed to improve the quality of the initial solutions. To improve the convergence of the algorithms, seven local search operators are proposed based on the characteristics of surgery scheduling problems. Third, Q-learning is used to dynamically choose the optimal local search operator for the current state in each iteration. Finally, by comparing the experimental results on 30 instances, the effectiveness of the Q-learning-based local search strategy is verified. Among all the compared algorithms, the improved artificial bee colony (ABC) algorithm with Q-learning-based local search is the most competitive.
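The makespan objective with setup times and a fatigue effect can be illustrated for a single operating room. The linear 5%-per-position fatigue factor below is a placeholder; the paper's fatigue model is more detailed.

```python
# Sketch: makespan of one operating room's surgery sequence with setup
# times, plus a simple fatigue factor that lengthens later surgeries.
def room_makespan(durations, setups, fatigue_rate=0.05):
    """durations[i]: base time of the i-th surgery in the sequence;
    setups[i]: setup time before the i-th surgery."""
    t = 0.0
    for i, (d, s) in enumerate(zip(durations, setups)):
        t += s                            # prepare the room
        t += d * (1 + fatigue_rate * i)   # later surgeries take longer
    return t

# Three surgeries of 60/90/45 minutes with 15-minute setups:
mk = room_makespan([60, 90, 45], [15, 15, 15])        # 249.0 minutes
mk_reversed = room_makespan([45, 90, 60], [15, 15, 15])  # 250.5 minutes
```

Reordering the same surgeries changes the makespan (250.5 vs. 249.0 minutes here) because fatigue penalizes whatever is scheduled late; this sequencing sensitivity is what the meta-heuristics exploit.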
Conventional empirical equations for estimating undrained shear strength (s_(u)) from piezocone penetration test (CPTu) data, without incorporating soil physical properties, often lack the accuracy and robustness required for geotechnical site investigations. This study introduces a hybrid virus colony search (VCS) algorithm that integrates the standard VCS algorithm with a mutation-based search mechanism to develop high-performance XGBoost learning models to address this limitation. A dataset of 372 seismic CPTu records and corresponding soil physical properties from 26 geotechnical projects in Jiangsu Province, China, was collected for model development. Comparative evaluations demonstrate that the proposed hybrid VCS-XGBoost model exhibits superior performance compared to standard meta-heuristic-algorithm-based XGBoost models. The results highlight that the consideration of soil physical properties significantly improves the predictive accuracy of s_(u), emphasizing the importance of considering additional soil information beyond CPTu data for accurate s_(u) estimation.
The non-invasive evaluation of the heart through electrocardiography (ECG) has played a key role in detecting heart disease. The analysis of ECG signals requires years of learning and experience to interpret and extract useful information from them. Thus, a computerized system is needed to classify ECG signals effectively and with more accurate results. Abnormal heart rhythms are called arrhythmias and can cause sudden cardiac death. In this work, a Computerized Abnormal Heart Rhythms Detection (CAHRD) system is developed using ECG signals. It consists of four stages: preprocessing, feature extraction, feature optimization and classification. First, the Pan–Tompkins algorithm is employed in the preprocessing stage to detect the envelope of the Q, R and S waves. It uses a recursive filter to eliminate muscle noise, T-wave interference and baseline wander. As the analysis of the ECG signal in the spatial domain does not provide a complete description of the signal, the feature extraction stage uses frequency contents obtained from multiple wavelet filters (biorthogonal, Symlet and Daubechies) at different resolution levels. Then, Black Widow Optimization (BWO) is applied to optimize the hybrid wavelet features in the feature optimization stage. Finally, a kernel-based Support Vector Machine (SVM) is employed to classify heartbeats into five classes. In the SVM, Radial Basis Function (RBF), polynomial and linear kernels are used. A total of ~15,000 ECG signals were obtained from the Massachusetts Institute of Technology–Beth Israel Hospital (MIT-BIH) arrhythmia database for performance evaluation of the proposed CAHRD system. Results show that the proposed CAHRD system is a powerful tool for ECG analysis. It correctly classifies five classes of heartbeats with 99.91% accuracy using an RBF kernel with 2nd-level wavelet coefficients. The CAHRD system achieves an improvement of ~6% over random projections with the ensemble SVM approach and ~2% over morphological and ECG-segment-based features with the RBF classifier.
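The RBF kernel at the heart of the best-performing classifier is simple to state: k(x, z) = exp(-gamma * ||x - z||^2). A minimal sketch, with toy vectors standing in for the wavelet-coefficient features (support vectors, weights, and bias below are invented for illustration, not learned):

```python
import math

# RBF kernel and the corresponding SVM decision value: a weighted sum of
# kernel evaluations against the support vectors plus a bias.
def rbf_kernel(x, z, gamma=0.5):
    d2 = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-gamma * d2)

def decision(x, support, alphas, bias, gamma=0.5):
    return sum(a * rbf_kernel(x, s, gamma)
               for a, s in zip(alphas, support)) + bias

k_same = rbf_kernel([1.0, 2.0], [1.0, 2.0])   # identical points -> 1.0
k_far = rbf_kernel([0.0, 0.0], [3.0, 4.0])    # distance 5 -> exp(-12.5)
val = decision([0.0, 0.0], [[0.0, 0.0], [3.0, 4.0]], [2.0, 1.0], -1.0)
```

Because the kernel decays with distance, gamma controls how local the decision boundary is; multi-class heartbeat labeling is then built from such binary decision functions (e.g., one-vs-rest).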
Parameter extraction of photovoltaic (PV) models is crucial for the planning, optimization, and control of PV systems. Although some methods using meta-heuristic algorithms have been proposed to determine these parameters, the robustness of the solutions obtained by these methods faces great challenges when the complexity of the PV model increases, and unstable results will affect the reliable operation and maintenance strategies of PV systems. In response to this challenge, an improved RIME optimization algorithm with enhanced exploration and exploitation, termed TERIME, is proposed for robust and accurate parameter identification of various PV models. Specifically, the differential evolution mutation operator is integrated into the exploration phase to enhance population diversity. Meanwhile, a new exploitation strategy incorporating randomization and neighborhood strategies simultaneously is developed to maintain the balance of exploitation width and depth. The TERIME algorithm is applied to estimate the optimal parameters of the single-diode, double-diode, and triple-diode models combined with the Lambert-W function for three PV cell and module types: RTC France, Photowatt-PWP201 and S75. According to a statistical analysis of 100 runs, the proposed algorithm achieves more accurate and robust parameter estimates than other techniques for various PV models under varying environmental conditions. All of our source code is publicly available at https://github.com/dirge1/TERIME.
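The DE mutation operator that TERIME integrates into its exploration phase has a compact form, v = x_r1 + F * (x_r2 - x_r3), with r1, r2, r3 distinct members of the population. A minimal sketch with an illustrative four-member population (F and the vectors are arbitrary):

```python
import random

# DE/rand/1 mutation: build a mutant from three distinct population
# members other than the current individual i.
def de_mutant(pop, i, f=0.5):
    r1, r2, r3 = random.sample([j for j in range(len(pop)) if j != i], 3)
    return [pop[r1][d] + f * (pop[r2][d] - pop[r3][d])
            for d in range(len(pop[i]))]

random.seed(5)
pop = [[0.0, 0.0], [1.0, 2.0], [2.0, 0.0], [4.0, 4.0]]
v = de_mutant(pop, 0)   # mutant for individual 0, drawn from members 1-3
```

Because the difference vector scales with the population's spread, the mutation automatically injects large steps while the population is diverse and smaller ones as it converges, which is why it helps the exploration phase.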
With the rapid advancements in technology and science, optimization theory and algorithms have become increasingly important. A wide range of real-world problems is classified as optimization challenges, and meta-heuristic algorithms have shown remarkable effectiveness in solving these challenges across diverse domains, such as machine learning, process control, and engineering design, showcasing their capability to address complex optimization problems. The Stochastic Fractal Search (SFS) algorithm is one of the most popular meta-heuristic optimization methods, inspired by the fractal growth patterns of natural materials. Since its introduction by Hamid Salimi in 2015, SFS has garnered significant attention from researchers and has been applied to diverse optimization problems across multiple disciplines. Its popularity can be attributed to several factors, including its simplicity, practical computational efficiency, ease of implementation, rapid convergence, high effectiveness, and ability to address single- and multi-objective optimization problems, often outperforming other established algorithms. This review paper offers a comprehensive and detailed analysis of the SFS algorithm, covering its standard version, modifications, hybridizations, and multi-objective implementations. The paper also examines several SFS applications across diverse domains, including power and energy systems, image processing, machine learning, wireless sensor networks, environmental modeling, economics and finance, and numerous engineering challenges. Furthermore, the paper critically evaluates the SFS algorithm's performance, benchmarking its effectiveness against recently published meta-heuristic algorithms. In conclusion, the review highlights key findings and suggests potential directions for future developments and modifications of the SFS algorithm.
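The diffusion process at the core of SFS can be sketched as Gaussian walks centred on the best point, with the step size shrinking over generations. This is a simplified single-walk variant under stated assumptions (one walk per particle, a stand-in objective); the full algorithm adds further walk variants and update processes.

```python
import math, random

# Sketch of SFS diffusion: each particle spawns a Gaussian walk around the
# best point, with a standard deviation proportional to log(g)/g times the
# particle-to-best distance, so steps shrink as generations g grow.
def objective(x):
    return sum(v * v for v in x)

def diffuse(point, best, g):
    """One Gaussian walk of the diffusion process at generation g >= 2."""
    sigma = [abs(math.log(g) / g * (p - b)) for p, b in zip(point, best)]
    return [random.gauss(b, s) for b, s in zip(best, sigma)]

random.seed(6)
pop = [[random.uniform(-4, 4) for _ in range(2)] for _ in range(20)]
best = min(pop, key=objective)
init_best = objective(best)
for g in range(2, 60):
    walks = [diffuse(p, best, g) for p in pop]
    pop = [min(pair, key=objective) for pair in zip(pop, walks)]  # greedy keep
    best = min(pop, key=objective)
```

The shrinking log(g)/g factor is what gives SFS its characteristic early exploration followed by fine local refinement.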
With the rapid development of the economy, air pollution caused by industrial expansion has caused serious harm to human health and social development. Therefore, establishing an effective air pollution concentration prediction system is of great scientific and practical significance for accurate and reliable predictions. This paper proposes a combined point–interval prediction system for pollutant concentration prediction by leveraging neural networks, a meta-heuristic optimization algorithm, and fuzzy theory. Fuzzy information granulation technology is used in data preprocessing to transform numerical sequences into fuzzy particles for comprehensive feature extraction. The golden jackal optimization algorithm is employed in the optimization stage to fine-tune model hyperparameters. In the prediction stage, an ensemble learning method combines the training results from multiple models to obtain final point predictions, while quantile regression and kernel density estimation methods produce interval predictions on the test set. Experimental results demonstrate that the combined model achieves a high coefficient of determination (R^(2)) of 99.3% and a maximum difference in mean absolute percentage error (MAPE) from the benchmark models of 12.6%. This suggests that the integrated learning system proposed in this paper can provide more accurate deterministic predictions as well as reliable uncertainty analysis compared to traditional models, offering a practical reference for air quality early warning.
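The interval-prediction step can be illustrated in miniature: wrap a point forecast in an interval taken from the empirical quantiles of held-out residuals. The paper uses quantile regression and kernel density estimation; a plain empirical quantile is the simplest stand-in, and the residuals below are invented for illustration.

```python
# Sketch: turn a point forecast into a prediction interval using empirical
# quantiles of past forecast residuals (actual - predicted).
def empirical_quantile(values, q):
    s = sorted(values)
    idx = min(len(s) - 1, max(0, int(q * len(s))))
    return s[idx]

residuals = [-3.0, -1.5, -0.5, 0.0, 0.2, 0.4, 1.0, 1.8, 2.5, 4.0]
lo = empirical_quantile(residuals, 0.05)   # -3.0
hi = empirical_quantile(residuals, 0.95)   #  4.0
point_forecast = 50.0                       # e.g., a pollutant concentration
interval = (point_forecast + lo, point_forecast + hi)   # (47.0, 54.0)
```

A kernel density estimate over the same residuals would smooth these quantiles, which is what gives the paper's intervals their calibrated coverage.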
Meta-heuristic evolutionary algorithms have become widely used for solving complex optimization problems. However, their effectiveness in real-world applications is often limited by the need for many evaluations, which can be both costly and time-consuming. This is especially true for large-scale transportation networks, where the size of the problem and the high computational cost can hinder an algorithm's performance. To address these challenges, recent research has focused on surrogate-assisted models, which aim to reduce the number of expensive evaluations and improve the efficiency of solving time-consuming optimization problems. This paper presents a new two-layer Surrogate-Assisted Fish Migration Optimization (SA-FMO) algorithm designed to tackle high-dimensional and computationally heavy problems. The global surrogate model offers a good approximation of the entire problem space, while the local surrogate model focuses on refining the solution near the current best option, improving local optimization. To test the effectiveness of the SA-FMO algorithm, we first conduct experiments using six benchmark functions in a 50-dimensional space. We then apply the algorithm to optimize urban rail transit routes, focusing on the Train Routing Optimization problem, which aims to improve operational efficiency and vehicle turnover under uneven passenger flow during transit disruptions. The results show that SA-FMO can effectively improve optimization outcomes in complex transportation scenarios.
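The screening role of a surrogate can be sketched with a cheap inverse-distance interpolator built from already-evaluated points: candidates are ranked by the surrogate and only the most promising one receives the expensive true evaluation. SA-FMO's two-layer global/local models are more refined than this stand-in, and the objective is a toy placeholder for a costly simulation.

```python
import random

# Sketch of surrogate-assisted screening with an inverse-distance-weighted
# interpolator over an archive of (point, true fitness) pairs.
def expensive(x):                       # stand-in for a costly simulation
    return sum(v * v for v in x)

def surrogate(x, archive):
    num = den = 0.0
    for xa, fa in archive:
        d2 = sum((a - b) ** 2 for a, b in zip(x, xa)) + 1e-12
        num += fa / d2
        den += 1.0 / d2
    return num / den

random.seed(7)
train = [[random.uniform(-2, 2) for _ in range(2)] for _ in range(12)]
archive = [(p, expensive(p)) for p in train]          # 12 true evaluations
cands = [[random.uniform(-2, 2) for _ in range(2)] for _ in range(30)]
pick = min(cands, key=lambda c: surrogate(c, archive))  # 30 cheap screenings
archive.append((pick, expensive(pick)))                 # only 1 more true eval
```

Thirty candidates cost one true evaluation instead of thirty, which is precisely the saving that makes surrogate assistance attractive for expensive transportation models.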
Cyclic-system-based optimization (CSBO) is an innovative metaheuristic algorithm (MHA) that draws inspiration from the workings of the human blood circulatory system. However, CSBO still faces challenges in solving complex optimization problems, including limited convergence speed and a propensity to get trapped in local optima. To further improve the performance of CSBO, this paper proposes improved cyclic-system-based optimization (ICSBO). First, in venous blood circulation, an adaptive parameter that changes with evolution is introduced to improve the balance between convergence and diversity in this stage and enhance exploration of the search space. Second, the simplex method strategy is incorporated into the systemic and pulmonary circulations, improving their update formulas; a learning strategy aimed at the optimal individual, combined with a straightforward opposition-based learning approach, is employed to enhance population convergence while preserving diversity. Finally, a novel external archive utilizing a diversity supplementation mechanism is introduced to enhance population diversity, maximize the use of superior genes, and lower the risk of the population being trapped in local optima. Testing on the CEC2017 benchmark set shows that, compared with the original CSBO and eight other outstanding MHAs, ICSBO demonstrates remarkable advantages in convergence speed, convergence precision, and stability.
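The opposition-based learning step mentioned above is compact enough to state exactly: for a solution x in [lb, ub], the opposite point is lb + ub - x, and the better of the pair is kept. The bounds and test point below are illustrative.

```python
# Opposition-based learning: evaluate a solution together with its
# bound-reflected opposite and keep whichever scores better.
def opposite(x, lb, ub):
    return [l + u - v for v, l, u in zip(x, lb, ub)]

def keep_better(x, lb, ub, f):
    xo = opposite(x, lb, ub)
    return x if f(x) <= f(xo) else xo

f = lambda x: sum(v * v for v in x)      # stand-in objective (minimize)
lb, ub = [0.0, 0.0], [5.0, 5.0]
x = [4.0, 3.0]
best = keep_better(x, lb, ub, f)         # the opposite [1.0, 2.0] wins here
```

The appeal is that one extra evaluation per individual doubles the chance of landing near the optimum early on, at negligible implementation cost.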
Given the growing concern over global warming and the critical role of carbon dioxide (CO_(2)) in this phenomenon, the study of CO_(2)-induced alterations in coal strength has garnered significant attention due to its implications for carbon sequestration. A large number of experiments have proved that CO_(2) interaction time (T), saturation pressure (P) and other parameters have significant effects on coal strength. However, accurate evaluation of CO_(2)-induced alterations in coal strength remains difficult, so it is particularly important to establish accurate and efficient prediction models. This study explored the application of advanced machine learning (ML) algorithms and Gene Expression Programming (GEP) techniques to predict CO_(2)-induced alterations in coal strength. Six models were developed, including three metaheuristic-optimized XGBoost models (GWO-XGBoost, SSA-XGBoost, PO-XGBoost) and three GEP models (GEP-1, GEP-2, GEP-3). Comprehensive evaluations using multiple metrics revealed that all models demonstrated high predictive accuracy, with the SSA-XGBoost model achieving the best performance (coefficient of determination R^2 = 0.99396, root mean square error RMSE = 0.62102, mean absolute error MAE = 0.36164, mean absolute percentage error MAPE = 4.8101%, and residual predictive deviation RPD = 13.4741). Model interpretability analyses using SHAP (Shapley Additive exPlanations), ICE (Individual Conditional Expectation), and PDP (Partial Dependence Plot) techniques highlighted the dominant role of fixed carbon content (FC) and significant interactions between FC and CO_(2) saturation pressure (P). The results demonstrated that the proposed models effectively address the challenges of CO_(2)-induced strength prediction, providing valuable insights for geological storage safety and environmental applications.
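The metaheuristic-optimized-XGBoost pattern can be sketched as an accept-if-better perturbation search over hyperparameters. The error function below is a toy stand-in: in the paper each candidate would be scored by training an XGBoost model, and SSA, GWO, or PO would supply the search moves.

```python
import random

# Sketch: perturbation search over two hyperparameters (tree depth and
# learning rate), accepting a candidate only if it lowers the toy error.
def toy_cv_error(depth, lr):
    # Pretend cross-validation favors depth 6 and learning rate 0.1.
    return (depth - 6) ** 2 + 100 * (lr - 0.1) ** 2

random.seed(4)
best = (random.randint(2, 10), random.uniform(0.01, 0.5))
init_err = toy_cv_error(*best)
for _ in range(300):
    depth = min(10, max(2, best[0] + random.choice((-1, 0, 1))))
    lr = min(0.5, max(0.01, best[1] + random.gauss(0, 0.02)))
    if toy_cv_error(depth, lr) < toy_cv_error(*best):
        best = (depth, lr)
```

Since each candidate evaluation would be a full model training run in practice, the number of iterations a metaheuristic needs to converge directly determines the tuning cost.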
Data clustering is an essential technique for analyzing complex datasets and continues to be a central research topic in data analysis. Traditional clustering algorithms, such as K-means, are widely used due to their simplicity and efficiency. This paper proposes a novel Spiral Mechanism-Optimized Phasmatodea Population Evolution Algorithm (SPPE) to improve clustering performance. The SPPE algorithm introduces several enhancements to the standard Phasmatodea Population Evolution (PPE) algorithm. Firstly, a Variable Neighborhood Search (VNS) factor is incorporated to strengthen the local search capability and foster population diversity. Secondly, a position update model incorporating a spiral mechanism is designed to improve the algorithm's global exploration and convergence speed. Finally, a dynamic balancing factor, guided by fitness values, adjusts the search process to balance exploration and exploitation effectively. The performance of SPPE is first validated on the CEC2013 benchmark functions, where it demonstrates excellent convergence speed and superior optimization results compared to several state-of-the-art metaheuristic algorithms. To further verify its practical applicability, SPPE is combined with the K-means algorithm for data clustering and tested on seven datasets. Experimental results show that SPPE-K-means improves clustering accuracy, reduces dependency on initialization, and outperforms other clustering approaches. This study highlights SPPE's robustness and efficiency in solving both optimization and clustering challenges, making it a promising tool for complex data analysis tasks.
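A spiral position update of the kind SPPE adds can be sketched with the common logarithmic-spiral move found in several metaheuristics; SPPE's exact formula may differ, and the points below are illustrative. The candidate circles in toward the current best solution rather than stepping straight at it.

```python
import math, random

# Sketch of a logarithmic-spiral position update toward the current best:
# new_d = |best_d - x_d| * exp(b * l) * cos(2*pi*l) + best_d,
# with l drawn uniformly from [-1, 1] and b shaping the spiral.
def spiral_move(x, best, b=1.0):
    l = random.uniform(-1, 1)
    return [abs(bd - xd) * math.exp(b * l) * math.cos(2 * math.pi * l) + bd
            for xd, bd in zip(x, best)]

random.seed(8)
best = [0.0, 0.0]   # current best solution (e.g., cluster-center encoding)
x = [3.0, -4.0]
moved = spiral_move(x, best)
```

Because the cosine term alternates sign along the spiral, the move explores around the best point instead of collapsing onto it, which is what helps K-means-style clustering escape poor initial centers.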
This study examines various issues arising in three-phase unbalanced power distribution networks (PDNs) using a comprehensive optimization approach. With the integration of renewable energy sources, increasing energy demands, and the adoption of smart grid technologies, power systems are undergoing a rapid transformation, making the need for efficient, reliable, and sustainable distribution networks increasingly critical. In this paper, the reconfiguration problem in a 37-bus unbalanced PDN test system is solved using five popular metaheuristic algorithms. Among these advanced search algorithms, the Bonobo Optimizer (BO) has demonstrated superior performance in handling the complexities of unbalanced power distribution network optimization. The study is structured around four distinct scenarios: (I) improving the mean voltage profile and minimizing active power loss, (II) minimizing the Voltage Unbalance Index (VUI) and Current Unbalance Index (CUI), (III) optimizing key reliability indices using both Line Oriented Reliability Index (LORI) and Customer Oriented Reliability Index (CORI) approaches, and (IV) employing multi-objective optimization using the Pareto front technique to simultaneously minimize active power loss, average CUI, and the System Average Interruption Duration Index (SAIDI). The study aims to contribute to the development of more efficient, reliable, and sustainable energy systems by jointly addressing voltage profiles, power losses, imbalance reduction, and reliability enhancement.
Damage identification of offshore floating wind turbines from vibration/dynamic signals is one of the important new research fields in Structural Health Monitoring (SHM). In this paper, a new damage identification method is proposed based on meta-heuristic algorithms using the dynamic response of the TLP (Tension-Leg Platform) floating wind turbine structure. Genetic Algorithms (GA), the Artificial Immune System (AIS), Particle Swarm Optimization (PSO), and the Artificial Bee Colony (ABC) algorithm are chosen for minimizing the objective function, defined appropriately for the damage identification purpose. In addition to studying the capability of the mentioned algorithms in correctly identifying the damage, the effect of the response type on the identification results is studied. The results of the proposed damage identification are also investigated considering possible uncertainties of the structure. Finally, to evaluate the proposed method under real conditions, a 1/100-scale experimental setup of a TLP Floating Wind Turbine (TLPFWT) is built in the laboratory and the proposed damage identification method is applied to the scaled turbine.
Small parasitic hemipteran insects known as bedbugs (Cimicidae) feed on the blood of warm-blooded mammals. The most famous member of this family is Cimex lectularius, the common bedbug. The current paper proposes a novel swarm intelligence optimization algorithm called the Bedbug Meta-Heuristic Algorithm (BMHA). The primary inspiration for the bedbug algorithm comes from the static and dynamic swarming behaviors of bedbugs in nature. The two main stages of optimization algorithms, exploration and exploitation, are designed by modeling the social interaction of bedbugs searching for food. The proposed algorithm is benchmarked qualitatively and quantitatively on many test functions, including CEC2019. The evaluation results prove that BMHA can improve the initial random population for a given optimization problem to converge towards the global optimum and provide highly competitive results compared to other well-known optimization algorithms. The results also demonstrate the new algorithm's performance in solving real optimization problems in unknown search spaces. To this end, the proposed algorithm has been used to select the features of fake news in a semi-supervised manner, and the results show its good performance in solving such problems.
Funding: This work was partially supported by the Guangdong Basic and Applied Basic Research Foundation (2023A1515011531), the National Natural Science Foundation of China under Grant 62173356, the Science and Technology Development Fund (FDCT), Macao SAR, under Grant 0019/2021/A, the Zhuhai Industry-University-Research Project with Hongkong and Macao under Grant ZH22017002210014PWC, and the Key Technologies for Scheduling and Optimization of Complex Distributed Manufacturing Systems (22JR10KA007).
Abstract: The flow shop scheduling problem is important for the manufacturing industry, and effective flow shop scheduling can bring great benefits. However, there is little research on Distributed Hybrid Flow Shop Problems (DHFSP) using learning-assisted meta-heuristics. This work addresses a DHFSP with the objective of minimizing the maximum completion time (makespan). First, a mathematical model is developed for the concerned DHFSP. Second, four Q-learning-assisted meta-heuristics, i.e., the genetic algorithm (GA), artificial bee colony algorithm (ABC), particle swarm optimization (PSO), and differential evolution (DE), are proposed. According to the nature of the DHFSP, six local search operations are designed for finding high-quality solutions in the local space. Instead of random selection, Q-learning assists the meta-heuristics in choosing appropriate local search operations during iterations. Finally, comprehensive numerical experiments on 60 cases are conducted to assess the effectiveness of the proposed algorithms. The experimental results and discussions show that using Q-learning to select appropriate local search operations is more effective than the random strategy. To verify the competitiveness of the Q-learning-assisted meta-heuristics, they are compared with the improved iterated greedy algorithm (IIG), which also targets the DHFSP. The Friedman test is executed on the results of the five algorithms. It is concluded that all four Q-learning-assisted meta-heuristics outperform IIG, and the Q-learning-assisted PSO shows the best competitiveness.
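The operator-selection idea in the abstract above can be sketched with a tiny single-state Q-learning loop. This is a hedged illustration, not the paper's design: the six-operator count matches the abstract, but the reward scheme (+1 when the chosen operator improves the schedule), the epsilon-greedy policy, and all coefficients are assumptions.

```python
import random

# Sketch: tabular, single-state Q-learning that picks one of six local-search
# operators each iteration. Reward scheme and coefficients are illustrative
# assumptions, not the paper's exact design.

class OperatorSelector:
    def __init__(self, n_ops, alpha=0.1, gamma=0.9, eps=0.3):
        self.q = [0.0] * n_ops                      # one Q-value per operator
        self.alpha, self.gamma, self.eps = alpha, gamma, eps

    def choose(self):
        if random.random() < self.eps:              # explore
            return random.randrange(len(self.q))
        return max(range(len(self.q)), key=self.q.__getitem__)  # exploit

    def update(self, op, reward):
        # single-state update: Q(op) += alpha * (r + gamma * max_a Q(a) - Q(op))
        self.q[op] += self.alpha * (reward + self.gamma * max(self.q) - self.q[op])

random.seed(0)
sel = OperatorSelector(n_ops=6)
# toy environment: pretend operator 2 reliably improves the makespan
for _ in range(400):
    op = sel.choose()
    sel.update(op, reward=1.0 if op == 2 else 0.0)
print(sel.q)
```

After training, the Q-value of the rewarding operator dominates, so exploitation keeps choosing it; in the real algorithm the reward would come from comparing solution quality before and after the local search.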
Abstract: Cloud computing infrastructure has been evolving as a cost-effective platform for providing computational resources in the form of high-performance computing as a service (HPCaaS) to users for executing HPC applications. However, the broader use of Cloud services and the rapid increase in the size and capacity of Cloud data centers bring a remarkable rise in energy consumption, leading to a significant rise in system provider expenses and in carbon emissions to the environment. Besides this, users have become more demanding in terms of Quality-of-Service (QoS) expectations regarding execution time, budget cost, utilization, and makespan. This situation calls for the design of a task scheduling policy that ensures efficient task sequencing and allocation of computing resources to tasks, to meet the trade-off between QoS promises and service provider requirements. Moreover, task scheduling in the Cloud is a well-known NP-hard problem. Motivated by these concerns, this paper introduces and implements a QoS-aware, energy-efficient scheduling policy called CSPSO for scheduling tasks in Cloud systems to reduce the energy consumption of cloud resources and minimize the makespan of the workload. The proposed multi-objective CSPSO policy hybridizes the search qualities of two robust meta-heuristics, viz. cuckoo search (CS) and particle swarm optimization (PSO), to overcome the slow convergence and lack of diversity of the standard CS algorithm. A fitness-aware resource allocation (FARA) heuristic was developed and used by the proposed policy to allocate resources to tasks efficiently. A velocity update mechanism for cuckoo individuals is designed and incorporated in the proposed CSPSO policy. Further, the proposed scheduling policy has been implemented in the CloudSim simulator and tested with real supercomputing workload traces. The comparative analysis validated that the proposed scheduling policy produces efficient schedules with better performance than other well-known heuristic and meta-heuristic scheduling policies.
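As a small illustration of the velocity-update idea the abstract mentions for cuckoo individuals, the sketch below applies a conventional PSO-style update toward personal and global bests. The coefficients w, c1, c2 and the injectable random source are illustrative assumptions, not CSPSO's actual settings.

```python
import random

# Sketch: PSO-style velocity update applied to a cuckoo individual, pulling it
# toward its personal best and the global best. Coefficients are generic PSO
# defaults, assumed for illustration.

def update_cuckoo(pos, vel, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=random):
    new_pos, new_vel = [], []
    for x, v, pb, gb in zip(pos, vel, pbest, gbest):
        v = w * v + c1 * rng.random() * (pb - x) + c2 * rng.random() * (gb - x)
        new_vel.append(v)
        new_pos.append(x + v)
    return new_pos, new_vel

class _FixedRng:
    """Deterministic stand-in for the random source, for a reproducible step."""
    def random(self):
        return 0.5

# one step from x = 5 with both bests at the origin
step_pos, step_vel = update_cuckoo([5.0], [0.0], [0.0], [0.0], rng=_FixedRng())
print(step_pos, step_vel)   # prints [-2.5] [-7.5]
```

With both bests at the origin, the velocity term pulls the individual toward it; replacing pure Lévy flights with such a memory-driven step is one plausible reading of the hybridization the abstract describes.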
Abstract: One of the most common kinds of cancer is breast cancer, and its early detection may help lower overall mortality rates. In this paper, we propose a novel approach for detecting and classifying breast cancer regions in thermal images. The proposed approach starts with preprocessing the input images and segmenting the significant regions of interest. In addition, to properly train the machine learning models, data augmentation is applied to increase the number of segmented regions using various scaling ratios. To extract the relevant features from the breast cancer cases, a set of deep neural networks (VGGNet, ResNet-50, AlexNet, and GoogLeNet) is employed. The resulting set of features is processed using the binary dipper throated algorithm to select the most effective features that can realize high classification accuracy. The selected features are used to train a neural network that finally classifies the thermal images of breast cancer. To achieve accurate classification, the parameters of the employed neural network are optimized using the continuous dipper throated optimization algorithm. Experimental results show the effectiveness of the proposed approach in classifying breast cancer cases compared to other recent approaches in the literature. Moreover, several experiments were conducted to compare the performance of the proposed approach with the other approaches, and their results emphasized its superiority.
Abstract: Cloud computing has emerged as a new style of computing in distributed environments. Efficient and dependable workflow scheduling is crucial for achieving high performance and integrating with enterprise systems. As an effective security-services aggregation methodology, Trust Workflow Technology (TWT) has been used to construct composite services. However, in the cloud environment, the existing closed network services are maintained and operated by third-party organizations or enterprises. Therefore, service-oriented trust strategies must be considered in workflow scheduling. TWFS-related algorithms consist of trust policies and strategies to overcome the threats to the application with heuristic workflow scheduling. As the main contribution of this work, trust-based meta-heuristic workflow scheduling (TMWS) is proposed. The TMWS algorithm improves the efficiency and reliability of operation in the cloud system, and the results show that the TMWS approach is effective and feasible.
Funding: supported by the Center for Mining, Electro-Mechanical Research of Hanoi University of Mining and Geology (HUMG), Hanoi, Vietnam; financially supported by the Hunan Provincial Department of Education General Project (19C1744); the Hunan Province Science Foundation for Youth Scholars of China (2018JJ3510); and the Innovation-Driven Project of Central South University (2020CX040).
Abstract: Blasting is well known as an effective method for fragmenting or moving rock in open-pit mines. To evaluate the quality of blasting, the size distribution of rock is used as a critical criterion in blasting operations. A high percentage of oversized rocks generated by blasting operations can lead to economic and environmental damage. Therefore, this study proposed four novel intelligent models to predict the size of rock distribution in mine blasting in order to optimize blasting parameters, as well as the efficiency of blasting operations in open mines. Accordingly, a nature-inspired algorithm (the firefly algorithm, FFA) and different machine learning algorithms (gradient boosting machine (GBM), support vector machine (SVM), Gaussian process (GP), and artificial neural network (ANN)) were combined for this aim, abbreviated as FFA-GBM, FFA-SVM, FFA-GP, and FFA-ANN, respectively. Subsequently, the predicted results from the abovementioned models were compared with each other using three statistical indicators (mean absolute error, root-mean-squared error, and correlation coefficient) and the color intensity method. For developing and simulating the size of rock in blasting operations, 136 blasting events with their images were collected and analyzed by the Split-Desktop software. Of these, 111 events were randomly selected for the development and optimization of the models, and the remaining 25 blasting events were used to confirm the accuracy of the proposed models. Herein, blast design parameters were regarded as input variables to predict the size of rock in blasting operations. Finally, the obtained results revealed that the FFA is a robust optimization algorithm for estimating rock fragmentation in bench blasting. Among the models developed in this study, FFA-GBM provided the highest accuracy in predicting the size of fragmented rocks, while the other techniques (FFA-SVM, FFA-GP, and FFA-ANN) yielded lower computational stability and efficiency. Hence, the FFA-GBM model can be used as a powerful and precise soft computing tool for practical engineering cases aiming to improve the quality of blasting and rock fragmentation.
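The firefly move at the heart of FFA-based hybrids such as these can be sketched in a few lines; the attractiveness constants below (beta0, gamma, alpha) are generic textbook defaults, not the study's settings.

```python
import math
import random

# Sketch: a dimmer firefly at xi drifts toward a brighter one at xj, with
# attractiveness decaying in squared distance, plus an optional random step.

def firefly_step(xi, xj, beta0=1.0, gamma=1.0, alpha=0.0, rng=random):
    r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
    beta = beta0 * math.exp(-gamma * r2)            # attractiveness beta(r)
    return [a + beta * (b - a) + alpha * (rng.random() - 0.5)
            for a, b in zip(xi, xj)]

moved = firefly_step([0.0, 0.0], [1.0, 1.0])        # drawn toward the brighter firefly
print(moved)
```

In FFA-tuned models, the "brightness" would be the fitness of a hyperparameter vector (e.g. of the GBM), so the swarm concentrates around well-performing configurations.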
Funding: supported by the National Natural Science Foundation of China under Grant 62173356; the Science and Technology Development Fund (FDCT), Macao, China, under Grant 0019/2021/A; the Zhuhai Industry-University-Research Project with Hong Kong and Macao under Grant ZH22017002210014PWC; the Guangdong Basic and Applied Basic Research Foundation (2023A1515011531); and research on the Key Technologies for Scheduling and Optimization of Complex Distributed Manufacturing Systems (22JR10KA007).
Abstract: With the increasing demand for surgeries in hospitals, surgery scheduling problems have attracted extensive attention. This study focuses on solving a surgery scheduling problem with setup times. First, a mathematical model is created to minimize the maximum completion time (makespan) of all surgeries and the patient waiting time simultaneously. The time added by the fatigue effect, which is caused by doctors' long working hours, is included in the surgery time. Second, four meta-heuristics are improved to address the relevant problems. Three novel strategies are designed to improve the quality of the initial solutions, and to improve the convergence of the algorithms, seven local search operators are proposed based on the characteristics of surgery scheduling problems. Third, Q-learning is used to dynamically choose the optimal local search operator for the current state in each iteration. Finally, by comparing the experimental results on 30 instances, the effectiveness of the Q-learning-based local search strategy is verified. Among all the compared algorithms, the improved artificial bee colony (ABC) with Q-learning-based local search is the most competitive.
Funding: funded by the National Science Fund for Distinguished Young Scholars (Grant No. 42225206); the National Key R&D Program of China (Grant No. 2020YFC1807200); and the National Natural Science Foundation of China (Grant No. 42072299).
Abstract: Conventional empirical equations for estimating undrained shear strength (s_u) from piezocone penetration test (CPTu) data, without incorporating soil physical properties, often lack the accuracy and robustness required for geotechnical site investigations. This study introduces a hybrid virus colony search (VCS) algorithm that integrates the standard VCS algorithm with a mutation-based search mechanism to develop high-performance XGBoost learning models to address this limitation. A dataset of 372 seismic CPTu records and corresponding soil physical properties from 26 geotechnical projects in Jiangsu Province, China, was collected for model development. Comparative evaluations demonstrate that the proposed hybrid VCS-XGBoost model exhibits superior performance compared to XGBoost models based on standard meta-heuristic algorithms. The results highlight that considering soil physical properties significantly improves the predictive accuracy of s_u, emphasizing the importance of considering additional soil information beyond CPTu data for accurate s_u estimation.
Abstract: The non-invasive evaluation of the heart through ElectroCardioGraphy (ECG) has played a key role in detecting heart disease. The analysis of ECG signals requires years of learning and experience to interpret and extract useful information from them. Thus, a computerized system is needed to classify ECG signals effectively and with more accurate results. Abnormal heart rhythms, called arrhythmias, can cause sudden cardiac death. In this work, a Computerized Abnormal Heart Rhythms Detection (CAHRD) system is developed using ECG signals. It consists of four stages: preprocessing, feature extraction, feature optimization, and classification. First, the Pan-Tompkins algorithm is employed to detect the envelope of the Q, R and S waves in the preprocessing stage; it uses a recursive filter to eliminate muscle noise, T-wave interference, and baseline wander. As the analysis of the ECG signal in the spatial domain does not provide a complete description of the signal, the feature extraction stage uses frequency contents obtained from multiple wavelet filters (bi-orthogonal, Symlet, and Daubechies) at different resolution levels. Then, Black Widow Optimization (BWO) is applied to optimize the hybrid wavelet features in the feature optimization stage. Finally, a kernel-based Support Vector Machine (SVM) is employed to classify heartbeats into five classes; in the SVM, Radial Basis Function (RBF), polynomial, and linear kernels are used. A total of ~15000 ECG signals were obtained from the Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database for performance evaluation of the proposed CAHRD system. Results show that the proposed CAHRD system is a powerful tool for ECG analysis: it correctly classifies five classes of heartbeats with 99.91% accuracy using an RBF kernel with 2nd-level wavelet coefficients. The CAHRD system achieves an improvement of ~6% over random projections with the ensemble SVM approach and ~2% over morphological and ECG-segment-based features with the RBF classifier.
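Since the classifier above hinges on an RBF-kernel SVM, here is the kernel itself as a minimal sketch: it maps closer feature vectors (e.g. wavelet coefficients) to similarity values nearer 1. The gamma value is arbitrary, not the paper's tuned setting.

```python
import math

# Sketch: the Gaussian (RBF) kernel K(x, y) = exp(-gamma * ||x - y||^2).

def rbf_kernel(x, y, gamma=0.5):
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * d2)

same = rbf_kernel([1.0, 2.0], [1.0, 2.0])   # identical vectors -> similarity 1.0
far = rbf_kernel([1.0, 2.0], [4.0, 6.0])    # distant vectors -> similarity near 0
print(same, far)
```

An SVM with this kernel separates heartbeat classes by similarity in the wavelet-feature space rather than by a linear boundary, which is why it can outperform the linear and polynomial kernels reported above.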
Funding: supported by the National Natural Science Foundation of China [grant number 51775020]; the Science Challenge Project [grant number TZ2018007]; the National Natural Science Foundation of China [grant number 62073009]; the Postdoctoral Fellowship Program of CPSF [grant number GZC20233365]; and the Fundamental Research Funds for Central Universities [grant number JKF-20240559].
Abstract: Parameter extraction of photovoltaic (PV) models is crucial for the planning, optimization, and control of PV systems. Although some methods using meta-heuristic algorithms have been proposed to determine these parameters, the robustness of the solutions obtained by these methods faces great challenges as the complexity of the PV model increases, and unstable results will affect reliable operation and maintenance strategies of PV systems. In response to this challenge, an improved rime optimization algorithm with enhanced exploration and exploitation, termed TERIME, is proposed for robust and accurate parameter identification of various PV models. Specifically, the differential evolution mutation operator is integrated in the exploration phase to enhance population diversity. Meanwhile, a new exploitation strategy incorporating randomization and neighborhood strategies simultaneously is developed to maintain the balance between exploitation width and depth. The TERIME algorithm is applied to estimate the optimal parameters of the single diode model, double diode model, and triple diode model combined with the Lambert W function for three PV cell and module types: RTC France, Photo Watt-PWP 201, and S75. According to the statistical analysis over 100 runs, the proposed algorithm achieves more accurate and robust parameter estimates than other techniques for various PV models in varying environmental conditions. All of our source code is publicly available at https://github.com/dirge1/TERIME.
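The differential-evolution mutation that TERIME reportedly folds into its exploration phase is commonly the classic "rand/1" operator; the sketch below assumes that variant and a generic scale factor F, since the abstract does not specify them.

```python
import random

# Sketch: DE "rand/1" mutation -- combine three distinct individuals (none
# equal to the target) into a mutant vector a + F * (b - c).

def de_rand_1(population, target_idx, F=0.5, rng=random):
    candidates = [i for i in range(len(population)) if i != target_idx]
    r1, r2, r3 = rng.sample(candidates, 3)
    a, b, c = population[r1], population[r2], population[r3]
    return [ai + F * (bi - ci) for ai, bi, ci in zip(a, b, c)]

class _FirstK:
    """Deterministic stand-in for random: always 'samples' the first k items."""
    def sample(self, seq, k):
        return list(seq[:k])

pop = [[1.0, 2.0], [3.0, 4.0], [0.5, -1.0], [2.0, 0.0]]
mutant = de_rand_1(pop, target_idx=0, rng=_FirstK())
print(mutant)   # a=pop[1], b=pop[2], c=pop[3] -> [2.25, 3.5]
```

Because the difference vector b - c shrinks as the population converges, this operator injects diversity early on while perturbing less near convergence, which is the exploration benefit the abstract attributes to it.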
Funding: supported by Prince Sattam bin Abdulaziz University through project number 2024/RV/06.
Abstract: With the rapid advancement of technology and science, optimization theory and algorithms have become increasingly important. A wide range of real-world problems can be classified as optimization challenges, and meta-heuristic algorithms have shown remarkable effectiveness in solving them across diverse domains, such as machine learning, process control, and engineering design, showcasing their capability to address complex optimization problems. The Stochastic Fractal Search (SFS) algorithm is one of the most popular meta-heuristic optimization methods, inspired by the fractal growth patterns of natural materials. Since its introduction by Hamid Salimi in 2015, SFS has garnered significant attention from researchers and has been applied to diverse optimization problems across multiple disciplines. Its popularity can be attributed to several factors, including its simplicity, practical computational efficiency, ease of implementation, rapid convergence, high effectiveness, and ability to address single- and multi-objective optimization problems, often outperforming other established algorithms. This review paper offers a comprehensive and detailed analysis of the SFS algorithm, covering its standard version, modifications, hybridizations, and multi-objective implementations. The paper also examines several SFS applications across diverse domains, including power and energy systems, image processing, machine learning, wireless sensor networks, environmental modeling, economics and finance, and numerous engineering challenges. Furthermore, the paper critically evaluates the SFS algorithm's performance, benchmarking its effectiveness against recently published meta-heuristic algorithms. In conclusion, the review highlights key findings and suggests potential directions for future developments and modifications of the SFS algorithm.
Funding: supported by General Scientific Research Funding of the Science and Technology Development Fund (FDCT) in Macao (No. 0150/2022/A) and the Faculty Research Grants of Macao University of Science and Technology (No. FRG-22-074-FIE).
Abstract: With the rapid development of the economy, air pollution caused by industrial expansion has caused serious harm to human health and social development. Establishing an effective air pollution concentration prediction system is therefore of great scientific and practical significance for accurate and reliable predictions. This paper proposes a combined point-interval prediction system for pollutant concentration prediction by leveraging neural networks, a meta-heuristic optimization algorithm, and fuzzy theory. Fuzzy information granulation technology is used in data preprocessing to transform numerical sequences into fuzzy granules for comprehensive feature extraction. The Golden Jackal Optimization algorithm is employed in the optimization stage to fine-tune model hyperparameters. In the prediction stage, an ensemble learning method combines the training results from multiple models to obtain final point predictions, while quantile regression and kernel density estimation methods provide interval predictions on the test set. Experimental results demonstrate that the combined model achieves a high goodness of fit, with a coefficient of determination (R²) of 99.3%, and a maximum difference in mean absolute percentage error (MAPE) from the benchmark models of 12.6%. This suggests that the integrated learning system proposed in this paper can provide more accurate deterministic predictions as well as reliable uncertainty analysis compared to traditional models, offering a practical reference for air quality early warning.
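The interval-prediction step can be illustrated with a much simpler stand-in for the paper's quantile regression: take empirical quantiles of held-out residuals and shift the point forecast by them. The nearest-rank quantile rule and the 5%/95% band below are assumptions made for the sketch.

```python
# Sketch: residual-quantile prediction intervals around point forecasts.

def empirical_quantile(xs, q):
    s = sorted(xs)
    k = min(len(s) - 1, max(0, int(q * len(s))))    # simple nearest-rank rule
    return s[k]

def prediction_interval(point_preds, residuals, lo=0.05, hi=0.95):
    ql = empirical_quantile(residuals, lo)
    qh = empirical_quantile(residuals, hi)
    return [(p + ql, p + qh) for p in point_preds]

# toy held-out residuals (observed minus predicted) and two point forecasts
residuals = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0, 3.0, -3.0, 1.5]
bands = prediction_interval([10.0, 20.0], residuals)
print(bands)    # [(7.0, 13.0), (17.0, 23.0)]
```

A kernel density estimate over the same residuals, as the abstract mentions, would smooth these empirical quantiles instead of reading them off the sorted sample directly.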
Funding: supported by the National Natural Science Foundation of China (Project Nos. 52172321 and 52102391); the Sichuan Province Science and Technology Innovation Talent Project (2024JDRC0020); the China Shenhua Energy Company Limited Technology Project (GJNY-22-7/2300-K1220053); and Key Science and Technology Projects in the Transportation Industry of the Ministry of Transport (2022-ZD7-132).
Abstract: Meta-heuristic evolutionary algorithms have become widely used for solving complex optimization problems. However, their effectiveness in real-world applications is often limited by the need for many evaluations, which can be both costly and time-consuming. This is especially true for large-scale transportation networks, where the size of the problem and the high computational cost can hinder an algorithm's performance. To address these challenges, recent research has focused on surrogate-assisted models, which aim to reduce the number of expensive evaluations and improve the efficiency of solving time-consuming optimization problems. This paper presents a new two-layer Surrogate-Assisted Fish Migration Optimization (SA-FMO) algorithm designed to tackle high-dimensional and computationally heavy problems. The global surrogate model offers a good approximation of the entire problem space, while the local surrogate model focuses on refining the solution near the current best option, improving local optimization. To test the effectiveness of the SA-FMO algorithm, we first conduct experiments using six benchmark functions in a 50-dimensional space. We then apply the algorithm to optimize urban rail transit routes, focusing on the Train Routing Optimization problem, which aims to improve operational efficiency and vehicle turnover under uneven passenger flow during transit disruptions. The results show that SA-FMO can effectively improve optimization outcomes in complex transportation scenarios.
Funding: supported by the Project of Scientific and Technological Innovation Development of Jilin in China under Grant 20210103090.
Abstract: Cyclic-system-based optimization (CSBO) is an innovative meta-heuristic algorithm (MHA) that draws inspiration from the workings of the human blood circulatory system. However, CSBO still faces challenges in solving complex optimization problems, including limited convergence speed and a propensity to get trapped in local optima. To further improve the performance of CSBO, this paper proposes improved cyclic-system-based optimization (ICSBO). First, in venous blood circulation, an adaptive parameter that changes with evolution is introduced to improve the balance between convergence and diversity in this stage and to enhance exploration of the search space. Second, the simplex method strategy is incorporated into the systemic and pulmonary circulations, improving the update formulas: a learning strategy aimed at the optimal individual, combined with a straightforward opposition-based learning approach, is employed to enhance population convergence while preserving diversity. Finally, a novel external archive with a diversity supplementation mechanism is introduced to enhance population diversity, maximize the use of superior genes, and lower the risk of the population becoming trapped in local optima. Testing on the CEC2017 benchmark set shows that, compared with the original CSBO and eight other outstanding MHAs, ICSBO demonstrates remarkable advantages in convergence speed, convergence precision, and stability.
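The "straightforward opposition-based learning approach" mentioned in this abstract has a standard one-line core: mirror each candidate within its bounds and keep the better of the pair. The sphere objective and the bounds below are toy assumptions for the sketch.

```python
# Sketch: opposition-based learning -- mirror x in [lb, ub] to lb + ub - x and
# keep whichever of the pair has the better (lower) objective value.

def opposition(x, lb, ub):
    return [l + u - xi for xi, l, u in zip(x, lb, ub)]

def keep_better(x, lb, ub, f):
    xo = opposition(x, lb, ub)
    return x if f(x) <= f(xo) else xo

sphere = lambda v: sum(t * t for t in v)            # toy objective, optimum at 0
best = keep_better([4.0, -4.5], lb=[0.0, -5.0], ub=[5.0, 5.0], f=sphere)
print(best)     # the mirrored point [1.0, 4.5] scores better and is kept
```

Evaluating a candidate and its mirror doubles the chance of landing near the optimum early in a run, which is why OBL is a popular diversity booster in MHA variants like ICSBO.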
Funding: partially supported by the National Natural Science Foundation of China (42177164, 52474121) and the Outstanding Youth Project of the Hunan Provincial Department of Education (23B0008).
Abstract: Given the growing concern over global warming and the critical role of carbon dioxide (CO2) in this phenomenon, the study of CO2-induced alterations in coal strength has garnered significant attention due to its implications for carbon sequestration. A large number of experiments have proved that CO2 interaction time (T), saturation pressure (P), and other parameters have significant effects on coal strength. However, accurate evaluation of CO2-induced alterations in coal strength remains difficult, so establishing accurate and efficient prediction models is particularly important. This study explored the application of advanced machine learning (ML) algorithms and Gene Expression Programming (GEP) techniques to predict CO2-induced alterations in coal strength. Six models were developed, including three metaheuristic-optimized XGBoost models (GWO-XGBoost, SSA-XGBoost, PO-XGBoost) and three GEP models (GEP-1, GEP-2, GEP-3). Comprehensive evaluations using multiple metrics revealed that all models demonstrated high predictive accuracy, with the SSA-XGBoost model achieving the best performance (coefficient of determination R² = 0.99396, root mean square error RMSE = 0.62102, mean absolute error MAE = 0.36164, mean absolute percentage error MAPE = 4.8101%, and residual predictive deviation RPD = 13.4741). Model interpretability analyses using SHAP (Shapley Additive exPlanations), ICE (Individual Conditional Expectation), and PDP (Partial Dependence Plot) techniques highlighted the dominant role of fixed carbon content (FC) and significant interactions between FC and CO2 saturation pressure (P). The results demonstrated that the proposed models effectively address the challenges of predicting CO2-induced strength alterations, providing valuable insights for geological storage safety and environmental applications.
Abstract: Data clustering is an essential technique for analyzing complex datasets and continues to be a central research topic in data analysis. Traditional clustering algorithms, such as K-means, are widely used due to their simplicity and efficiency. This paper proposes a novel Spiral Mechanism-Optimized Phasmatodea Population Evolution Algorithm (SPPE) to improve clustering performance. The SPPE algorithm introduces several enhancements to the standard Phasmatodea Population Evolution (PPE) algorithm. Firstly, a Variable Neighborhood Search (VNS) factor is incorporated to strengthen the local search capability and foster population diversity. Secondly, a position update model, incorporating a spiral mechanism, is designed to improve the algorithm's global exploration and convergence speed. Finally, a dynamic balancing factor, guided by fitness values, adjusts the search process to balance exploration and exploitation effectively. The performance of SPPE is first validated on the CEC2013 benchmark functions, where it demonstrates excellent convergence speed and superior optimization results compared to several state-of-the-art metaheuristic algorithms. To further verify its practical applicability, SPPE is combined with the K-means algorithm for data clustering and tested on seven datasets. Experimental results show that SPPE-K-means improves clustering accuracy, reduces dependency on initialization, and outperforms other clustering approaches. This study highlights SPPE's robustness and efficiency in solving both optimization and clustering challenges, making it a promising tool for complex data analysis tasks.
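When a metaheuristic such as SPPE is paired with K-means, a common design is to let each individual encode a set of centroids and score it by the within-cluster sum of squared distances. That coupling, sketched below with toy 1-D data, is an assumption about the general approach rather than the paper's exact encoding.

```python
# Sketch: fitness and assignment for metaheuristic-driven clustering, where a
# candidate solution encodes centroid positions.

def wcss(points, centroids):
    # within-cluster sum of squares: the fitness a metaheuristic minimizes
    return sum(min((p - c) ** 2 for c in centroids) for p in points)

def assign(points, centroids):
    return [min(range(len(centroids)), key=lambda i: (p - centroids[i]) ** 2)
            for p in points]

data = [1.0, 1.2, 0.8, 9.0, 9.5, 8.5]
good = [1.0, 9.0]      # centroids near the two natural clusters
bad = [4.0, 5.0]       # centroids dropped between the clusters
print(wcss(data, good), wcss(data, bad))    # the good encoding scores far lower
print(assign(data, good))                   # [0, 0, 0, 1, 1, 1]
```

Because the metaheuristic searches over centroid encodings globally, it can escape the poor local minima that plain K-means reaches from an unlucky initialization, which matches the reduced initialization dependency reported above.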
Funding: supported by the Scientific and Technological Research Council of Turkey (TUBITAK) under Grant No. 124E002 (1001 Project).
Abstract: This study examines various issues arising in three-phase unbalanced power distribution networks (PDNs) using a comprehensive optimization approach. With the integration of renewable energy sources, increasing energy demands, and the adoption of smart grid technologies, power systems are undergoing a rapid transformation, making the need for efficient, reliable, and sustainable distribution networks increasingly critical. In this paper, the reconfiguration problem in a 37-bus unbalanced PDN test system is solved using five popular metaheuristic algorithms. Among these advanced search algorithms, the Bonobo Optimizer (BO) demonstrates superior performance in handling the complexities of unbalanced power distribution network optimization. The study is structured around four distinct scenarios: (Ⅰ) improving the mean voltage profile and minimizing active power loss; (Ⅱ) minimizing the Voltage Unbalance Index (VUI) and Current Unbalance Index (CUI); (Ⅲ) optimizing key reliability indices using both the Line-Oriented Reliability Index (LORI) and Customer-Oriented Reliability Index (CORI) approaches; and (Ⅳ) employing multi-objective optimization via the Pareto front technique to simultaneously minimize active power loss, average CUI, and the System Average Interruption Duration Index (SAIDI). The study aims to contribute to the development of more efficient, reliable, and sustainable energy systems by jointly addressing voltage profiles, power losses, unbalance reduction, and reliability enhancement.