The surface correction to the quadrupole source term of the Ffowcs Williams and Hawkings integral in the frequency domain suffers from the computation of high-order derivatives of Green's function. The far-field approximations to these derivatives have been used without derivation or verification in previous work. In this work, we provide detailed derivations of the far-field approximations to the derivatives of Green's function. Binomial expansions for the derivatives of Green's function and the far-field condition are employed during the derivations to circumvent the difficulties of computing the high-order derivatives. The approximations to the derivatives of Green's function are systematically verified using two benchmarks: the two-dimensional convecting vortex and the co-rotating vortex pair. In addition, we provide derivations of the approximations to the multiple integrals of Green's function by using the far-field approximations to the derivatives.
In response to increasing global energy demand and environmental pollution, microgrids have emerged as an innovative solution by integrating distributed energy resources (DERs), energy storage systems, and loads to improve energy efficiency and reliability. This study proposes a novel hybrid optimization algorithm, DE-HHO, combining differential evolution (DE) and Harris Hawks optimization (HHO) to address microgrid scheduling issues. The proposed method adopts a multi-objective optimization framework that simultaneously minimizes operational costs and environmental impacts. The DE-HHO algorithm demonstrates significant advantages in convergence speed and global search capability through the analysis of wind, solar, micro-gas turbine, and battery models. Comprehensive simulation tests show that DE-HHO converges rapidly within 10 iterations and achieves a 4.5% reduction in total cost compared to PSO and a 5.4% reduction compared to HHO. Specifically, DE-HHO attains an optimal total cost of $20,221.37, outperforming PSO ($21,184.45) and HHO ($21,372.24). The maximum cost obtained by DE-HHO is $23,420.55, with a mean of $21,615.77, indicating stability and cost-control capability. These results highlight the effectiveness of DE-HHO in reducing operational costs and enhancing system stability for efficient and sustainable microgrid operation.
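The abstract above does not specify how DE and HHO are coupled, so the sketch below shows only the DE half: a standard DE/rand/1/bin trial-vector construction of the kind such hybrids typically use for global exploration while HHO's besiege/escape rules handle local refinement. All names and constants here are illustrative, not taken from the paper.

```python
import random

def de_mutation_crossover(pop, i, f=0.5, cr=0.9):
    """DE/rand/1/bin: build a trial vector for individual i.

    f is the differential weight, cr the crossover rate; both are common
    textbook defaults, not the paper's settings.
    """
    dim = len(pop[i])
    # pick three distinct partners, all different from i
    a, b, c = random.sample([k for k in range(len(pop)) if k != i], 3)
    j_rand = random.randrange(dim)  # guarantee at least one mutated gene
    trial = []
    for j in range(dim):
        if random.random() < cr or j == j_rand:
            trial.append(pop[a][j] + f * (pop[b][j] - pop[c][j]))
        else:
            trial.append(pop[i][j])
    return trial

random.seed(0)
pop = [[random.uniform(-5, 5) for _ in range(4)] for _ in range(10)]
trial = de_mutation_crossover(pop, 0)
```

In a greedy DE step the trial vector replaces `pop[i]` only if it scores better on the objective; a hybrid can feed the survivors into HHO's position-update phase.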
Virtualization is an indispensable part of the cloud for deploying different virtual servers over the same physical layer. However, the increase in the number of applications executing on the repositories results in increased overload due to the adoption of cloud services. Moreover, migrating applications on the cloud with optimized resource allocation is a herculean task, even though it is employed to minimize the dilemma of allocating resources. In this paper, a Fire Hawk Optimization enabled Deep Learning Scheme (FHOEDLS) is proposed for minimizing overload and optimizing resource allocation on a hybrid cloud container architecture for migrating interoperability-based applications. FHOEDLS achieves load prediction through a deep CNN-GRU-AM model to attain resource allocation and better migration of applications. It specifically adopts the Fire Hawk Optimization Algorithm (FHOA) to optimize the parameters that aid in better interoperable application migration with improved resource allocation and minimized overhead. The factors of resource capacity, transmission cost, demand, and predicted load are taken into account in formulating the objective function used for resource allocation and application migration. The cloud simulation of FHOEDLS uses containers, Virtual Machines (VMs), and Physical Machines (PMs). The results of the proposed FHOEDLS confirmed a better resource capability of 0.418 and a minimized load of 0.0061.
In this work, we apply tunneling formalism to analyze charged particles tunneling across a hairy black hole horizon. Such black hole solutions are essential for frameworks based on Horndeski's gravity theory. Applying a semi-classical technique, we examine the tunneling of charged particles from a hairy black hole and derive the generic tunneling spectrum of released particles, ignoring self-gravitational and interaction effects; the back-reaction of the radiated particle on the hairy black hole is also neglected. We analyze the properties of the black hole, such as temperature and entropy, under the influence of quantum gravity, and observe that a first-order correction is present. We study tunneling radiation produced by a charged field equation in the presence of a generalized uncertainty effect. We modify the semi-classical technique by using the generalized uncertainty principle, the WKB approximation, and surface gravity.
Wind power forecasting plays a crucial role in optimizing the integration of wind energy into the grid by predicting wind patterns and energy output. This enhances the efficiency and reliability of renewable energy systems. Forecasting approaches inform energy management strategies, reduce reliance on fossil fuels, and support the broader transition to sustainable energy solutions. The primary goal of this study is to introduce an effective methodology for estimating wind power through temporal data analysis. This research advances an optimized Multilayer Perceptron (MLP) model using recently proposed metaheuristic optimization algorithms, namely the Fire Hawk Optimizer (FHO) and the Non-Monopolize Search (NO). A modified version of FHO, termed FHONO, is developed by integrating NO as a local search mechanism to enhance the exploration capability and address the shortcomings of the original FHO. The developed FHONO is then employed to optimize the MLP for enhanced wind power prediction. The effectiveness of the proposed FHONO-MLP model is validated using renowned datasets from wind turbines in France. A comparative analysis between FHONO-MLP, conventional MLP, and other optimized versions of MLP shows that FHONO-MLP outperforms the others, achieving an average Root Mean Square Error (RMSE) of 0.105, Mean Absolute Error (MAE) of 0.082, and Coefficient of Determination (R²) of 0.967 across all datasets. These findings underscore the significant enhancement in predictive accuracy provided by FHONO and demonstrate its effectiveness in improving wind power forecasting.
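The three metrics reported above (RMSE, MAE, R²) have standard definitions, sketched below on toy data; nothing here reproduces the paper's datasets or results.

```python
import math

def regression_metrics(y_true, y_pred):
    """Compute RMSE, MAE, and the coefficient of determination R^2.

    RMSE = sqrt(mean squared error), MAE = mean absolute error,
    R^2  = 1 - SS_res / SS_tot. Data below is illustrative only.
    """
    n = len(y_true)
    errs = [t - p for t, p in zip(y_true, y_pred)]
    rmse = math.sqrt(sum(e * e for e in errs) / n)
    mae = sum(abs(e) for e in errs) / n
    mean_t = sum(y_true) / n
    ss_res = sum(e * e for e in errs)
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    r2 = 1.0 - ss_res / ss_tot
    return rmse, mae, r2

rmse, mae, r2 = regression_metrics([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8])
```

These are the quantities a FHONO-style optimizer would minimize (RMSE/MAE) or maximize (R²) when tuning MLP weights or hyperparameters.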
Early detection of Alzheimer's disease (AD) is crucial, particularly in resource-constrained medical settings. This study introduces an optimized deep learning framework that conceptualizes neural networks as computational "sensors" for neurodegenerative diagnosis, incorporating feature selection, selective layer unfreezing, pruning, and algorithmic optimization. An enhanced lightweight hybrid DenseNet201 model is proposed, integrating layer pruning strategies for feature selection and bio-inspired optimization techniques, including the Genetic Algorithm (GA) and Harris Hawks Optimization (HHO), for hyperparameter tuning. Layer pruning helps identify and eliminate less significant features, while model parameter optimization further enhances performance by fine-tuning critical hyperparameters, improving convergence speed, and maximizing classification accuracy. GA is also used to further reduce the number of selected features. A detailed comparison of six AD classification model setups is provided to illustrate the variations and their impact on performance. Applying the lightweight hybrid DenseNet201 model to MRI-based AD classification yielded an impressive baseline F1 score of 98%. Overall feature reduction reached 51.75%, enhancing interpretability and lowering processing costs. The optimized models further demonstrated perfect generalization, achieving 100% classification accuracy. These findings underscore the potential of advanced optimization techniques in developing efficient and accurate AD diagnostic tools suitable for environments with limited computational resources.
Addressing the complex issue of emergency resource distribution center site selection in uncertain environments, this study was conducted to comprehensively consider factors such as uncertainty parameters and the urgency of demand at disaster-affected sites. Firstly, urgency cost, economic cost, and transportation distance cost were identified as key objectives. The study applied fuzzy theory integration to construct a triangular fuzzy multi-objective site selection decision model. Next, defuzzification theory transformed the fuzzy decision model into a precise one. Subsequently, an improved Chaotic Quantum Multi-Objective Harris Hawks Optimization (CQ-MOHHO) algorithm was proposed to solve the model. The CQ-MOHHO algorithm was shown to rapidly produce high-quality Pareto front solutions and identify optimal site selection schemes for emergency resource distribution centers through case studies. This outcome verified the feasibility and efficacy of the site selection decision model and the CQ-MOHHO algorithm. To further assess CQ-MOHHO's performance, Zitzler-Deb-Thiele (ZDT) test functions, commonly used in multi-objective optimization, were employed. Comparisons with Multi-Objective Harris Hawks Optimization (MOHHO), Non-dominated Sorting Genetic Algorithm II (NSGA-II), and Multi-Objective Grey Wolf Optimizer (MOGWO) using Generational Distance (GD), Hypervolume (HV), and Inverted Generational Distance (IGD) metrics showed that CQ-MOHHO achieved superior global search ability, faster convergence, and higher solution quality.
The CQ-MOHHO algorithm efficiently achieved a balance between multiple objectives, providing decision-makers with satisfactory solutions and a valuable reference for researching and applying emergency site selection problems.
Hybrid renewable energy systems (HRES) offer cost-effectiveness, low-emission power solutions, and reduced dependence on fossil fuels. However, the renewable energy allocation problem remains challenging due to complex system interactions and multiple operational constraints. This study develops a novel Multi-Neighborhood Enhanced Harris Hawks Optimization (MNEHHO) algorithm to address the allocation of HRES components. The proposed approach integrates key technical parameters, including charge-discharge efficiency, storage device configurations, and renewable energy fraction. We formulate a comprehensive mathematical model that simultaneously minimizes levelized energy costs and pollutant emissions while maintaining system reliability. The MNEHHO algorithm employs multiple neighborhood structures to enhance solution diversity and exploration capabilities. The model's effectiveness is validated through case studies across four distinct institutional energy demand profiles. Results demonstrate that our approach successfully generates practically feasible HRES configurations while achieving significant reductions in costs and emissions compared to conventional methods. The enhanced search mechanisms of MNEHHO show superior performance in avoiding local optima and achieving consistent solutions. Experimental results demonstrate concrete improvements in solution quality (up to 46% improvement in objective value) and computational efficiency (average coefficient of variance of 24%-27%) across diverse institutional settings. This confirms the robustness and scalability of our method under various operational scenarios, providing a reliable framework for solving renewable energy allocation problems.
AIM: To develop a classifier for traditional Chinese medicine (TCM) syndrome differentiation of diabetic retinopathy (DR) using optimized machine learning algorithms, which can provide the basis for objective and intelligent TCM syndrome differentiation. METHODS: Collated data on real-world DR cases were collected. A variety of machine learning methods were used to construct TCM syndrome classification models, and the best performer was selected as the basic model. A Genetic Algorithm (GA) was used for feature selection to obtain the optimal feature combination. Harris Hawks Optimization (HHO) was used for parameter optimization, and a classification model based on feature selection and parameter optimization was constructed. The performance of the model was compared with other optimization algorithms. The models were evaluated with accuracy, precision, recall, and F1 score as indicators. RESULTS: Data on 970 cases that met screening requirements were collected. Support Vector Machine (SVM) was the best basic classification model, with an accuracy of 82.05%, a precision of 82.34%, a recall of 81.81%, and an F1 value of 81.76%. After GA screening, the optimal feature combination contained 37 feature values, consistent with TCM clinical practice. The model based on the optimal combination and SVM (GA_SVM) improved accuracy by 1.92% over the basic classifier. The SVM model based on HHO and GA optimization (HHO_GA_SVM) had the best performance and convergence speed compared with other optimization algorithms; compared with the basic classification model, accuracy improved by 3.51%. CONCLUSION: HHO and GA optimization can improve the performance of SVM in TCM syndrome differentiation of DR, providing a new method and research idea for TCM intelligent assisted syndrome differentiation.
In this study, our aim is to address the problem of gene selection by proposing a hybrid bio-inspired evolutionary algorithm that combines Grey Wolf Optimization (GWO) with Harris Hawks Optimization (HHO) for feature selection. The motivation for utilizing GWO and HHO stems from their bio-inspired nature and their demonstrated success in optimization problems. We aim to leverage the strengths of these algorithms to enhance the effectiveness of feature selection in microarray-based cancer classification. We selected leave-one-out cross-validation (LOOCV) to evaluate the performance of two widely used classifiers, k-nearest neighbors (KNN) and support vector machine (SVM), on high-dimensional cancer microarray data. The proposed method is extensively tested on six publicly available cancer microarray datasets, and a comprehensive comparison with recently published methods is conducted. Our hybrid algorithm demonstrates its effectiveness in improving classification performance, surpassing alternative approaches in terms of precision. The outcomes confirm the capability of our method to substantially improve both the precision and efficiency of cancer classification, thereby advancing the development of more efficient treatment strategies. The proposed hybrid method offers a promising solution to the gene selection problem in microarray-based cancer classification. It improves the accuracy and efficiency of cancer diagnosis and treatment, and its superior performance compared to other methods highlights its potential applicability in real-world cancer classification tasks. By harnessing the complementary search mechanisms of GWO and HHO, we leverage their bio-inspired behavior to identify informative genes relevant to cancer diagnosis and treatment.
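The LOOCV protocol described above can be sketched with a minimal 1-nearest-neighbour classifier on toy data; in the actual pipeline the inputs would first pass through the GWO-HHO feature-selection step, and the classifier could equally be an SVM. Everything below is an illustration, not the paper's implementation.

```python
def loocv_accuracy_1nn(X, y):
    """Leave-one-out cross-validation with a 1-NN classifier.

    Each sample is held out in turn and classified by its nearest
    neighbour (squared Euclidean distance) among the remaining samples.
    """
    correct = 0
    for i in range(len(X)):
        best_d, best_j = float("inf"), -1
        for j in range(len(X)):
            if j == i:
                continue
            d = sum((a - b) ** 2 for a, b in zip(X[i], X[j]))
            if d < best_d:
                best_d, best_j = d, j
        if y[best_j] == y[i]:
            correct += 1
    return correct / len(X)

# two well-separated toy classes: every held-out point is classified correctly
acc = loocv_accuracy_1nn([[0.0], [0.1], [5.0], [5.1]], [0, 0, 1, 1])
```

In a wrapper-style gene-selection loop, this LOOCV accuracy serves as the fitness value the metaheuristic maximizes over candidate gene subsets.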
Flexible job shop scheduling problem (FJSP) is the core decision-making problem of intelligent manufacturing production management. The Harris hawk optimization (HHO) algorithm, as a typical metaheuristic algorithm, has been widely employed to solve scheduling problems. However, HHO suffers from premature convergence when solving NP-hard problems. Therefore, this paper proposes an improved HHO algorithm (GNHHO) to solve the FJSP. GNHHO introduces an elitism strategy, a chaotic mechanism, a nonlinear escaping energy update strategy, and a Gaussian random walk strategy to prevent premature convergence. A flexible job shop scheduling model is constructed, and the static and dynamic FJSP is investigated to minimize the makespan. This paper chooses a two-segment encoding mode based on the job and the machine of the FJSP. To verify the effectiveness of GNHHO, this study tests it on 23 benchmark functions, 10 standard job shop scheduling problems (JSPs), and 5 standard FJSPs. Besides, this study collects data from an agricultural company and uses the GNHHO algorithm to optimize the company's FJSP. The optimized scheduling scheme demonstrates significant improvements in makespan, with an advancement of 28.16% for static scheduling and 35.63% for dynamic scheduling. Moreover, it achieves an average increase of 21.50% in the on-time order delivery rate. The results demonstrate that the performance of the GNHHO algorithm in solving FJSP is superior to some existing algorithms.
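A two-segment FJSP encoding, as mentioned above, carries an operation-sequence segment (which job goes next) and a machine-assignment segment (which machine each operation uses). The sketch below decodes such a chromosome into its makespan with a simple greedy timetable; the data and decoding details are illustrative, not the paper's.

```python
def decode_makespan(op_seq, machine_assign, proc_time):
    """Decode a two-segment FJSP chromosome into a schedule makespan.

    op_seq: job indices, each appearing once per operation of that job
            (operation-based segment).
    machine_assign: machine chosen for each (job, op) -- the machine segment.
    proc_time: proc_time[(job, op, machine)] -> duration.
    """
    job_ready = {}    # time each job's next operation may start
    mach_ready = {}   # time each machine becomes free
    next_op = {}      # next operation index per job
    makespan = 0
    for job in op_seq:
        op = next_op.get(job, 0)
        m = machine_assign[(job, op)]
        start = max(job_ready.get(job, 0), mach_ready.get(m, 0))
        finish = start + proc_time[(job, op, m)]
        job_ready[job] = finish
        mach_ready[m] = finish
        next_op[job] = op + 1
        makespan = max(makespan, finish)
    return makespan

# toy instance: two jobs, two operations each, two machines
pt = {(0, 0, 0): 3, (0, 1, 1): 2, (1, 0, 1): 4, (1, 1, 0): 3}
assign = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
ms = decode_makespan([0, 1, 0, 1], assign, pt)
```

An algorithm like GNHHO searches over both segments, using a decoder of this kind as the fitness function to be minimized.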
The thermodynamics of black holes (BHs) has had a profound impact on theoretical physics, providing insight into the nature of gravity, the quantum structure of spacetime, and the fundamental laws governing the Universe. In this study, we investigate thermal geometries and Hawking evaporation of the recently proposed topological dyonic dilaton BH in anti-de Sitter (AdS) space. We consider Rényi entropy and obtain the relations for pressure, heat capacity, and Gibbs free energy, and observe that the Rényi parameter and dilaton field play a vital role in the phase transition and stability of the BH. Moreover, we use the Weinhold, Ruppeiner, and Hendi-Panahiyah-Eslam-Momennia models to evaluate the scalar curvature of the BH and find that the divergence points of the scalar curvature coincide with the zeros of the specific heat. Finally, using the Stefan-Boltzmann law, we determine that the BH without a dilaton field evaporates far more quickly than the dilaton BH in AdS space.
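For reference, the Stefan-Boltzmann evaporation argument invoked above runs as follows in the simplest (Schwarzschild, four-dimensional, geometric units) case, which is a standard textbook sketch rather than the paper's dilaton-modified computation: with Hawking temperature $T_H = 1/(8\pi M)$ and horizon area $A = 16\pi M^2$,

\[
\frac{dM}{dt} \;=\; -\,\sigma\, A\, T_H^{4} \;\propto\; -\frac{1}{M^{2}}
\qquad\Longrightarrow\qquad
\tau \;\sim\; M_0^{3},
\]

so the evaporation lifetime scales as the cube of the initial mass. A dilaton field or Rényi parameter modifies $T_H(M)$ and $A(M)$, and hence the evaporation rate, which is how the dilaton BH can evaporate more slowly.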
In the study of Terrestrial Gamma-ray Flashes (TGFs) and Sonoluminescence, we observe parallels with larger cosmic events. Specifically, sonoluminescence involves the rapid collapse of bubbles, which closely resembles gravitational collapse in space. This observation suggests the potential formation of low-density quantum black holes. These entities, which might be related to dark matter, are thought to experience a kind of transient evaporation similar to the Hawking radiation seen in cosmic black holes. Consequently, sonoluminescence could be a valuable tool for investigating phenomena typically linked to cosmic-scale events. Furthermore, the role of the Higgs boson is considered in this context, possibly connecting it to both TGFs and sonoluminescence. This research could enhance our understanding of the quantum mechanics of black holes and their relation to dark matter on Earth.
Using a rigorous mathematical approach, we demonstrate how the Cosmic Microwave Background (CMB) temperature could simply be a form of geometric mean temperature between the minimum time-dependent Hawking Hubble temperature and the maximum Planck temperature of the expanding universe over the course of cosmic time. This mathematical discovery suggests a re-consideration of R_h = ct cosmological models, including black hole cosmological models, even if it could possibly also be consistent with the Λ-CDM model. Most importantly, this paper contributes to the growing literature of the past year asserting a tightly constrained mathematical relationship between the CMB temperature, the Hubble constant, and other global parameters of the Hubble sphere. Our approach suggests a solid theoretical framework for predicting and understanding the CMB temperature rather than solely observing it.
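The geometric-mean relation described in the abstract can be written schematically as follows; the symbol names ($T_{\mathrm{HH}}$ for the minimum time-dependent Hawking Hubble temperature, $T_{\mathrm{Pl}}$ for the Planck temperature) and the omitted dimensionless prefactor are our notation, not the authors':

\[
T_{\mathrm{CMB}}(t) \;\sim\; \sqrt{\,T_{\mathrm{HH}}(t)\; T_{\mathrm{Pl}}\,},
\]

i.e. the CMB temperature sits, up to a constrained constant, at the geometric mean of the coldest and hottest temperature scales of the Hubble sphere at cosmic time $t$.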
In real-world applications, datasets frequently contain outliers, which can hinder the generalization ability of machine learning models. Bayesian classifiers, a popular supervised learning method, rely on accurate probability density estimation for classifying continuous datasets. However, achieving precise density estimation with datasets containing outliers poses a significant challenge. This paper introduces a Bayesian classifier that utilizes optimized robust kernel density estimation to address this issue. Our proposed method enhances the accuracy of probability density distribution estimation by mitigating the impact of outliers on the training sample's estimated distribution. Unlike the conventional kernel density estimator, our robust estimator can be seen as a weighted kernel mapping summary for each sample. This kernel mapping performs the inner product in the Hilbert space, allowing the kernel density estimation to be considered the average of the samples' mappings in the Hilbert space using a reproducing kernel. M-estimation techniques are used to obtain accurate mean values and solve for the weights. Meanwhile, complete cross-validation is used as the objective function to search for the optimal bandwidth, which impacts the estimator. Harris Hawks Optimization optimizes the objective function to improve the estimation accuracy. The experimental results show that it outperforms other optimization algorithms regarding convergence speed and objective function value during the bandwidth search. The optimal robust kernel density estimator achieves better fitness performance than the traditional kernel density estimator when the training data contain outliers.
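The weighted-kernel view described above can be sketched as a weighted Gaussian KDE: with uniform weights it reduces to the ordinary estimator, while a robust variant down-weights outlying samples. Here the weights are supplied directly; the paper derives them via M-estimation, which this sketch does not reproduce.

```python
import math

def weighted_gaussian_kde(x, samples, weights, h):
    """Weighted Gaussian KDE:
        f(x) = (1 / (h * sum_i w_i)) * sum_i w_i * K((x - s_i) / h),
    where K is the standard normal density. h is the bandwidth (the
    quantity the paper tunes via HHO and complete cross-validation).
    """
    total_w = sum(weights)
    dens = 0.0
    for s, w in zip(samples, weights):
        u = (x - s) / h
        dens += w * math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)
    return dens / (total_w * h)

# density at the center of a single unit-weight sample equals the N(0,1) peak
p0 = weighted_gaussian_kde(0.0, [0.0], [1.0], 1.0)
```

Plugging such class-conditional density estimates into Bayes' rule yields the classifier; setting an outlier's weight near zero removes its bump from the estimated density.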
The Naïve Bayesian classifier with optimal robust kernel density estimation improves generalization in classification with outliers.
Aiming at the problems that the original Harris Hawks optimization algorithm easily falls into local optima and converges slowly, this paper proposes an improved Harris Hawks optimization algorithm (GHHO). Firstly, we use a Gaussian chaotic mapping strategy to initialize the positions of individuals in the population, which enriches the diversity of the initial population. Secondly, by optimizing the energy parameter and introducing a cosine strategy, the algorithm's ability to jump out of local optima is enhanced, which improves its performance. Finally, comparison experiments with other intelligent algorithms were conducted on 13 classical test functions. The results show that GHHO performs better in all aspects than the other optimization algorithms. The improved algorithm is thus more suitable for generalization to real optimization problems.
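The two mechanisms named above can be sketched as follows: a Gaussian (Gauss/mouse) chaotic map for population initialization, and a cosine-shaped nonlinear escaping-energy schedule replacing HHO's linear decay. The map constants and the exact schedule are assumptions for illustration; the abstract does not state GHHO's formulas.

```python
import math

def gauss_map_sequence(x0, n, alpha=6.2, beta=-0.5):
    """Gaussian chaotic map x_{k+1} = exp(-alpha * x_k^2) + beta.

    Produces a chaotic sequence in (beta, 1 + beta] that can seed diverse
    initial positions (alpha, beta here are illustrative constants).
    """
    xs, x = [], x0
    for _ in range(n):
        x = math.exp(-alpha * x * x) + beta
        xs.append(x)
    return xs

def escaping_energy(t, t_max, e0=1.0):
    """Cosine-shaped nonlinear escaping-energy schedule, sketching the kind
    of update an improved HHO substitutes for the linear
    E = 2 * e0 * (1 - t / t_max). High |E| early favors exploration;
    |E| -> 0 late favors exploitation.
    """
    return 2.0 * e0 * math.cos((math.pi / 2.0) * (t / t_max))

chaotic_init = gauss_map_sequence(0.1, 50)   # 50 chaotic values in (-0.5, 0.5]
e_start = escaping_energy(0, 100)            # full exploration energy
e_end = escaping_energy(100, 100)            # ~0: pure exploitation
```

The cosine schedule keeps the energy high for longer in early iterations than the linear rule does, which is one plausible way such a change delays the switch to pure exploitation.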
Funding (Ffowcs Williams-Hawkings surface-correction study): National Numerical Windtunnel Project and the National Natural Science Foundation of China (Nos. 11922214, 91752118).
Funding (hairy black hole tunneling study): National Natural Science Foundation of China, Grant No. 11975145.
Funding (wind power forecasting study): Deanship of Graduate Studies and Scientific Research at the University of Bisha, Saudi Arabia, Promising Program, Grant No. UB-Promising-42-1445.
基金Funding: Supported by the Deanship of Scientific Research, Vice Presidency for Graduate Studies and Scientific Research, King Faisal University, Saudi Arabia (Grant No. KFU251428).
文摘Abstract: Early detection of Alzheimer’s disease (AD) is crucial, particularly in resource-constrained medical settings. This study introduces an optimized deep learning framework that conceptualizes neural networks as computational “sensors” for neurodegenerative diagnosis, incorporating feature selection, selective layer unfreezing, pruning, and algorithmic optimization. An enhanced lightweight hybrid DenseNet201 model is proposed, integrating layer pruning strategies for feature selection and bio-inspired optimization techniques, including the Genetic Algorithm (GA) and Harris Hawks Optimization (HHO), for hyperparameter tuning. Layer pruning helps identify and eliminate less significant features, while model parameter optimization further enhances performance by fine-tuning critical hyperparameters, improving convergence speed, and maximizing classification accuracy. GA is also used to further reduce the number of selected features. A detailed comparison of six AD classification model setups is provided to illustrate the variations and their impact on performance. Applying the lightweight hybrid DenseNet201 model to MRI-based AD classification yielded an impressive baseline F1 score of 98%. Overall feature reduction reached 51.75%, enhancing interpretability and lowering processing costs. The optimized models further demonstrated perfect generalization, achieving 100% classification accuracy. These findings underscore the potential of advanced optimization techniques in developing efficient and accurate AD diagnostic tools suitable for environments with limited computational resources.
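GA-driven feature selection of the kind described above can be illustrated by evolving a binary feature mask; the sketch below is a generic toy (the fitness function and all parameters are placeholders, not the paper's DenseNet201 pipeline, where the fitness would be classifier accuracy):

```python
import random

def ga_feature_select(n_features, fitness, pop_size=20, generations=30, seed=0):
    """Tiny genetic algorithm over binary feature masks.
    fitness(mask) -> float, higher is better (placeholder for accuracy)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        elite = scored[: pop_size // 2]              # truncation selection (elitism)
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_features)       # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(n_features)            # single-bit mutation
            child[i] ^= 1
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

# Toy fitness: reward masks matching a hidden "useful feature" pattern.
target = [1, 0, 1, 1, 0, 0, 1, 0]

def match_fitness(mask):
    return sum(x == t for x, t in zip(mask, target))

best = ga_feature_select(8, match_fitness)
print(best)
```

Dropping the features whose mask bit is 0 is what yields the kind of feature-count reduction (51.75% here) reported above.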
文摘Abstract: Addressing the complex issue of emergency resource distribution center site selection in uncertain environments, this study was conducted to comprehensively consider factors such as uncertainty parameters and the urgency of demand at disaster-affected sites. Firstly, urgency cost, economic cost, and transportation distance cost were identified as key objectives. The study applied fuzzy theory integration to construct a triangular fuzzy multi-objective site selection decision model. Next, defuzzification theory transformed the fuzzy decision model into a precise one. Subsequently, an improved Chaotic Quantum Multi-Objective Harris Hawks Optimization (CQ-MOHHO) algorithm was proposed to solve the model. The CQ-MOHHO algorithm was shown to rapidly produce high-quality Pareto front solutions and identify optimal site selection schemes for emergency resource distribution centers through case studies. This outcome verified the feasibility and efficacy of the site selection decision model and the CQ-MOHHO algorithm. To further assess CQ-MOHHO’s performance, Zitzler-Deb-Thiele (ZDT) test functions, commonly used in multi-objective optimization, were employed. Comparisons with Multi-Objective Harris Hawks Optimization (MOHHO), Non-dominated Sorting Genetic Algorithm II (NSGA-II), and Multi-Objective Grey Wolf Optimizer (MOGWO) using Generational Distance (GD), Hypervolume (HV), and Inverted Generational Distance (IGD) metrics showed that CQ-MOHHO achieved superior global search ability, faster convergence, and higher solution quality. The CQ-MOHHO algorithm efficiently achieved a balance between multiple objectives, providing decision-makers with satisfactory solutions and a valuable reference for researching and applying emergency site selection problems.
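The defuzzification step that turns the triangular fuzzy model into a precise one can be sketched with a common centroid rule (this is one standard operator; the paper's exact defuzzification formula may differ):

```python
def defuzzify_centroid(tfn):
    """Centroid (center of gravity) of a triangular fuzzy number (a, b, c),
    where a <= b <= c and b is the peak.  For a triangular membership
    function the centroid reduces to the mean of the three vertices."""
    a, b, c = tfn
    return (a + b + c) / 3.0

# A fuzzy transport-cost estimate: "about 20, somewhere between 10 and 36".
print(defuzzify_centroid((10.0, 20.0, 36.0)))  # -> 22.0
```

Applying such an operator to every fuzzy parameter yields the crisp multi-objective model that CQ-MOHHO then solves.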
文摘Abstract: Hybrid renewable energy systems (HRES) offer cost-effectiveness, low-emission power solutions, and reduced dependence on fossil fuels. However, the renewable energy allocation problem remains challenging due to complex system interactions and multiple operational constraints. This study develops a novel Multi-Neighborhood Enhanced Harris Hawks Optimization (MNEHHO) algorithm to address the allocation of HRES components. The proposed approach integrates key technical parameters, including charge-discharge efficiency, storage device configurations, and the renewable energy fraction. We formulate a comprehensive mathematical model that simultaneously minimizes levelized energy costs and pollutant emissions while maintaining system reliability. The MNEHHO algorithm employs multiple neighborhood structures to enhance solution diversity and exploration capability. The model’s effectiveness is validated through case studies across four distinct institutional energy demand profiles. Results demonstrate that our approach successfully generates practically feasible HRES configurations while achieving significant reductions in costs and emissions compared to conventional methods. The enhanced search mechanisms of MNEHHO show superior performance in avoiding local optima and achieving consistent solutions. Experimental results demonstrate concrete improvements in solution quality (up to a 46% improvement in objective value) and computational efficiency (average coefficient of variation of 24%-27%) across diverse institutional settings. This confirms the robustness and scalability of our method under various operational scenarios, providing a reliable framework for solving renewable energy allocation problems.
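The "multiple neighborhood structures" idea can be illustrated with three classic move operators applied to a candidate allocation vector (a generic sketch of the technique; the paper's actual neighborhoods for HRES components are not specified here):

```python
import random

def swap_move(sol, i, j):
    # Neighborhood 1: exchange two positions.
    s = sol[:]
    s[i], s[j] = s[j], s[i]
    return s

def reverse_move(sol, i, j):
    # Neighborhood 2: reverse the segment between two positions.
    s = sol[:]
    lo, hi = min(i, j), max(i, j)
    s[lo:hi + 1] = reversed(s[lo:hi + 1])
    return s

def insert_move(sol, i, j):
    # Neighborhood 3: remove element i and reinsert it at position j.
    s = sol[:]
    s.insert(j, s.pop(i))
    return s

def random_neighbor(sol, rng):
    # Pick one structure at random: using several structures in rotation
    # is what preserves solution diversity during the search.
    i, j = rng.sample(range(len(sol)), 2)
    move = rng.choice([swap_move, reverse_move, insert_move])
    return move(sol, i, j)

rng = random.Random(42)
sol = [3, 1, 4, 1, 5, 9, 2, 6]
print(random_neighbor(sol, rng))
```

Each move returns a new candidate without mutating the incumbent, so accepted and rejected neighbors can be compared against the same baseline.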
基金Funding: Supported by the Hunan Province Traditional Chinese Medicine Research Project (No. B2023043), the Hunan Provincial Department of Education Scientific Research Project (No. 22B0386), and the Hunan University of Traditional Chinese Medicine Campus-level Research Fund Project (No. 2022XJZKC004).
文摘Abstract: AIM: To develop a classifier for traditional Chinese medicine (TCM) syndrome differentiation of diabetic retinopathy (DR) using optimized machine learning algorithms, providing a basis for objective and intelligent TCM syndrome differentiation. METHODS: Collated data on real-world DR cases were collected. A variety of machine learning methods were used to construct TCM syndrome classification models, and the best performer was selected as the basic model. A Genetic Algorithm (GA) was used for feature selection to obtain the optimal feature combination. Harris Hawks Optimization (HHO) was used for parameter optimization, and a classification model based on feature selection and parameter optimization was constructed. The performance of the model was compared with other optimization algorithms. The models were evaluated with accuracy, precision, recall, and F1 score as indicators. RESULTS: Data on 970 cases that met the screening requirements were collected. Support Vector Machine (SVM) was the best basic classification model. The accuracy of the model was 82.05%, the precision was 82.34%, the recall was 81.81%, and the F1 score was 81.76%. After GA screening, the optimal feature combination contained 37 features, which was consistent with TCM clinical practice. The model based on the optimal combination and SVM (GA_SVM) improved accuracy by 1.92% compared to the basic classifier. The SVM model based on HHO and GA optimization (HHO_GA_SVM) had the best performance and convergence speed compared with the other optimization algorithms. Compared with the basic classification model, accuracy improved by 3.51%. CONCLUSION: HHO and GA optimization can improve the performance of SVM for TCM syndrome differentiation of DR, providing a new method and research direction for TCM intelligent assisted syndrome differentiation.
基金Funding: Supported by the Deputyship for Research and Innovation, Ministry of Education, Saudi Arabia (IFKSUOR3-014-3).
文摘Abstract: In this study, our aim is to address the problem of gene selection by proposing a hybrid bio-inspired evolutionary algorithm that combines Grey Wolf Optimization (GWO) with Harris Hawks Optimization (HHO) for feature selection. The motivation for utilizing GWO and HHO stems from their bio-inspired nature and their demonstrated success in optimization problems. We aim to leverage the strengths of these algorithms to enhance the effectiveness of feature selection in microarray-based cancer classification. We selected leave-one-out cross-validation (LOOCV) to evaluate the performance of two widely used classifiers, k-nearest neighbors (KNN) and support vector machine (SVM), on high-dimensional cancer microarray data. The proposed method is extensively tested on six publicly available cancer microarray datasets, and a comprehensive comparison with recently published methods is conducted. Our hybrid algorithm demonstrates its effectiveness in improving classification performance, surpassing alternative approaches in terms of precision. The outcomes confirm the capability of our method to substantially improve both the precision and efficiency of cancer classification, thereby advancing the development of more efficient treatment strategies. The proposed hybrid method offers a promising solution to the gene selection problem in microarray-based cancer classification. It improves the accuracy and efficiency of cancer diagnosis and treatment, and its superior performance compared to other methods highlights its potential applicability in real-world cancer classification tasks. By harnessing the complementary search mechanisms of GWO and HHO, we leverage their bio-inspired behavior to identify informative genes relevant to cancer diagnosis and treatment.
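LOOCV evaluation of the kind used above can be sketched with a 1-nearest-neighbor classifier in a few stdlib-only lines (a 1-NN stand-in for the KNN/SVM classifiers in the paper; the data are toy points, not microarray profiles):

```python
def loocv_1nn_accuracy(X, y):
    """Leave-one-out cross-validation accuracy of a 1-nearest-neighbor
    classifier.  X: list of feature vectors, y: list of labels."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    correct = 0
    for i in range(len(X)):
        # Hold out sample i, classify it by its nearest remaining neighbor.
        j = min((k for k in range(len(X)) if k != i),
                key=lambda k: dist2(X[i], X[k]))
        correct += (y[j] == y[i])
    return correct / len(X)

X = [(0.0, 0.0), (0.1, 0.2), (1.0, 1.0), (0.9, 1.1)]
y = ["a", "a", "b", "b"]
print(loocv_1nn_accuracy(X, y))  # -> 1.0
```

In the gene-selection setting, this accuracy (computed on the candidate gene subset only) serves as the fitness that the hybrid GWO-HHO search maximizes.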
文摘Abstract: The flexible job shop scheduling problem (FJSP) is the core decision-making problem of intelligent manufacturing production management. The Harris hawks optimization (HHO) algorithm, as a typical metaheuristic, has been widely employed to solve scheduling problems. However, HHO suffers from premature convergence when solving NP-hard problems. Therefore, this paper proposes an improved HHO algorithm (GNHHO) to solve the FJSP. GNHHO introduces an elitism strategy, a chaotic mechanism, a nonlinear escaping energy update strategy, and a Gaussian random walk strategy to prevent premature convergence. A flexible job shop scheduling model is constructed, and both the static and dynamic FJSP are investigated to minimize the makespan. This paper chooses a two-segment encoding mode based on the jobs and machines of the FJSP. To verify the effectiveness of GNHHO, this study tests it on 23 benchmark functions, 10 standard job shop scheduling problems (JSPs), and 5 standard FJSPs. In addition, this study collects data from an agricultural company and uses the GNHHO algorithm to optimize the company’s FJSP. The optimized scheduling scheme demonstrates significant improvements in makespan, with a reduction of 28.16% for static scheduling and 35.63% for dynamic scheduling. Moreover, it achieves an average increase of 21.50% in the on-time order delivery rate. The results demonstrate that the performance of the GNHHO algorithm in solving the FJSP is superior to some existing algorithms.
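In standard HHO, the escaping energy that switches the search between exploration and exploitation decays linearly over iterations; a nonlinear update of the kind named above can be sketched as follows (the cosine-shaped decay here is an illustrative assumption, not necessarily GNHHO's exact formula):

```python
import math
import random

def escaping_energy_linear(t, T, rng):
    # Standard HHO: E = 2 * E0 * (1 - t/T), with E0 drawn uniformly
    # from [-1, 1].  |E| >= 1 triggers exploration, |E| < 1 exploitation.
    e0 = rng.uniform(-1.0, 1.0)
    return 2.0 * e0 * (1.0 - t / T)

def escaping_energy_nonlinear(t, T, rng):
    # Illustrative nonlinear variant: a cosine-shaped decay keeps the
    # energy (and hence exploration) higher in the early iterations.
    e0 = rng.uniform(-1.0, 1.0)
    return 2.0 * e0 * math.cos((math.pi / 2.0) * (t / T))

rng = random.Random(1)
T = 100
print([round(abs(escaping_energy_nonlinear(t, T, rng)), 3) for t in (0, 50, 100)])
```

Both variants drive the energy to zero at the final iteration; the nonlinear schedule simply reshapes when the exploration-to-exploitation handoff happens.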
基金Funding: Supported by the National Natural Science Foundation of China (Grant No. 11975145).
文摘Abstract: The thermodynamics of black holes (BHs) has had a profound impact on theoretical physics, providing insight into the nature of gravity, the quantum structure of spacetime, and the fundamental laws governing the Universe. In this study, we investigate the thermal geometries and Hawking evaporation of the recently proposed topological dyonic dilaton BH in anti-de Sitter (AdS) space. We consider the Rényi entropy and obtain relations for the pressure, heat capacity, and Gibbs free energy, observing that the Rényi parameter and the dilaton field play a vital role in the phase transition and stability of the BH. Moreover, we use the Weinhold, Ruppeiner, and Hendi-Panahiyah-Eslam-Momennia models to evaluate the scalar curvature of the BH and find that the divergence points of the scalar curvature coincide with the zeros of the specific heat. Finally, using the Stefan–Boltzmann law, we determine that the BH without a dilaton field evaporates far more quickly than the dilaton BH in AdS space.
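The evaporation comparison rests on the Stefan–Boltzmann law; for a Schwarzschild-like reference case (geometric factors suppressed, natural units) the scaling is schematically:

```latex
\frac{dM}{dt} \;\propto\; -\,A\,T^{4},
\qquad
A \propto M^{2},\quad T \propto \frac{1}{M}
\;\;\Longrightarrow\;\;
\frac{dM}{dt} \;\propto\; -\,\frac{1}{M^{2}}.
```

The dilaton field and AdS corrections modify both $T(M)$ and $A(M)$, which is what changes the evaporation rate relative to the dilaton-free case studied in the paper.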
文摘Abstract: In the study of Terrestrial Gamma-ray Flashes (TGFs) and sonoluminescence, we observe parallels with larger cosmic events. Specifically, sonoluminescence involves the rapid collapse of bubbles, which closely resembles gravitational collapse in space. This observation suggests the potential formation of low-density quantum black holes. These entities, which might be related to dark matter, are thought to experience a kind of transient evaporation similar to the Hawking radiation seen in cosmic black holes. Consequently, sonoluminescence could be a valuable tool for investigating phenomena typically linked to cosmic-scale events. Furthermore, the role of the Higgs boson is considered in this context, possibly connecting it to both TGFs and sonoluminescence. This research could enhance our understanding of the quantum mechanics of black holes and their relation to dark matter on Earth.
文摘Abstract: Using a rigorous mathematical approach, we demonstrate how the Cosmic Microwave Background (CMB) temperature could simply be a form of geometric mean temperature between the minimum time-dependent Hawking Hubble temperature and the maximum Planck temperature of the expanding universe over the course of cosmic time. This mathematical discovery suggests a re-consideration of R_h = ct cosmological models, including black hole cosmological models, even if it could possibly also be consistent with the Λ-CDM model. Most importantly, this paper contributes to the growing literature of the past year asserting a tightly constrained mathematical relationship between the CMB temperature, the Hubble constant, and other global parameters of the Hubble sphere. Our approach suggests a solid theoretical framework for predicting and understanding the CMB temperature rather than solely observing it.
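The geometric-mean statement can be written schematically (up to the dimensionless factors the paper derives) as:

```latex
T_{\mathrm{CMB}} \;\sim\; \sqrt{T_{\mathrm{HH}}\, T_{\mathrm{P}}}\,,
```

where $T_{\mathrm{HH}}$ is the time-dependent Hawking Hubble temperature and $T_{\mathrm{P}}$ the Planck temperature.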
文摘Abstract: In real-world applications, datasets frequently contain outliers, which can hinder the generalization ability of machine learning models. Bayesian classifiers, a popular supervised learning method, rely on accurate probability density estimation for classifying continuous datasets. However, achieving precise density estimation with datasets containing outliers poses a significant challenge. This paper introduces a Bayesian classifier that utilizes optimized robust kernel density estimation to address this issue. Our proposed method enhances the accuracy of probability density estimation by mitigating the impact of outliers on the training sample’s estimated distribution. Unlike the conventional kernel density estimator, our robust estimator can be seen as a weighted kernel mapping summary over the samples. This kernel mapping performs the inner product in a Hilbert space, allowing the kernel density estimate to be considered the average of the samples’ mappings in the Hilbert space under a reproducing kernel. M-estimation techniques are used to obtain accurate mean values and solve for the weights. Meanwhile, complete cross-validation is used as the objective function to search for the optimal bandwidth, which governs the estimator. Harris Hawks Optimization is applied to this objective function to improve the estimation accuracy, and the experimental results show that it outperforms other optimization algorithms in convergence speed and objective function value during the bandwidth search. The optimal robust kernel density estimator achieves better fitness than the traditional kernel density estimator when the training data contain outliers. The naïve Bayesian classifier with optimal robust kernel density estimation generalizes better in classification with outliers.
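The robust estimator described above is, structurally, a weighted kernel density estimate; a minimal Gaussian-kernel sketch with down-weighting of outlying samples is given below (the weights come from a simple Huber-style rule standing in for the paper's M-estimation step, and the bandwidth is fixed rather than HHO-optimized):

```python
import math

def gaussian_kernel(u):
    return math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)

def huber_weights(xs, c=1.5):
    # Huber-style weights (illustrative): samples whose scaled distance r
    # from the median exceeds c get weight c / r instead of 1.
    med = sorted(xs)[len(xs) // 2]
    mad = sorted(abs(x - med) for x in xs)[len(xs) // 2] or 1.0
    ws = []
    for x in xs:
        r = abs(x - med) / mad
        ws.append(1.0 if r <= c else c / r)
    total = sum(ws)
    return [w / total for w in ws]   # normalize so the weights sum to 1

def weighted_kde(x, xs, ws, h):
    # Weighted KDE: f(x) = (1/h) * sum_i w_i * K((x - x_i) / h).
    return sum(w * gaussian_kernel((x - xi) / h) for xi, w in zip(xs, ws)) / h

xs = [0.0, 0.2, -0.1, 0.1, 8.0]   # the last sample is an outlier
ws = huber_weights(xs)
print(round(weighted_kde(0.0, xs, ws, h=0.5), 3))
```

Because the outlier's weight is driven toward zero, the estimated density around the bulk of the data is barely distorted by it, which is the property the classifier exploits.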
文摘Abstract: Aiming at the problems that the original Harris hawks optimization algorithm easily falls into local optima and converges slowly, this paper proposes an improved Harris hawks optimization algorithm (GHHO). Firstly, we use a Gaussian chaotic mapping strategy to initialize the positions of individuals in the population, which enriches the diversity of the initial population. Secondly, by optimizing the energy parameter and introducing a cosine strategy, the algorithm's ability to jump out of local optima is enhanced, which improves its performance. Finally, comparison experiments with other intelligent algorithms were conducted on 13 classical test functions. The results show that GHHO outperforms the other optimization algorithms in all aspects. The improved algorithm is thus better suited for generalization to real optimization problems.
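Chaotic initialization replaces uniform random starting positions with a chaotic sequence mapped into the search box; one common form of the Gauss (Gaussian) iterated map is sketched below as a plausible reading of the strategy (the map parameters and scaling are assumptions, not GHHO's published values):

```python
import math

def gauss_map_sequence(n, x0=0.7, alpha=4.9, beta=-0.58):
    # Gauss iterated map: x_{k+1} = exp(-alpha * x_k^2) + beta.
    # With these classic parameters the iterates behave chaotically.
    xs, x = [], x0
    for _ in range(n):
        x = math.exp(-alpha * x * x) + beta
        xs.append(x)
    return xs

def chaotic_init_positions(pop_size, dim, lower, upper):
    # Rescale the chaotic sequence into the search box [lower, upper]^dim.
    seq = gauss_map_sequence(pop_size * dim)
    lo, hi = min(seq), max(seq)
    scaled = [(s - lo) / (hi - lo) for s in seq]
    return [[lower + scaled[i * dim + d] * (upper - lower) for d in range(dim)]
            for i in range(pop_size)]

pop = chaotic_init_positions(pop_size=5, dim=3, lower=-10.0, upper=10.0)
print(pop[0])
```

The ergodic spread of the chaotic iterates is what gives the initial population a richer coverage of the search space than plain uniform sampling.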