Virtualization is an indispensable part of the cloud, enabling different virtual servers to be deployed over the same physical layer. However, the growing number of applications executing on cloud repositories increases overload as cloud services are adopted. Moreover, migrating applications to the cloud with optimized resource allocation is a herculean task, even though migration is employed to ease the dilemma of allocating resources. In this paper, a Fire Hawk Optimization enabled Deep Learning Scheme (FHOEDLS) is proposed for minimizing overload and optimizing resource allocation on a hybrid cloud container architecture when migrating interoperability-based applications. FHOEDLS achieves load prediction with a deep CNN-GRU-AM model to attain resource allocation and better application migration. It specifically adopts the Fire Hawk Optimization Algorithm (FHOA) to optimize the parameters that aid better interoperable application migration with improved resource allocation and minimized overhead. The objective function used for resource allocation and application migration takes resource capacity, transmission cost, demand, and predicted load into account. The cloud simulation of FHOEDLS uses containers, Virtual Machines (VMs), and Physical Machines (PMs). The results of the proposed FHOEDLS confirm a better resource capability of 0.418 and a minimized load of 0.0061.
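As a minimal sketch of the kind of load predictor the abstract names, the following PyTorch model stacks a 1-D convolution, a GRU, and an additive attention layer (CNN-GRU-AM) to predict one-step-ahead load from a window of resource metrics. Layer sizes, window length, and feature count are illustrative assumptions, not the paper's configuration.

```python
# Hedged sketch (not the authors' code): CNN-GRU with additive attention
# for one-step-ahead load prediction. All hyperparameters are assumptions.
import torch
import torch.nn as nn

class CnnGruAttention(nn.Module):
    def __init__(self, n_features=4, hidden=32):
        super().__init__()
        self.conv = nn.Conv1d(n_features, 16, kernel_size=3, padding=1)
        self.gru = nn.GRU(16, hidden, batch_first=True)
        self.attn = nn.Linear(hidden, 1)   # additive attention scores
        self.head = nn.Linear(hidden, 1)   # predicted load (scalar)

    def forward(self, x):                  # x: (batch, window, n_features)
        h = torch.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)
        out, _ = self.gru(h)               # (batch, window, hidden)
        w = torch.softmax(self.attn(out), dim=1)   # attention weights
        ctx = (w * out).sum(dim=1)         # attention-weighted context
        return self.head(ctx)              # (batch, 1) predicted load

model = CnnGruAttention()
pred = model(torch.randn(8, 24, 4))        # 8 windows of 24 steps, 4 features
```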
Addressing the complex issue of emergency resource distribution center site selection in uncertain environments, this study comprehensively considers factors such as uncertain parameters and the urgency of demand at disaster-affected sites. First, urgency cost, economic cost, and transportation distance cost were identified as the key objectives. Fuzzy theory was then applied to construct a triangular fuzzy multi-objective site selection decision model, and defuzzification theory transformed the fuzzy model into a crisp one. Subsequently, an improved Chaotic Quantum Multi-Objective Harris Hawks Optimization (CQ-MOHHO) algorithm was proposed to solve the model. Case studies showed that CQ-MOHHO rapidly produces high-quality Pareto front solutions and identifies optimal site selection schemes for emergency resource distribution centers, verifying the feasibility and efficacy of both the decision model and the algorithm. To further assess CQ-MOHHO's performance, the Zitzler-Deb-Thiele (ZDT) test functions, commonly used in multi-objective optimization, were employed. Comparisons with Multi-Objective Harris Hawks Optimization (MOHHO), Non-dominated Sorting Genetic Algorithm II (NSGA-II), and Multi-Objective Grey Wolf Optimizer (MOGWO) on the Generational Distance (GD), Hypervolume (HV), and Inverted Generational Distance (IGD) metrics showed that CQ-MOHHO achieves superior global search ability, faster convergence, and higher solution quality. The algorithm efficiently balances multiple objectives, providing decision-makers with satisfactory solutions and a valuable reference for researching and applying emergency site selection problems.
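Two building blocks of a chaotic multi-objective optimizer of this kind are chaotic population initialization and a Pareto dominance test for maintaining the front. The sketch below (an assumption-level illustration, not the paper's implementation) uses a logistic map for initialization; the map parameter, seed, and bounds are hypothetical.

```python
# Hedged sketch: logistic-map chaotic initialization and a Pareto dominance
# test, as used in chaotic multi-objective HHO variants. Parameters assumed.
import numpy as np

def chaotic_init(pop_size, dim, lo, hi, mu=4.0, seed=0.7):
    """Initialize positions with a logistic map instead of uniform noise."""
    x = np.empty((pop_size, dim))
    c = seed
    for i in range(pop_size):
        for j in range(dim):
            c = mu * c * (1.0 - c)        # logistic map in (0, 1)
            x[i, j] = lo + c * (hi - lo)  # scale to the search bounds
    return x

def dominates(f_a, f_b):
    """True if objective vector f_a Pareto-dominates f_b (minimization)."""
    f_a, f_b = np.asarray(f_a), np.asarray(f_b)
    return bool(np.all(f_a <= f_b) and np.any(f_a < f_b))
```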
Hybrid renewable energy systems (HRES) offer cost-effectiveness, low-emission power solutions, and reduced dependence on fossil fuels. However, the renewable energy allocation problem remains challenging due to complex system interactions and multiple operational constraints. This study develops a novel Multi-Neighborhood Enhanced Harris Hawks Optimization (MNEHHO) algorithm to address the allocation of HRES components. The proposed approach integrates key technical parameters, including charge-discharge efficiency, storage device configurations, and renewable energy fraction. We formulate a comprehensive mathematical model that simultaneously minimizes levelized energy costs and pollutant emissions while maintaining system reliability. The MNEHHO algorithm employs multiple neighborhood structures to enhance solution diversity and exploration capabilities. The model's effectiveness is validated through case studies across four distinct institutional energy demand profiles. Results demonstrate that our approach generates practically feasible HRES configurations while achieving significant reductions in costs and emissions compared to conventional methods. The enhanced search mechanisms of MNEHHO show superior performance in avoiding local optima and achieving consistent solutions. Experimental results demonstrate concrete improvements in solution quality (up to 46% improvement in objective value) and computational efficiency (average coefficient of variation of 24%-27%) across diverse institutional settings. This confirms the robustness and scalability of our method under various operational scenarios, providing a reliable framework for solving renewable energy allocation problems.
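To make the cost-plus-emissions objective concrete, the toy fitness function below scalarizes a levelized-cost proxy and an emissions proxy, with a penalty when an assumed renewable-fraction constraint is violated. Every coefficient, yield figure, and component name here is an illustrative assumption, not a value from the study.

```python
# Hedged sketch: a toy HRES sizing objective of the kind MNEHHO minimizes.
# All yields, prices, and emission factors below are made-up placeholders.
import numpy as np

def hres_fitness(x, demand_kwh=1.2e6, min_renewable_fraction=0.6):
    pv_kw, wind_kw, battery_kwh = x
    annual_pv   = pv_kw * 1500.0           # assumed kWh per kW-year (PV)
    annual_wind = wind_kw * 2200.0         # assumed kWh per kW-year (wind)
    renewables  = annual_pv + annual_wind
    grid_import = max(demand_kwh - renewables, 0.0)

    cost = (0.05 * annual_pv + 0.04 * annual_wind
            + 0.02 * battery_kwh + 0.15 * grid_import)  # $ proxy
    emissions = 0.7e-3 * grid_import                    # tCO2 proxy

    frac = min(renewables / demand_kwh, 1.0)
    penalty = 1e6 * max(min_renewable_fraction - frac, 0.0)
    # Weighted-sum scalarization; a Pareto version would keep both objectives.
    return cost + 50.0 * emissions + penalty

print(hres_fitness(np.array([400.0, 200.0, 500.0])))
```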
The flexible job shop scheduling problem (FJSP) is the core decision-making problem of intelligent manufacturing production management. The Harris hawks optimization (HHO) algorithm, a typical metaheuristic, has been widely employed to solve scheduling problems. However, HHO suffers from premature convergence when solving NP-hard problems. Therefore, this paper proposes an improved HHO algorithm (GNHHO) to solve the FJSP. GNHHO introduces an elitism strategy, a chaotic mechanism, a nonlinear escaping-energy update strategy, and a Gaussian random walk strategy to prevent premature convergence. A flexible job shop scheduling model is constructed, and both the static and the dynamic FJSP are investigated to minimize the makespan. This paper adopts a two-segment encoding based on the jobs and the machines of the FJSP. To verify the effectiveness of GNHHO, this study tests it on 23 benchmark functions, 10 standard job shop scheduling problems (JSPs), and 5 standard FJSP instances. Besides, this study collects data from an agricultural company and uses the GNHHO algorithm to optimize the company's FJSP. The optimized scheduling scheme demonstrates significant improvements in makespan, with an improvement of 28.16% for static scheduling and 35.63% for dynamic scheduling. Moreover, it achieves an average increase of 21.50% in the on-time order delivery rate. The results demonstrate that the performance of the GNHHO algorithm in solving the FJSP is superior to some existing algorithms.
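The two-segment encoding mentioned above pairs an operation sequence with a machine assignment; decoding such a chromosome into a makespan is the standard evaluation step. The sketch below illustrates that decoding on a hypothetical 2-job, 2-machine instance; the instance data are assumptions, not from the paper.

```python
# Hedged sketch: decoding the two-segment (operation sequence + machine
# assignment) FJSP encoding into a makespan. Instance data are made up.
def decode(op_seq, machine_choice, proc_time):
    """op_seq: job ids in operation order; machine_choice[(job, op)] -> machine;
    proc_time[(job, op, machine)] -> duration. Returns the makespan."""
    job_ready, mach_ready, op_count = {}, {}, {}
    for job in op_seq:
        op = op_count.get(job, 0)
        m = machine_choice[(job, op)]
        start = max(job_ready.get(job, 0), mach_ready.get(m, 0))
        end = start + proc_time[(job, op, m)]
        job_ready[job], mach_ready[m] = end, end
        op_count[job] = op + 1
    return max(mach_ready.values())

# Hypothetical instance: two jobs with two operations each, machines m0/m1.
pt = {(0, 0, 'm0'): 3, (0, 1, 'm1'): 2, (1, 0, 'm1'): 4, (1, 1, 'm0'): 3}
mc = {(0, 0): 'm0', (0, 1): 'm1', (1, 0): 'm1', (1, 1): 'm0'}
print(decode([0, 1, 0, 1], mc, pt))  # makespan 7 for this schedule
```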
The thermodynamics of black holes (BHs) has had a profound impact on theoretical physics, providing insight into the nature of gravity, the quantum structure of spacetime, and the fundamental laws governing the Universe. In this study, we investigate the thermal geometries and Hawking evaporation of the recently proposed topological dyonic dilaton BH in anti-de Sitter (AdS) space. We consider the Rényi entropy, obtain the relations for pressure, heat capacity, and Gibbs free energy, and observe that the Rényi parameter and the dilaton field play a vital role in the phase transition and stability of the BH. Moreover, we use the Weinhold, Ruppeiner, and Hendi-Panahiyan-Eslam Panahiyan-Momennia (HPEM) models to evaluate the scalar curvature of the BH and find that the divergence points of the scalar curvature coincide with the zeros of the specific heat. Finally, using the Stefan-Boltzmann law, we determine that the BH without a dilaton field evaporates far more quickly than the dilaton BH in AdS space.
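For orientation, the two standard relations underlying such an analysis are sketched below in their commonly used forms; these are assumptions about the general framework, not the paper's exact expressions for this particular BH.

```latex
% Hedged sketch: standard forms, not the paper's exact expressions.
% S_{BH} is the Bekenstein--Hawking entropy, \lambda the Rényi parameter,
% A the horizon area, and T_H the Hawking temperature.
\begin{align}
  S_R &= \frac{1}{\lambda}\,\ln\!\left(1 + \lambda\, S_{BH}\right)
    && \text{(Rényi entropy built from } S_{BH}\text{)} \\
  \frac{dM}{dt} &= -\,\sigma\, A\, T_H^{4}
    && \text{(Stefan--Boltzmann evaporation law)}
\end{align}
```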
Using a rigorous mathematical approach, we demonstrate how the Cosmic Microwave Background (CMB) temperature could simply be a form of geometric mean temperature between the minimum time-dependent Hawking Hubble temperature and the maximum Planck temperature of the expanding universe over the course of cosmic time. This mathematical discovery suggests a re-consideration of R_h = ct cosmological models, including black hole cosmological models, even if it could possibly also be consistent with the Λ-CDM model. Most importantly, this paper contributes to the growing literature of the past year asserting a tightly constrained mathematical relationship between the CMB temperature, the Hubble constant, and other global parameters of the Hubble sphere. Our approach suggests a solid theoretical framework for predicting and understanding the CMB temperature rather than solely observing it.
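One concrete form such a geometric-mean relation can take (sketched here under stated assumptions; the constants are not necessarily this paper's) uses the Hawking temperature of a Hubble-sphere mass M = c³/(2GH):

```latex
% Hedged sketch of one published geometric-mean relation of this kind;
% the exact factors are assumptions, not necessarily this paper's.
\begin{align}
  T_{HH} &= \frac{\hbar H}{4\pi k_B}
    && \text{Hawking temperature of a Hubble mass } M = \tfrac{c^3}{2GH} \\
  T_{CMB} &\approx \sqrt{\frac{T_{Pl}\, T_{HH}}{8\pi}} \approx 2.73\ \mathrm{K}
    && \text{for } H_0 \approx 66.9\ \mathrm{km\,s^{-1}\,Mpc^{-1}}
\end{align}
```

Plugging in H₀ ≈ 66.9 km s⁻¹ Mpc⁻¹ gives T_HH ≈ 1.3 × 10⁻³⁰ K, and with T_Pl ≈ 1.42 × 10³² K the scaled geometric mean evaluates to ≈ 2.73 K, matching the observed CMB temperature.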
In real-world applications, datasets frequently contain outliers, which can hinder the generalization ability of machine learning models. Bayesian classifiers, a popular supervised learning method, rely on accurate probability density estimation for classifying continuous datasets. However, achieving precise density estimation from datasets containing outliers poses a significant challenge. This paper introduces a Bayesian classifier that utilizes optimized robust kernel density estimation to address this issue. Our proposed method enhances the accuracy of probability density estimation by mitigating the impact of outliers on the training sample's estimated distribution. Unlike the conventional kernel density estimator, our robust estimator can be seen as a weighted kernel mapping summary for each sample. This kernel mapping performs the inner product in a Hilbert space, allowing the kernel density estimate to be viewed as the average of the samples' mappings in the Hilbert space under a reproducing kernel. M-estimation techniques are used to obtain accurate mean values and to solve for the weights. Meanwhile, complete cross-validation is used as the objective function in the search for the optimal bandwidth, which governs the estimator. Harris Hawks Optimization (HHO) is applied to this objective function to improve the estimation accuracy. Experimental results show that HHO outperforms other optimization algorithms in convergence speed and objective function value during the bandwidth search. The optimal robust kernel density estimator achieves better fitness than the traditional kernel density estimator when the training data contain outliers, and the naïve Bayesian classifier with optimal robust kernel density estimation generalizes better in classification with outliers.
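The sketch below illustrates the weighted-KDE idea in the spirit described: sample weights are obtained by an M-estimation loop that down-weights points far from the current weighted mean in the kernel feature space, so outliers contribute less to the density. The bandwidth, Huber constant, and iteration count are assumptions, not the paper's settings.

```python
# Hedged sketch: robust weighted Gaussian KDE via Huber-type M-estimation.
# h, c, and n_iter are illustrative assumptions.
import numpy as np

def robust_kde_weights(x, h=0.5, c=1.345, n_iter=10):
    """Iteratively reweight samples; returns weights summing to 1."""
    n = len(x)
    w = np.full(n, 1.0 / n)
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)  # Gram matrix
    for _ in range(n_iter):
        # Distance of each sample from the weighted mean in the kernel
        # feature space: ||phi(x_i) - sum_j w_j phi(x_j)||.
        d = np.sqrt(np.clip(K.diagonal() - 2 * K @ w + w @ K @ w, 0, None))
        s = np.median(d) + 1e-12
        u = d / s
        psi = np.where(u <= c, 1.0, c / u)   # Huber weight function
        w = psi / psi.sum()                  # renormalize the weights
    return w

def kde(x_query, x, w, h=0.5):
    """Weighted Gaussian KDE evaluated at x_query."""
    z = (x_query[:, None] - x[None, :]) / h
    return (w * np.exp(-0.5 * z ** 2)).sum(axis=1) / (h * np.sqrt(2 * np.pi))
```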
To address the original Harris hawks optimization algorithm's tendency to fall into local optima and its slow convergence, this paper proposes an improved Harris hawks optimization algorithm (GHHO). First, we use a Gaussian chaotic mapping strategy to initialize the positions of individuals in the population, which enriches the diversity of the initial population. Second, by optimizing the energy parameter and introducing a cosine strategy, the algorithm's ability to escape local optima is enhanced, improving its performance. Finally, comparison experiments with other intelligent algorithms were conducted on 13 classical test functions. The results show that GHHO outperforms the other optimization algorithms in all respects, making the improved algorithm better suited for generalization to real optimization problems.
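The two ingredients named above can be sketched as follows: a Gaussian (mouse) chaotic map for initialization and a cosine-shaped escaping-energy schedule replacing HHO's linear decay. The exact formulas here are illustrative assumptions, not the paper's.

```python
# Hedged sketch of the two GHHO ingredients; formulas are assumptions.
import numpy as np

def gaussian_map_init(pop_size, dim, lo, hi, seed=0.37):
    """Gaussian/mouse chaotic map: c_{k+1} = (1 / c_k) mod 1 (0 maps to 0)."""
    x = np.empty((pop_size, dim))
    c = seed
    for i in range(pop_size):
        for j in range(dim):
            c = 0.0 if c == 0.0 else (1.0 / c) % 1.0
            x[i, j] = lo + c * (hi - lo)   # scale chaos to the search bounds
    return x

def escaping_energy(t, t_max, e0):
    """Cosine decay instead of HHO's linear 2 * e0 * (1 - t / t_max)."""
    return 2.0 * e0 * np.cos((np.pi / 2.0) * t / t_max)
```

A cosine schedule keeps the energy high (favoring exploration) for longer early in the run and decays it faster near the end, which is one common rationale for nonlinear energy updates.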