Smallholder farming in West Africa faces various challenges, such as limited access to seeds, fertilizers, modern mechanization, and agricultural climate services. Crop productivity obtained under these conditions varies significantly from one farmer to another, making it challenging to accurately estimate crop production through crop models. This limitation has implications for the reliability of using crop models as agricultural decision-making support tools. To support decision making in agriculture, an approach combining a genetic algorithm (GA) with the crop model AquaCrop is proposed for a location-specific calibration of maize cropping. In this approach, AquaCrop is used to simulate maize crop yield while the GA is used to derive optimal parameter sets at grid-cell resolution from various combinations of cultivar parameters and crop management during the calibration of crop and management options. Statistics on pairwise simulated and observed yields indicate that the coefficient of determination varies from 0.20 to 0.65, with a yield deviation ranging from 8% to 36% across Burkina Faso (BF). An analysis of the optimal parameter sets shows that regardless of the climatic zone, a base temperature of 10˚C and an upper temperature of 32˚C are observed in at least 50% of grid cells. The growing season length and the harvest index vary significantly across BF, with the highest values found in the Soudanian zone and the lowest values in the Sahelian zone. Regarding management strategies, the mean fertility rate is approximately 35%, 39%, and 49% for the Sahelian, Soudano-sahelian, and Soudanian zones, respectively. The mean weed cover is around 36%, with the Sahelian and Soudano-sahelian zones showing the highest variability. The proposed approach can be an alternative to the conventional one-size-fits-all approach commonly used for regional crop modeling. Moreover, it has the potential to explore how well cropping strategies perform under changing climate conditions.
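A minimal sketch of the grid-cell calibration loop described above, assuming a simple real-coded GA: `simulate_yield` is a placeholder for an AquaCrop run on one grid cell, and the parameter names, bounds, and GA settings are illustrative assumptions rather than values from the paper.

```python
# Minimal GA calibration sketch. simulate_yield() is a placeholder for an
# AquaCrop run on one grid cell; bounds and GA settings are illustrative.
import random

BOUNDS = {                          # hypothetical calibration ranges
    "base_temp":   (8.0, 12.0),    # deg C
    "upper_temp":  (30.0, 34.0),   # deg C
    "season_len":  (90.0, 140.0),  # days
    "harvest_idx": (0.30, 0.50),
    "fertility":   (20.0, 60.0),   # % of non-limiting fertility
}

def simulate_yield(params):
    """Stand-in for an AquaCrop simulation of maize yield (t/ha)."""
    return 1.0 + 0.05 * params["fertility"] * params["harvest_idx"]

def fitness(params, observed_yield):
    return -abs(simulate_yield(params) - observed_yield)  # minimize deviation

def random_individual():
    return {k: random.uniform(*b) for k, b in BOUNDS.items()}

def crossover(a, b):
    return {k: random.choice((a[k], b[k])) for k in BOUNDS}

def mutate(ind, rate=0.1):
    for k, (lo, hi) in BOUNDS.items():
        if random.random() < rate:
            ind[k] = random.uniform(lo, hi)
    return ind

def calibrate(observed_yield, pop_size=40, generations=50):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(p, observed_yield), reverse=True)
        elite = pop[: pop_size // 2]                    # keep the best half
        pop = elite + [mutate(crossover(*random.sample(elite, 2)))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=lambda p: fitness(p, observed_yield))

best = calibrate(observed_yield=2.4)   # e.g., 2.4 t/ha observed at one cell
print(best)
```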
The Bat algorithm, a metaheuristic optimization technique inspired by the foraging behaviour of bats, has been employed to tackle optimization problems. Known for its ease of implementation, parameter tunability, and strong global search capabilities, this algorithm finds application across diverse optimization problem domains. However, in the face of increasingly complex optimization challenges, the Bat algorithm encounters certain limitations, such as slow convergence and sensitivity to initial solutions. To tackle these challenges, the present study incorporates a range of optimization components into the Bat algorithm, thereby proposing a variant called PKEBA. A projection screening strategy is implemented to mitigate its sensitivity to initial solutions, thereby enhancing the quality of the initial solution set. A kinetic adaptation strategy reforms exploration patterns, while an elite communication strategy enhances group interaction, helping the algorithm escape local optima. Subsequently, the effectiveness of the proposed PKEBA is rigorously evaluated. Testing encompasses 30 benchmark functions from IEEE CEC2014, featuring ablation experiments and comparative assessments against classical algorithms and their variants. Moreover, real-world engineering problems are employed as further validation. The results conclusively demonstrate that PKEBA exhibits superior convergence and precision compared to existing algorithms.
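For context, the sketch below is a compact NumPy rendering of the standard bat algorithm (after Yang's original formulation) that PKEBA builds on; PKEBA's projection screening, kinetic adaptation, and elite communication strategies are not specified in the abstract and are omitted.

```python
# Core update of the standard bat algorithm (Yang, 2010), the baseline that
# PKEBA extends: frequency-driven velocities, a local walk around the best
# solution gated by the pulse rate, and loudness-gated acceptance.
import numpy as np

def bat_optimize(obj, dim, n_bats=30, iters=500, fmin=0.0, fmax=2.0,
                 loudness=0.9, pulse_rate=0.5, lb=-10.0, ub=10.0):
    rng = np.random.default_rng(0)
    x = rng.uniform(lb, ub, (n_bats, dim))     # positions
    v = np.zeros((n_bats, dim))                # velocities
    fit = np.apply_along_axis(obj, 1, x)
    best = x[fit.argmin()].copy()
    for _ in range(iters):
        freq = fmin + (fmax - fmin) * rng.random((n_bats, 1))
        v += (x - best) * freq                 # pull toward the global best
        cand = np.clip(x + v, lb, ub)
        # local random walk around the best solution, gated by pulse_rate
        walk = rng.random(n_bats) > pulse_rate
        cand[walk] = np.clip(
            best + 0.01 * rng.standard_normal((walk.sum(), dim)), lb, ub)
        cand_fit = np.apply_along_axis(obj, 1, cand)
        accept = (cand_fit < fit) & (rng.random(n_bats) < loudness)
        x[accept], fit[accept] = cand[accept], cand_fit[accept]
        best = x[fit.argmin()].copy()
    return best, fit.min()

sphere = lambda z: float(np.sum(z ** 2))
print(bat_optimize(sphere, dim=5))
```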
Data clustering is an essential technique for analyzing complex datasets and continues to be a central research topic in data analysis. Traditional clustering algorithms, such as K-means, are widely used due to their simplicity and efficiency. This paper proposes a novel Spiral Mechanism-Optimized Phasmatodea Population Evolution Algorithm (SPPE) to improve clustering performance. The SPPE algorithm introduces several enhancements to the standard Phasmatodea Population Evolution (PPE) algorithm. Firstly, a Variable Neighborhood Search (VNS) factor is incorporated to strengthen the local search capability and foster population diversity. Secondly, a position update model, incorporating a spiral mechanism, is designed to improve the algorithm’s global exploration and convergence speed. Finally, a dynamic balancing factor, guided by fitness values, adjusts the search process to balance exploration and exploitation effectively. The performance of SPPE is first validated on CEC2013 benchmark functions, where it demonstrates excellent convergence speed and superior optimization results compared to several state-of-the-art metaheuristic algorithms. To further verify its practical applicability, SPPE is combined with the K-means algorithm for data clustering and tested on seven datasets. Experimental results show that SPPE-K-means improves clustering accuracy, reduces dependency on initialization, and outperforms other clustering approaches. This study highlights SPPE’s robustness and efficiency in solving both optimization and clustering challenges, making it a promising tool for complex data analysis tasks.
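Of these components, the spiral position update is the easiest to illustrate. The sketch below shows a generic logarithmic-spiral move toward the current best agent, of the kind used in spiral-based metaheuristics; the abstract does not give SPPE's exact update rule, so treat this as an assumed stand-in.

```python
# Illustrative logarithmic-spiral position update of the kind the SPPE
# abstract describes; the exact PPE/SPPE update rule is not reproduced here.
import numpy as np

def spiral_update(x, best, b=1.0, rng=np.random.default_rng()):
    """Move agent x toward best along a logarithmic spiral."""
    l = rng.uniform(-1.0, 1.0)        # spiral parameter in [-1, 1]
    d = np.abs(best - x)              # per-dimension distance to the best
    return d * np.exp(b * l) * np.cos(2 * np.pi * l) + best

x = np.array([2.0, -1.0]); best = np.array([0.5, 0.5])
print(spiral_update(x, best))
```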
Cloud computing has become an essential technology for the management and processing of large datasets, offering scalability, high availability, and fault tolerance. However, optimizing data replication across multiple data centers poses a significant challenge, especially when balancing opposing goals such as latency, storage costs, energy consumption, and network efficiency. This study introduces a novel dynamic optimization algorithm called Dynamic Multi-Objective Gannet Optimization (DMGO), designed to enhance data replication efficiency in cloud environments. Unlike traditional static replication systems, DMGO adapts dynamically to variations in network conditions, system demand, and resource availability. The approach utilizes multi-objective optimization techniques to efficiently balance data access latency, storage efficiency, and operational costs. DMGO continuously evaluates data center performance and adjusts replication decisions in real time to guarantee optimal system efficiency. Experimental evaluations conducted in a simulated cloud environment demonstrate that DMGO significantly outperforms conventional static algorithms, achieving faster data access, lower storage overhead, reduced energy consumption, and improved scalability. The proposed methodology offers a robust and adaptable solution for modern cloud systems, ensuring efficient resource consumption while maintaining high performance.
In this research work, the localized generation from renewable resources and the distribution of energy to agricultural loads, which is a local microgrid concept, have been considered, and its feasibility has been assessed. Two dispatch algorithms, named Cycle Charging and Load Following, are implemented to find the optimal solution (i.e., net cost, operation cost, carbon emission, energy cost, component sizing, etc.) of the hybrid system. The microgrid is also modeled in the DIgSILENT PowerFactory platform, and the respective power system responses are then evaluated. The development of dispatch algorithms specifically tailored for agricultural applications has made it possible to dynamically manage energy flows, responding to fluctuating demands and resource availability in real time. Through careful consideration of factors such as seasonal variations and irrigation requirements, these algorithms have enhanced the resilience and adaptability of the microgrid to dynamic operational conditions. However, both approaches produced the same techno-economic results, showing no significant difference. This illustrates that the considered microgrid can be implemented with either strategy without significant variation in performance. The study has shown that harmful gas emissions are limited to only 17,928 kg/year of CO2 and 77.7 kg/year of sulfur dioxide. For the proposed microgrid and load profile of 165.29 kWh/day, the net present cost is USD 718,279, and the cost of energy is USD 0.0463/kWh with a renewable fraction of 97.6%. The optimal sizes for PV, Bio, Grid, Electrolyzer, and Converter are 1494, 500, 999,999, 500, and 495 kW, respectively. For a hydrogen tank (HTank), the optimal size is found to be 350 kg. This research work provides critical insights into the techno-economic feasibility and environmental impact of integrating biomass-PV-hydrogen storage-grid hybrid renewable microgrids into agricultural settings.
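As a rough illustration of how the two dispatch rules differ at a single time step, the sketch below follows the usual conventions: under load following the generator covers only the net load, while under cycle charging it runs at rated output and the surplus charges storage. All numbers are illustrative, not from the paper.

```python
# Toy one-step dispatch comparison. Under load following (LF) the generator
# serves only the net load; under cycle charging (CC) it runs at rated power
# whenever needed and the surplus charges storage.
def dispatch(strategy, load_kw, pv_kw, gen_rated_kw, soc_kwh, cap_kwh):
    net = max(load_kw - pv_kw, 0.0)          # load left after renewables
    if strategy == "LF":
        gen = min(net, gen_rated_kw)         # follow the net load
    else:                                    # "CC": full output when needed
        gen = gen_rated_kw if net > 0 else 0.0
    surplus = max(pv_kw + gen - load_kw, 0.0)
    soc = min(soc_kwh + surplus, cap_kwh)    # charge storage with surplus
    unmet = max(load_kw - pv_kw - gen, 0.0)
    return gen, soc, unmet

for s in ("LF", "CC"):
    print(s, dispatch(s, load_kw=40, pv_kw=25, gen_rated_kw=30,
                      soc_kwh=10, cap_kwh=50))
```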
Efficient elastic wave focusing is crucial in materials and physical engineering. Elastic coding metasurfaces, which are innovative planar artificial structures, show great potential for use in the field of wave focusing. However, elastic coding lenses (ECLs) still suffer from low focusing performance, thickness comparable to the wavelength, and frequency sensitivity. Here, we consider both the structural and material properties of the coding unit, thus realizing further compression of the thickness of the ECL. We chose the simplest ECL, which consists of only two coding units. Coding unit 0 is a straight structure constructed from a carbon-fiber-reinforced composite material, and coding unit 1 is a zigzag structure constructed from aluminum; the thickness of the ECL built from them is only 1/8 of the wavelength. Based on the theoretical design, the arrangement of coding units is further optimized using genetic algorithms, which significantly improves the focusing performance of the lens at different focal positions and frequencies. This study provides a more effective way to control vibration and noise in advanced structures.
We explored the effects of algorithmic opacity on employees’ playing dumb and evasive hiding rather than rationalized hiding. We examined the mediating role of job insecurity and the moderating role of employee-AI collaboration. Participants were 421 full-time employees (female = 46.32%, junior employees = 31.83%) from a variety of organizations and industries that interact with AI. Employees provided data on algorithmic opacity, job insecurity, knowledge hiding, employee-AI collaboration, and control variables. The results of the structural equation modeling indicated that algorithmic opacity exacerbated employees’ job insecurity, and job insecurity mediated between algorithmic opacity and playing dumb and evasive hiding rather than rationalized hiding. The relationship between algorithmic opacity and playing dumb and evasive hiding was more positive when the level of employee-AI collaboration was higher. These findings suggest that employee-AI collaboration reinforces the indirect relationship between algorithmic opacity and playing dumb and evasive hiding. Our study contributes to research on human-AI collaboration by exploring the dark side of employee-AI collaboration.
An improved estimation of distribution algorithm (IEDA) is proposed in this paper for the efficient design of metamaterial absorbers. This algorithm establishes a probability model from the selected dominant groups and samples from the model to obtain the next generation, avoiding the building-block destruction caused by crossover and mutation. Neighborhood search from the artificial bee colony algorithm (ABCA) is introduced to enhance the local optimization ability and is improved to raise the speed of convergence. The probability model is modified by boundary correction and loss correction to enhance the robustness of the algorithm. The proposed IEDA is compared with other intelligent algorithms in relevant references. The results show that the proposed IEDA has faster convergence speed and stronger optimization ability, proving the feasibility and effectiveness of the algorithm.
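The build-model-then-sample loop that defines an EDA is easy to show in miniature. The hedged sketch below fits an independent Gaussian per dimension to the dominant (elite) group and resamples the population from it; IEDA's ABCA neighborhood search and its boundary and loss corrections go beyond what the abstract specifies.

```python
# Minimal Gaussian estimation-of-distribution loop: fit a model to the elite
# group, then sample the next generation from it. No crossover or mutation.
import numpy as np

def eda_minimize(obj, dim, pop=60, elite_frac=0.3, iters=100, lb=-5.0, ub=5.0):
    rng = np.random.default_rng(1)
    x = rng.uniform(lb, ub, (pop, dim))
    for _ in range(iters):
        fit = np.apply_along_axis(obj, 1, x)
        elite = x[np.argsort(fit)[: int(pop * elite_frac)]]   # dominant group
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-9
        x = np.clip(rng.normal(mu, sigma, (pop, dim)), lb, ub)  # resample
    fit = np.apply_along_axis(obj, 1, x)
    return x[fit.argmin()], fit.min()

print(eda_minimize(lambda z: float(np.sum(z ** 2)), dim=4))
```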
When dealing with imbalanced datasets, the traditional support vector machine (SVM) tends to produce a classification hyperplane that is biased towards the majority class, which exhibits poor robustness. This paper proposes a high-performance classification algorithm specifically designed for imbalanced datasets. The proposed method first uses a biased second-order cone programming support vector machine (B-SOCP-SVM) to identify the support vectors (SVs) and non-support vectors (NSVs) in the imbalanced data. Then, it applies the synthetic minority over-sampling technique (SV-SMOTE) to oversample the support vectors of the minority class and uses the random under-sampling technique (NSV-RUS) multiple times to undersample the non-support vectors of the majority class. Combining the resulting minority-class dataset with each of the majority-class datasets yields multiple new balanced datasets. Finally, SOCP-SVM is used to classify each dataset, and the final result is obtained by integrating their outputs. Experimental results demonstrate that the proposed method performs excellently on imbalanced datasets.
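A hedged sketch of the resampling-ensemble idea, with library stand-ins: imbalanced-learn's SMOTE in place of SV-SMOTE, RandomUnderSampler in place of NSV-RUS, and scikit-learn's standard SVC in place of the paper's SOCP-SVM. The SV/NSV split from B-SOCP-SVM is omitted, since it would require solving the cone program itself.

```python
# Oversample the minority class once, undersample the majority class several
# times, train one classifier per balanced set, and integrate by voting.
import numpy as np
from imblearn.over_sampling import SMOTE
from imblearn.under_sampling import RandomUnderSampler
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, weights=[0.9, 0.1], random_state=0)
X_over, y_over = SMOTE(random_state=0).fit_resample(X, y)  # grow minority

votes = np.zeros(len(X))
for seed in range(5):                        # several undersampled majorities
    rus = RandomUnderSampler(random_state=seed)
    X_bal, y_bal = rus.fit_resample(X_over, y_over)
    clf = SVC().fit(X_bal, y_bal)
    votes += clf.predict(X)
y_pred = (votes >= 3).astype(int)            # majority vote of 5 classifiers
print("positives predicted:", int(y_pred.sum()))
```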
In recent years, the development of new types of nuclear reactors, such as transportable, marine, and space reactors, has presented new challenges for the optimization of reactor radiation-shielding design. Shielding structures typically need to be lightweight and miniaturized while still providing radiation protection, which is a multi-parameter and multi-objective optimization problem. Conventional multi-objective (two or three objectives) optimization methods for radiation-shielding design are limited in the number of optimization objectives and variable parameters they can handle and often fail to reach a globally optimal solution, thereby failing to meet the requirements of shielding optimization for newly developed reactors. In this study, genetic and artificial bee-colony algorithms are combined with a reference-point-selection strategy and applied to the many-objective (four or more objectives) optimal design of reactor radiation shielding. To validate the reliability of the methods, an optimization simulation is conducted on three-dimensional shielding structures and another complicated shielding-optimization problem. The numerical results demonstrate that the proposed algorithms outperform conventional shielding-design methods in terms of optimization performance, and they exhibit reliability in practical engineering problems. The many-objective optimization algorithms developed in this study are proven to efficiently and consistently search for Pareto-front shielding schemes. Therefore, the algorithms proposed in this study offer novel insights into improving the shielding-design performance and shielding quality of new reactor types.
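Reference-point selection of this kind typically steers many-objective search with a uniform point set on the unit simplex, as in NSGA-III. A minimal sketch of Das-Dennis style point generation is shown below; the shielding physics evaluation itself is not reproduced.

```python
# Das-Dennis style reference points on the unit simplex: all points whose
# coordinates are non-negative multiples of 1/divisions summing to 1.
from itertools import combinations
import numpy as np

def reference_points(n_obj, divisions):
    """Uniform reference points for many-objective selection."""
    pts = []
    # stars-and-bars: place n_obj - 1 bars among divisions + n_obj - 1 slots
    for bars in combinations(range(divisions + n_obj - 1), n_obj - 1):
        prev, coords = -1, []
        for b in bars:
            coords.append(b - prev - 1)
            prev = b
        coords.append(divisions + n_obj - 2 - prev)
        pts.append(np.array(coords) / divisions)
    return np.array(pts)

print(reference_points(n_obj=4, divisions=3).shape)  # (20, 4) points
```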
DNA microarrays, a cornerstone in biomedicine, measure gene expression across thousands to tens of thousands of genes. Identifying the genes vital for accurate cancer classification is a key challenge. Here, we present Fs-LSA (F-score based Learning Search Algorithm), a novel gene selection algorithm designed to enhance the precision and efficiency of target gene identification from microarray data for cancer classification. This algorithm is divided into two phases: the first leverages F-score values to prioritize and select feature genes with the most significant differential expression; the second introduces our Learning Search Algorithm (LSA), which harnesses swarm intelligence to identify the optimal subset among the remaining genes. Inspired by human social learning, LSA integrates historical data and collective intelligence for a thorough search, with a dynamic control mechanism that balances exploration and refinement, thereby enhancing the gene selection process. We conducted a rigorous validation of Fs-LSA’s performance using eight publicly available cancer microarray expression datasets. Fs-LSA achieved accuracy, precision, sensitivity, and F1-score values of 0.9932, 0.9923, 0.9962, and 0.994, respectively. Comparative analyses with state-of-the-art algorithms revealed Fs-LSA’s superior performance in terms of simplicity and efficiency. Additionally, we validated the algorithm’s efficacy independently using glioblastoma data from the GEO and TCGA databases, where it was significantly superior to the comparison algorithms. Importantly, the driver genes identified by Fs-LSA were instrumental in developing a predictive model as an independent prognostic indicator for glioblastoma, underscoring Fs-LSA’s transformative potential in genomics and personalized medicine.
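Phase one is straightforward to sketch: rank features by an F statistic and keep the top slice before handing off to the swarm search. Below, scikit-learn's ANOVA F-statistic stands in for the paper's F-score, and the LSA phase itself is not reproduced.

```python
# Phase one of Fs-LSA in miniature: rank genes by an F statistic and keep
# the top slice before the swarm-based search over subsets.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import f_classif

X, y = make_classification(n_samples=100, n_features=2000, n_informative=20,
                           random_state=0)          # toy "microarray"
scores, _ = f_classif(X, y)                         # per-gene F statistic
top_genes = np.argsort(scores)[::-1][:50]           # keep the 50 best
print("top-ranked feature indices:", top_genes[:10])
```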
This paper studies polygon simplification algorithms for 3D models, focuses on the optimization algorithm of the quadratic error metric (QEM), explores the impacts of different methods on the simplification of different models, and develops a web-based visualization application. Metrics such as the Hausdorff distance are used to evaluate the balance between the degree of simplification and the retention of model details.
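For reference, the core quantity of QEM (after Garland and Heckbert) is compact enough to show directly: each triangle contributes the plane quadric K = p p^T, and the cost of placing a vertex at v is v^T Q v, the sum of squared distances to the contributing planes. A minimal sketch:

```python
# Quadric error metric building blocks: a face's plane quadric and the
# resulting squared-distance error for a candidate vertex position.
import numpy as np

def plane_quadric(v0, v1, v2):
    n = np.cross(v1 - v0, v2 - v0)
    n = n / np.linalg.norm(n)             # unit normal (a, b, c)
    d = -np.dot(n, v0)
    p = np.append(n, d)                   # plane [a, b, c, d], ax+by+cz+d=0
    return np.outer(p, p)                 # 4x4 quadric K = p p^T

def vertex_error(Q, v):
    vh = np.append(v, 1.0)                # homogeneous coordinates
    return float(vh @ Q @ vh)             # squared distance to the plane(s)

tri = [np.array([0., 0., 0.]), np.array([1., 0., 0.]), np.array([0., 1., 0.])]
Q = plane_quadric(*tri)
print(vertex_error(Q, np.array([0.2, 0.2, 0.5])))  # 0.25 = (dist 0.5)^2
```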
Based on the Google Earth Engine cloud computing data platform, this study employed three algorithms, namely Support Vector Machine (SVM), Random Forest, and Classification and Regression Tree, to classify the current status of land covers in Hung Yen province of Vietnam using Landsat 8 OLI satellite images, a free data source with reasonable spatial and temporal resolution. The results of the study show that all three algorithms performed well for five basic types of land cover, namely Rice land, Water bodies, Perennial vegetation, Annual vegetation, and Built-up areas, as their overall accuracy and Kappa coefficient were greater than 80% and 0.8, respectively. Among the three algorithms, SVM achieved the highest accuracy, with an overall accuracy of 86% and a Kappa coefficient of 0.88. Land cover classification based on the SVM algorithm shows that Built-up areas cover the largest area with nearly 31,495 ha, accounting for more than 33.8% of the total natural area, followed by Rice land and Perennial vegetation, which cover over 30,767 ha (33%) and 15,637 ha (16.8%), respectively. Water bodies and Annual vegetation cover the smallest areas with 8,820 ha (9.5%) and 6,302 ha (6.8%), respectively. The results of this study can be used for land use management and planning as well as other natural resource and environmental management purposes in the province.
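A hedged sketch of this workflow in the Earth Engine Python API is shown below; the training-asset ID and the 'landcover' property on the sample points are hypothetical placeholders, while the Landsat 8 Collection 2 Level-2 bands are standard.

```python
# Sketch of the GEE classification workflow: composite Landsat 8 imagery,
# sample training points, train an SVM classifier, classify the composite.
import ee
ee.Initialize()                           # assumes prior ee.Authenticate()

bands = ['SR_B2', 'SR_B3', 'SR_B4', 'SR_B5', 'SR_B6', 'SR_B7']
image = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
         .filterDate('2020-01-01', '2020-12-31')
         .median()
         .select(bands))
# hypothetical asset of labeled points with a 'landcover' property
training_points = ee.FeatureCollection('users/example/hung_yen_samples')
samples = image.sampleRegions(collection=training_points,
                              properties=['landcover'], scale=30)
classifier = ee.Classifier.libsvm().train(   # SVM scored highest here
    features=samples, classProperty='landcover', inputProperties=bands)
classified = image.classify(classifier)
```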
The integration of renewable energy sources into modern power systems necessitates efficient and robust control strategies to address challenges such as power quality, stability, and dynamic environmental variations. This paper presents a novel sparrow search algorithm (SSA)-tuned proportional-integral (PI) controller for grid-connected photovoltaic (PV) systems, designed to optimize dynamic performance, energy extraction, and power quality. Key contributions include the development of a systematic SSA-based optimization framework for real-time PI parameter tuning, ensuring precise voltage and current regulation, improved maximum power point tracking (MPPT) efficiency, and minimized total harmonic distortion (THD). The proposed approach is evaluated against conventional PSO-based and P&O controllers through comprehensive simulations, demonstrating its superior performance across key metrics: a 39.47% faster response time compared to PSO, a 12.06% increase in peak active power relative to P&O, and a 52.38% reduction in THD, ensuring compliance with IEEE grid standards. Moreover, the SSA-tuned PI controller exhibits enhanced adaptability to dynamic irradiance fluctuations, rapid response time, and robust grid integration under varying conditions, making it highly suitable for real-time smart grid applications. This work establishes the SSA-tuned PI controller as a reliable and efficient solution for improving PV system performance in grid-connected scenarios, while also setting the foundation for future research into multi-objective optimization, experimental validation, and hybrid renewable energy systems.
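The tuning loop itself reduces to searching PI gains against a closed-loop cost. The toy sketch below minimizes the ITAE of a first-order plant's step response, with plain random search standing in for SSA and the plant standing in for the full PV/grid model; it only illustrates the shape of the optimization, not the paper's simulation setup.

```python
# Toy PI tuning loop: score (Kp, Ki) pairs by the ITAE of a unit-step
# response of a first-order plant, integrated with forward Euler.
import numpy as np

def itae(kp, ki, dt=0.01, T=5.0, tau=0.5):
    y, integ, cost = 0.0, 0.0, 0.0
    for k in range(int(T / dt)):
        e = 1.0 - y                        # unit step reference
        integ += e * dt
        u = kp * e + ki * integ            # PI control law
        y += dt * (u - y) / tau            # plant: tau*dy/dt = u - y
        cost += (k * dt) * abs(e) * dt     # integral of time-weighted |e|
    return cost

rng = np.random.default_rng(5)             # random search stands in for SSA
best = min(((itae(kp, ki), kp, ki)
            for kp, ki in rng.uniform(0.1, 10.0, (200, 2))),
           key=lambda t: t[0])
print("ITAE=%.4f  Kp=%.2f  Ki=%.2f" % best)
```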
We propose a robust earthquake clustering method: the Bayesian Gaussian mixture model with nearest-neighbor distance (BGMM-NND) algorithm. Unlike the conventional nearest-neighbor distance method, the BGMM-NND algorithm eliminates the need for hyperparameter tuning or reliance on fixed thresholds, offering enhanced flexibility for clustering across varied seismic scales. By integrating cumulative probability and BGMM with principal component analysis (PCA), the BGMM-NND algorithm effectively distinguishes between background and triggered earthquakes while maintaining the magnitude component and resolving the issue of excessively large spatial cluster domains. We apply the BGMM-NND algorithm to the Sichuan–Yunnan seismic catalog from 1971 to 2024, revealing notable variations in earthquake frequency, triggering characteristics, and recurrence patterns across different fault zones. Distinct clustering and triggering behaviors are identified along different segments of the Longmenshan Fault. Multiple seismic modes, namely the short-distance mode, the medium-distance mode, the repeating-like mode, the uniform background mode, and the Wenchuan mode, are uncovered. The algorithm's flexibility and robust performance in earthquake clustering make it a valuable tool for exploring seismicity characteristics, offering new insights into earthquake clustering and the spatiotemporal patterns of seismic activity.
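A minimal sketch of the classification step: project nearest-neighbor features with PCA and fit scikit-learn's Bayesian Gaussian mixture. The feature matrix here is synthetic; constructing Zaliapin-style rescaled time, distance, and magnitude features from a real catalog is the part the paper itself describes.

```python
# Fit a Bayesian Gaussian mixture (after PCA) to nearest-neighbor features
# to separate background from triggered events. Features are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# toy log-features: [log rescaled time, log rescaled distance, magnitude]
background = rng.normal([0.0, 0.0, 2.0], 0.5, (300, 3))
triggered = rng.normal([-2.0, -2.0, 2.5], 0.5, (120, 3))
features = np.vstack([background, triggered])

z = PCA(n_components=2).fit_transform(features)
bgmm = BayesianGaussianMixture(n_components=2, random_state=0).fit(z)
labels = bgmm.predict(z)          # two clusters ~ background vs. triggered
print(np.bincount(labels))
```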
With the rapid advancement of medical artificial intelligence (AI) technology, particularly the widespread adoption of AI diagnostic systems, ethical challenges in medical decision-making have garnered increasing attention. This paper analyzes the limitations of algorithmic ethics in medical decision-making and explores accountability mechanisms, aiming to provide theoretical support for ethically informed medical practices. The study highlights how the opacity of AI algorithms complicates the definition of decision-making responsibility, undermines doctor-patient trust, and affects informed consent. By thoroughly investigating issues such as the algorithmic “black box” problem and data privacy protection, we develop accountability assessment models to address ethical concerns related to medical resource allocation. Furthermore, this research examines the effective implementation of AI diagnostic systems through case studies of both successful and unsuccessful applications, extracting lessons on accountability mechanisms and response strategies. Finally, we emphasize that establishing a transparent accountability framework is crucial for enhancing the ethical standards of medical AI systems and protecting patients’ rights and interests.
To tackle the path planning problem, this study introduced a novel algorithm called two-stage parameter adjustment-based differential evolution (TPADE). This algorithm draws inspiration from group behavior to implement a two-stage scaling factor variation strategy. In the initial phase, it adapts according to environmental complexity. In the following phase, it combines individual and global experiences to fine-tune the orientation factor, effectively improving its global search capability. Furthermore, this study developed a new population update method, ensuring that well-adapted individuals are retained, which enhances population diversity. In benchmark function tests across different dimensions, the proposed algorithm consistently demonstrates superior convergence accuracy and speed. This study also tested the TPADE algorithm in path planning simulations. The experimental results reveal that the TPADE algorithm outperforms existing algorithms by achieving path lengths of 28.527138 and 31.963990 in simple and complex map environments, respectively. These findings indicate that the proposed algorithm is more adaptive and efficient in path planning.
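The two-stage scaling-factor idea can be grafted onto a textbook DE/rand/1/bin skeleton, as in the hedged sketch below: a large F early for exploration, a small F later for refinement. TPADE's actual adaptation schedules and population update method are more involved than this stand-in.

```python
# DE/rand/1/bin skeleton with a stage-switched scaling factor F, to
# illustrate the two-stage idea behind TPADE (not the paper's exact rules).
import numpy as np

def de_minimize(obj, dim, pop=40, iters=200, cr=0.9, lb=-5.0, ub=5.0):
    rng = np.random.default_rng(2)
    x = rng.uniform(lb, ub, (pop, dim))
    fit = np.apply_along_axis(obj, 1, x)
    for t in range(iters):
        F = 0.9 if t < iters // 2 else 0.4   # stage 1 explore, stage 2 refine
        for i in range(pop):
            a, b, c = x[rng.choice([j for j in range(pop) if j != i],
                                   3, replace=False)]
            mutant = np.clip(a + F * (b - c), lb, ub)
            cross = rng.random(dim) < cr
            cross[rng.integers(dim)] = True   # guarantee one mutant gene
            trial = np.where(cross, mutant, x[i])
            tf = obj(trial)
            if tf < fit[i]:                   # greedy one-to-one selection
                x[i], fit[i] = trial, tf
    return x[fit.argmin()], fit.min()

print(de_minimize(lambda z: float(np.sum(z ** 2)), dim=5))
```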
This paper introduces a hybrid multi-objective optimization algorithm, designated HMODESFO, which amalgamates the exploratory prowess of Differential Evolution (DE) with the rapid convergence attributes of the Sailfish Optimization (SFO) algorithm. The primary objective is to address multi-objective optimization challenges within mechanical engineering, with a specific emphasis on planetary gearbox optimization. The algorithm is equipped with the ability to dynamically select the optimal mutation operator, contingent upon an adaptive normalized population-spacing parameter. The efficacy of HMODESFO has been substantiated through rigorous validation against established industry benchmarks, including a suite of Zitzler-Deb-Thiele (ZDT) and Deb-Thiele-Laumanns-Zitzler (DTLZ) problems, where it exhibited superior performance. The outcomes underscore the algorithm’s markedly enhanced optimization capabilities relative to existing methods, particularly in tackling highly intricate multi-objective planetary gearbox optimization problems. Additionally, the performance of HMODESFO is evaluated against selected well-known mechanical engineering test problems, further accentuating its adeptness in resolving complex optimization challenges within this domain.
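Whatever the hybrid's internals, ranking candidate designs in a multi-objective setting rests on Pareto dominance. A minimal dominance filter for minimization problems:

```python
# Pareto non-dominated filter: keep rows of the objective matrix F that no
# other row dominates (all objectives <= and at least one strictly <).
import numpy as np

def non_dominated(F):
    """Return a boolean mask of non-dominated rows of F (minimization)."""
    n = len(F)
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        dominated = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        if dominated.any():
            mask[i] = False
    return mask

F = np.array([[1.0, 4.0], [2.0, 2.0], [3.0, 3.0], [4.0, 1.0]])
print(non_dominated(F))   # [ True  True False  True]
```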
With the growth of deregulated retail power markets, end users with smart meters and controllers may optimize their energy-cost portfolios by comparing price plans offered by several retail energy firms. To help smart grid end-users decrease power payments and usage dissatisfaction, this article suggests a decision system based on reinforcement learning to aid with electricity price plan selection. An enhanced state-based Markov decision process (MDP) without transition probabilities models the decision problem. A kernel-approximation-integrated batch Q-learning approach is used to solve it. Several adjustments to the sampling and data representation are made to increase the computational and prediction performance. Using a continuous high-dimensional state space, the suggested approach can uncover the underlying characteristics of time-varying pricing schemes. Without any advance knowledge of the market environment, the best decision-making policy may be learned via case studies that use data from actual historical price plans. Experiments show that the suggested decision approach can reduce cost and energy-usage dissatisfaction by using user data to build an accurate prediction strategy. This research also examines how smart city energy planners rely on precise load forecasts. It presents a hybrid method that extracts associated characteristics to improve accuracy in residential power consumption forecasts using machine learning (ML). Forecast precision is measured with loss functions such as the RMSE. In response to the growing interest in explainable artificial intelligence (XAI), this research presents a methodology for estimating smart home energy usage. Using SHapley Additive exPlanations (SHAP), this strategy makes it easy for consumers to comprehend their energy use trends. To predict future energy use, the study employs gradient boosting in conjunction with long short-term memory neural networks.
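A hedged miniature of the batch Q-learning component: fitted Q-iteration over a fixed batch of transitions, with scikit-learn's kernel ridge regressor standing in for the paper's kernel approximation, and toy state/action/reward data in place of real price-plan histories.

```python
# Fitted Q-iteration on a batch of (state, action, reward, next-state)
# transitions; a kernel regressor approximates Q(s, a) at each sweep.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(3)
n, n_actions, gamma = 500, 3, 0.95
S = rng.random((n, 4))                     # states: usage/price features
A = rng.integers(n_actions, size=n)        # actions: chosen price plan
R = -rng.random(n)                         # rewards: negative cost
S2 = rng.random((n, 4))                    # next states

def featurize(s, a):
    one_hot = np.eye(n_actions)[a]         # encode the discrete action
    return np.hstack([s, one_hot])

q = None
for _ in range(20):                        # Q-iteration sweeps over the batch
    if q is None:
        target = R                         # first sweep: immediate reward
    else:
        nxt = np.stack([q.predict(featurize(S2, np.full(n, a)))
                        for a in range(n_actions)], axis=1)
        target = R + gamma * nxt.max(axis=1)   # bootstrapped Bellman target
    q = KernelRidge(kernel='rbf').fit(featurize(S, A), target)

greedy = np.stack([q.predict(featurize(S, np.full(n, a)))
                   for a in range(n_actions)], axis=1).argmax(axis=1)
print("plan choices:", np.bincount(greedy, minlength=n_actions))
```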
Conventional empirical equations for estimating undrained shear strength (s_u) from piezocone penetration test (CPTu) data, without incorporating soil physical properties, often lack the accuracy and robustness required for geotechnical site investigations. This study introduces a hybrid virus colony search (VCS) algorithm that integrates the standard VCS algorithm with a mutation-based search mechanism to develop high-performance XGBoost learning models to address this limitation. A dataset of 372 seismic CPTu records and corresponding soil physical properties from 26 geotechnical projects in Jiangsu Province, China, was collected for model development. Comparative evaluations demonstrate that the proposed hybrid VCS-XGBoost model exhibits superior performance compared to standard meta-heuristic-algorithm-based XGBoost models. The results highlight that the consideration of soil physical properties significantly improves the predictive accuracy of s_u, emphasizing the importance of considering additional soil information beyond CPTu data for accurate s_u estimation.
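A hedged sketch of metaheuristic-tuned XGBoost for s_u regression is shown below; plain random search stands in for the hybrid VCS algorithm (whose update rules the abstract does not give), and the feature matrix is synthetic rather than real CPTu and soil-property data.

```python
# Hyperparameter search wrapped around an XGBoost regressor, scored by
# cross-validated R^2; random search stands in for the VCS metaheuristic.
import numpy as np
import xgboost as xgb
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.random((372, 8))                   # stand-in CPTu + soil features
y = 50 + 100 * X[:, 0] + 10 * rng.standard_normal(372)   # toy s_u (kPa)

best_score, best_params = -np.inf, None
for _ in range(30):                        # stand-in for VCS iterations
    params = dict(n_estimators=int(rng.integers(100, 500)),
                  max_depth=int(rng.integers(2, 8)),
                  learning_rate=float(rng.uniform(0.01, 0.3)))
    model = xgb.XGBRegressor(**params)
    score = cross_val_score(model, X, y, cv=5, scoring='r2').mean()
    if score > best_score:
        best_score, best_params = score, params
print(best_params, round(best_score, 3))
```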