The liquid cooling system (LCS) of fuel cells is challenged by significant time delays, model uncertainties, pump-fan coupling, and frequent disturbances, leading to overshoot and control oscillations that degrade temperature regulation performance. To address these challenges, we propose a composite control scheme combining fuzzy logic and a variable-gain generalized super-twisting algorithm (VG-GSTA). First, a one-dimensional (1D) fuzzy logic controller (FLC) for the pump ensures stable coolant flow, while a two-dimensional (2D) FLC for the fan regulates the stack temperature near the reference value. The VG-GSTA is then introduced to eliminate steady-state errors, offering resistance to disturbances and minimizing control oscillations. The equilibrium optimizer is used to fine-tune the VG-GSTA parameters. Co-simulation verifies the effectiveness of our method, demonstrating its advantages in terms of disturbance immunity, overshoot suppression, tracking accuracy, and response speed.
Algorithms are the primary component of Artificial Intelligence (AI): an algorithm is the process by which AI imitates the human mind to solve problems. Currently, the performance of AI is evaluated by scoring AI algorithms with metrics on data sets. However, this evaluation is challenging because each type of algorithm is associated with many data sets and evaluation metrics. Different algorithms may have individual strengths and weaknesses in metric scores on separate data sets, which undermines the credibility and validity of the evaluation. Moreover, evaluating algorithms requires repeated experiments on different data sets, diverting researchers' attention from the algorithms themselves. Crucially, comparing metric scores in this way does not take into account an algorithm's ability to solve problems. Classical evaluation by time and space complexity is also unsuitable for AI algorithms, because the input of a classical algorithm is unbounded, whereas the input of an AI algorithm is a data set, which is finite and multifarious. Given that current AI algorithm evaluation does not reflect problem-solving capability, this paper summarizes the features of AI algorithm evaluation and proposes an evaluation method that incorporates the problem-solving capabilities of algorithms.
In this research work, localized generation from renewable resources and the distribution of energy to agricultural loads, i.e., a local microgrid concept, have been considered and assessed for feasibility. Two dispatch algorithms, Cycle Charging and Load Following, are implemented to find the optimal solution (net cost, operation cost, carbon emission, energy cost, component sizing, etc.) of the hybrid system. The microgrid is also modeled on the DIgSILENT Power Factory platform, and the respective power system responses are then evaluated. The development of dispatch algorithms specifically tailored for agricultural applications enables dynamic management of energy flows, responding to fluctuating demands and resource availability in real time. Through careful consideration of factors such as seasonal variations and irrigation requirements, these algorithms enhance the resilience and adaptability of the microgrid under dynamic operational conditions. However, both approaches produced the same techno-economic results with no significant difference, illustrating that the considered microgrid can be implemented with either strategy without significant variation in performance. The study shows that harmful gas emissions are limited to 17,928 kg/year of CO_(2) and 77.7 kg/year of sulfur dioxide. For the proposed microgrid and a load profile of 165.29 kWh/day, the net present cost is USD 718,279, and the cost of energy is USD 0.0463 with a renewable fraction of 97.6%. The optimal sizes for the PV, biomass, grid, electrolyzer, and converter components are 1494, 500, 999,999, 500, and 495 kW, respectively. For the hydrogen tank (HTank), the optimal size is found to be 350 kg. This research work provides critical insights into the techno-economic feasibility and environmental impact of integrating biomass-PV-hydrogen storage-grid hybrid renewable microgrids into agricultural settings.
Cluster-based models have numerous application scenarios in vehicular ad-hoc networks (VANETs) and can greatly help improve the communication performance of VANETs. However, the frequent movement of vehicles often leads to changes in the network topology, thereby reducing cluster stability in urban scenarios. To address this issue, we propose a clustering model based on the density peak clustering (DPC) method and the sparrow search algorithm (SSA), named SDPC. First, the model constructs a fitness function based on the parameters obtained from the DPC method and deploys the SSA for iterative optimization to select cluster heads (CHs). Then, the vehicles that have not been selected as CHs are assigned to appropriate clusters by comprehensively considering a distance parameter and a link-reliability parameter. Finally, cluster maintenance strategies are employed to handle changes in the clusters' organizational structure. To verify the performance of the model, we conducted a simulation on a real-world scenario over multiple metrics related to cluster stability. The results show that, compared with APROVE and GAPC, SDPC exhibits clear performance advantages, indicating that SDPC can effectively ensure VANET cluster stability in urban scenarios.
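The abstract names density peak clustering (DPC) as the source of the fitness-function parameters but gives no formulas. The sketch below shows the two standard DPC quantities, local density rho and separation delta, whose product ranks cluster-head candidates; the Gaussian kernel, the cutoff `dc`, and the toy coordinates are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def dpc_scores(points, dc):
    """Core of density peak clustering (DPC): for each point compute the local
    density rho (Gaussian kernel with cutoff dc) and delta, the distance to the
    nearest point of higher density. High rho * delta marks CH candidates."""
    pts = np.asarray(points, dtype=float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    rho = np.exp(-(d / dc) ** 2).sum(axis=1) - 1.0   # subtract the self-term
    delta = np.empty(len(pts))
    for i in range(len(pts)):
        higher = np.where(rho > rho[i])[0]
        delta[i] = d[i, higher].min() if len(higher) else d[i].max()
    return rho, delta

# Two tight groups of "vehicles"; the densest point of each group should
# stand out with a large rho * delta product.
pts = [[0, 0], [0.1, 0], [0, 0.1], [5, 5], [5.1, 5], [5, 5.1]]
rho, delta = dpc_scores(pts, dc=1.0)
heads = np.argsort(rho * delta)[-2:]
```

In SDPC these quantities would feed the SSA fitness function rather than be used directly, but the ranking intuition is the same: one head emerges per dense group.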
The radio environment plays an important role in radio astronomy observations, and further analysis of the time and intensity distributions of interference signals is needed for long-term radio environment monitoring. The sample variance is an important estimate for the interference-signal decision threshold. Here, we propose an improved algorithm for calculating the data sample variance that relies on four established statistical methods: the variance of the trimmed data, the winsorized sample variance, the median absolute deviation, and the median of the trimmed-data pairwise averaged squares. The variance and decision threshold in the protected section of the radio astronomy L-band are calculated. Among the four methods, the improved median of the trimmed-data pairwise averaged squares algorithm has the highest accuracy, but in a comparison of overall experimental results, the cleanliness rate of all algorithms is above 96%. In a comparison between the improved algorithm and the four methods, the cleanliness rate of the improved algorithm is above 98%, verifying its feasibility. The time-intensity interference distribution in the radio protection band is also obtained. Finally, we use comprehensive monitoring data of radio astronomy protection bands, radio interference bands, and interfered frequency bands to establish a comprehensive evaluation system for radio observatory sites, including the observable time proportion in the radio astronomy protection band, the occasional time-intensity distribution in the radio interference band, and the intensity distribution of the interfered frequency band.
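Three of the four estimators named above have compact textbook forms; a minimal sketch follows. The trimming fraction, the Gaussian-consistency constant 1.4826 for the MAD, and the synthetic RFI trace are assumptions, not the paper's settings.

```python
import numpy as np

def trimmed_variance(x, frac=0.1):
    """Variance of the data after discarding the lowest/highest frac of samples."""
    x = np.sort(np.asarray(x, dtype=float))
    k = int(len(x) * frac)
    return x[k:len(x) - k].var(ddof=1)

def winsorized_variance(x, frac=0.1):
    """Variance after clipping the extreme frac of samples to the cut values."""
    x = np.sort(np.asarray(x, dtype=float))
    k = int(len(x) * frac)
    return np.clip(x, x[k], x[len(x) - k - 1]).var(ddof=1)

def mad_scale(x):
    """Median absolute deviation, scaled to estimate sigma for Gaussian data."""
    x = np.asarray(x, dtype=float)
    return 1.4826 * np.median(np.abs(x - np.median(x)))

# Gaussian noise floor plus a few strong interference spikes.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 1000)
x[:10] += 50.0                       # simulated RFI bursts
naive = x.var(ddof=1)                # inflated by the bursts
robust = trimmed_variance(x)         # close to the clean noise floor
```

The point of the comparison: the naive sample variance is dragged up by the bursts, while the robust estimates stay near the clean noise level, yielding a usable decision threshold.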
To solve the Poisson equation, it is usually possible to discretize it into the corresponding linear system Ax = b. Variational quantum algorithms (VQAs) for the discretized Poisson equation have been studied before. We present a VQA based on banded Toeplitz systems for solving the Poisson equation that exploits the structural features of the matrix A. In detail, we decompose the matrices A and A^(2) into a linear combination of the corresponding banded Toeplitz matrix and sparse matrices with only a few non-zero elements. For the one-dimensional Poisson equation with different boundary conditions and the d-dimensional Poisson equation with Dirichlet boundary conditions, the number of decomposition terms is less than that reported in [Phys. Rev. A 108, 032418 (2023)]. Based on the decomposition of the matrix, we design quantum circuits that efficiently evaluate the cost function. Additionally, numerical simulation verifies the feasibility of the proposed algorithm. Finally, VQAs for linear systems of equations and matrix-vector multiplications with a K-banded Toeplitz matrix T_(n)^(K) are given, where T_(n)^(K) ∈ R^(n×n) and K ∈ O(polylog n).
Image enhancement utilizes intensity transformation functions to maximize the information content of enhanced images. This paper approaches the topic as an optimization problem and uses the bald eagle search (BES) algorithm to achieve optimal results. In our proposed model, gamma correction and Retinex address color cast issues and enhance image edges and details, and the final enhanced image is obtained through color balancing. The BES algorithm seeks the optimal solution through the selection, search, and swooping stages; however, it is prone to getting stuck in local optima and converges slowly. To overcome these limitations, we propose an improved BES algorithm (ABES) with enhanced population learning, position updates, and control parameters. ABES is employed to optimize the core parameters of gamma correction and Retinex to improve image quality, with maximization of information entropy as the objective function. Real benchmark images are collected to validate its performance. Experimental results demonstrate that ABES outperforms existing image enhancement methods, including the flower pollination algorithm, the chimp optimization algorithm, particle swarm optimization, and BES, in terms of information entropy, peak signal-to-noise ratio (PSNR), structural similarity index (SSIM), and patch-based contrast quality index (PCQI). ABES performs better both qualitatively and quantitatively, enhancing prominent features and contrast while maintaining the natural appearance of the original images.
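The objective used above, maximizing the information entropy of the enhanced image, can be illustrated with plain gamma correction. A coarse grid search stands in for the ABES optimizer here, and the synthetic over-bright image and gamma range are assumptions, not the paper's benchmark setup.

```python
import numpy as np

def gamma_correct(img, gamma):
    """Power-law intensity transformation on a [0, 1] image."""
    return np.clip(img, 0.0, 1.0) ** gamma

def entropy(img, bins=256):
    """Shannon entropy (bits) of the grey-level histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# A coarse 1-D sweep over gamma replaces the selection/search/swooping
# stages of the (A)BES optimizer used in the paper.
rng = np.random.default_rng(1)
img = rng.beta(8, 2, (64, 64))          # synthetic over-bright image
gammas = np.linspace(0.2, 5.0, 49)      # grid includes the identity gamma = 1
best_gamma = max(gammas, key=lambda g: entropy(gamma_correct(img, g)))
```

For a bright image the entropy-maximizing gamma lands above 1, spreading the compressed highlights over more grey levels, which is exactly the behavior the optimizer is asked to find.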
Since the concept of quantum information masking was proposed by Modi et al (2018 Phys. Rev. Lett. 120, 230501), many interesting and significant results have been reported, both theoretically and experimentally. However, designing a quantum information masker is not an easy task, especially for larger systems. In this paper, we propose a variational quantum algorithm to resolve this problem. Specifically, our algorithm is a hybrid quantum-classical model, in which a quantum device with adjustable parameters tries to mask quantum information while a classical device evaluates the performance of the quantum device and optimizes its parameters. After optimization, the quantum device behaves as an optimal masker. The loss value during optimization can be used to characterize the performance of the masker. In particular, if the loss value converges to zero, we obtain a perfect masker that completely masks the quantum information generated by the quantum information source; otherwise, a perfect masker does not exist and the subsystems always contain the original information. Nevertheless, the resulting maskers are still optimal. Quantum parallelism is utilized to reduce quantum state preparations and measurements. Our study paves the way for wide application of quantum information masking, and some of the techniques used here may have potential applications in quantum information processing.
Cardiovascular disease prediction is a significant area of research in healthcare management systems (HMS), as deaths can only be reduced if cardiac problems are anticipated in advance. Existing heart disease detection systems using machine learning have not yet produced sufficient results owing to their reliance on the available data. We present a clustered butterfly optimization technique (Rough K-means + BOA) as a new hybrid method for predicting heart disease. The method comprises two phases: clustering the data using Rough K-means (RKM) and analyzing the data using the butterfly optimization algorithm (BOA). The benchmark dataset from the UCI repository is used for our experiments, which are divided into three sets: the first involves the RKM clustering technique, the second evaluates the classification outcomes, and the third validates the performance of the proposed hybrid model. The proposed Rough K-means + BOA achieves a reasonable accuracy of 97.03% and a minimal error rate of 2.97%, which is better than other combinations of optimization techniques. In addition, the approach effectively enhances data segmentation, optimization, and classification performance.
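The abstract does not spell out the BOA update rules; below is a minimal sketch of the commonly published form (fragrance f = c·I^a with a probabilistic switch between a global move toward the best butterfly and a local move between random peers), minimizing a sphere function as a stand-in for the paper's classification objective. All parameter values, the greedy replacement step, and the stimulus mapping are illustrative assumptions.

```python
import numpy as np

def boa_minimize(obj, dim, n=20, iters=200, c=0.01, a=0.1, p=0.8, seed=0):
    """Minimal butterfly optimization algorithm (BOA) sketch. Each butterfly
    emits fragrance f = c * I**a; with probability p it moves toward the
    global best (global search), otherwise toward two random peers (local
    search). A greedy replacement keeps only improving moves."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5.0, 5.0, (n, dim))
    fit = np.apply_along_axis(obj, 1, X)
    for _ in range(iters):
        best = X[fit.argmin()]
        I = 1.0 / (1.0 + fit)                 # better fitness -> stronger stimulus
        f = c * I ** a
        for i in range(n):
            r = rng.random()
            if rng.random() < p:
                step = (r * r * best - X[i]) * f[i]        # global phase
            else:
                j, k = rng.integers(0, n, 2)
                step = (r * r * X[j] - X[k]) * f[i]        # local phase
            cand = np.clip(X[i] + step, -5.0, 5.0)
            cf = obj(cand)
            if cf < fit[i]:
                X[i], fit[i] = cand, cf       # greedy replacement
    return X[fit.argmin()], float(fit.min())

sphere = lambda x: float((x ** 2).sum())
xbest, fbest = boa_minimize(sphere, dim=4)
```

In the paper's pipeline, `obj` would score a candidate feature/parameter configuration against the RKM-clustered heart-disease data rather than a benchmark function.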
Background: In recent years, there has been a growing trend toward observational studies that make use of routinely collected healthcare data (RCD). These studies rely on algorithms to identify specific health conditions (e.g., diabetes or sepsis) for statistical analyses. However, there has been substantial variation in algorithm development and validation, leading to frequently suboptimal performance and posing a significant threat to the validity of study findings; unfortunately, these issues are often overlooked. Methods: We systematically developed guidance for the development, validation, and evaluation of algorithms designed to identify health status (DEVELOP-RCD). Our initial efforts involved conducting both a narrative review and a systematic review of published studies on the concepts and methodological issues related to algorithm development, validation, and evaluation. Subsequently, we conducted an empirical study of an algorithm for identifying sepsis. Based on these findings, we formulated a specific workflow and recommendations for algorithm development, validation, and evaluation within the guidance. Finally, the guidance underwent independent review by a panel of 20 external experts, who then convened a consensus meeting to finalize it. Results: A standardized workflow for algorithm development, validation, and evaluation was established. Guided by specific health status considerations, the workflow comprises four integrated steps: assessing an existing algorithm's suitability for the target health status; developing a new algorithm using recommended methods; validating the algorithm using prescribed performance measures; and evaluating the impact of the algorithm on study results. Additionally, 13 good practice recommendations were formulated with detailed explanations, and a practical study on sepsis identification was included to demonstrate the application of this guidance. Conclusions: The guidance is intended to aid researchers and clinicians in the appropriate and accurate development and application of algorithms for identifying health status from RCD, and it has the potential to enhance the credibility of findings from observational studies involving RCD.
Autism Spectrum Disorder (ASD) is a complex neurodevelopmental condition that causes multiple challenges in behavioral and communication activities. Because ASD-related medical data are sensitive, security measures are integrated responsibly and effectively in this research to develop the Mobile Neuron Attention Stage-by-Stage Network (MNASNet) model, which combines the Mobile Network (MobileNet) with stage-by-stage neuron attention. The steps followed to detect ASD from privacy-preserved data are data normalization, data augmentation, and K-Anonymization. The clinical data of individuals are first preprocessed using Z-score normalization, and data augmentation is then performed using an oversampling technique. Subsequently, K-Anonymization is carried out using the Black-winged Kite Algorithm to ensure the privacy of the medical data, where the best fitness solution is based on data utility and privacy. Finally, after improving data privacy, the developed MNASNet approach is applied to ASD detection, achieving highly accurate results compared with traditional methods for detecting autism behavior. The final results illustrate that the proposed MNASNet achieves an accuracy of 92.9%, a TPR of 95.9%, and a TNR of 90.9% at k = 8.
As vehicular networks grow increasingly complex due to high node mobility and dynamic traffic conditions, efficient clustering mechanisms are vital to ensure stable and scalable communication, and recent studies have emphasized the need for adaptive clustering strategies to improve performance in Intelligent Transportation Systems (ITS). This paper presents the Grasshopper Optimization Algorithm for Vehicular Network Clustering (GOA-VNET), an innovative approach to optimal vehicular clustering in Vehicular Ad-Hoc Networks (VANETs) that leverages the Grasshopper Optimization Algorithm (GOA) to address the critical challenges of traffic congestion and communication inefficiency in ITS. The proposed GOA-VNET employs an iterative and interactive optimization mechanism to dynamically adjust node positions and cluster configurations, ensuring robust adaptability to varying vehicular densities and transmission ranges. Key features of GOA-VNET include the utilization of attraction-zone, repulsion-zone, and comfort-zone parameters, which collectively enhance clustering efficiency and minimize congestion within Regions of Interest (ROI). By managing cluster configurations and node densities effectively, GOA-VNET ensures balanced load distribution and seamless data transmission, even in scenarios with high vehicular densities and varying transmission ranges. Comparative evaluations against the Whale Optimization Algorithm (WOA) and Grey Wolf Optimization (GWO) demonstrate that GOA-VNET consistently outperforms these methods, achieving superior clustering efficiency, reducing the number of clusters by up to 10% in high-density scenarios, and improving data transmission reliability. Simulation results reveal that over a 100-600 m transmission range, GOA-VNET achieves an average reduction of 8%-15% in the number of clusters and maintains a 5%-10% improvement in packet delivery ratio (PDR) compared with baseline algorithms. Additionally, the algorithm incorporates a heat-transfer-inspired load-balancing mechanism, ensuring equitable distribution of nodes among cluster leaders (CLs) and maintaining a stable network environment. These results validate GOA-VNET as a reliable and scalable solution for VANETs, with significant potential to support next-generation ITS. Future research could further enhance the algorithm by integrating multi-objective optimization techniques and exploring broader applications in complex traffic scenarios.
The widespread adoption of cloud computing has underscored the critical importance of efficient resource allocation and management, particularly in task scheduling, which involves assigning tasks to computing resources for optimized resource utilization. Several meta-heuristic algorithms have shown effectiveness in task scheduling, among which the relatively recent Willow Catkin Optimization (WCO) algorithm has demonstrated potential, albeit with apparent needs for enhanced global search capability and convergence speed. To address these limitations of WCO in cloud computing task scheduling, this paper introduces an improved version termed the Advanced Willow Catkin Optimization (AWCO) algorithm. AWCO enhances the algorithm's performance by augmenting its global search capability through a quasi-opposition-based learning strategy and accelerating its convergence speed via sinusoidal mapping. A comprehensive evaluation utilizing the CEC2014 benchmark suite, comprising 30 test functions, demonstrates that AWCO achieves superior optimization outcomes, surpassing conventional WCO and a range of established meta-heuristics. The proposed algorithm also considers trade-offs among the cost, makespan, and load-balancing objectives. Experimental results of AWCO are compared with those obtained using the other meta-heuristics, illustrating that the proposed algorithm provides superior performance in task scheduling and offers a robust foundation for enhancing the utilization of resources within a cloud computing environment.
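The two WCO enhancements named above, quasi-opposition-based learning and sinusoidal mapping, have standard forms that can be sketched directly. The search bounds, the map coefficient a = 2.3, and the population shape below are assumptions, not AWCO's actual configuration.

```python
import numpy as np

def sinusoidal_map(x0, n, a=2.3):
    """Sinusoidal chaotic map x_{k+1} = a * x_k^2 * sin(pi * x_k), commonly
    used to generate well-spread sequences in (0, 1) for meta-heuristics."""
    xs = np.empty(n)
    x = x0
    for k in range(n):
        x = a * x * x * np.sin(np.pi * x)
        xs[k] = x
    return xs

def quasi_opposite(pop, lo, hi, rng):
    """Quasi-opposition-based learning: for each candidate x, sample a point
    uniformly between the interval centre c and the opposite point lo+hi-x."""
    c = (lo + hi) / 2.0
    opp = lo + hi - pop
    lo_q, hi_q = np.minimum(c, opp), np.maximum(c, opp)
    return rng.uniform(lo_q, hi_q)

rng = np.random.default_rng(2)
pop = rng.uniform(0.0, 10.0, (8, 3))          # 8 candidates, 3 decision variables
qpop = quasi_opposite(pop, 0.0, 10.0, rng)    # quasi-opposite candidates
chaos = sinusoidal_map(0.7, 100)              # chaotic sequence for step sizes
```

In an AWCO-style loop, the quasi-opposite population would be evaluated alongside the original one (keeping the fitter of each pair), while the chaotic sequence replaces uniform random draws to diversify moves.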
The objective of this study is to develop an advanced approach to variogram modelling by integrating genetic algorithms (GA) with machine learning-based linear regression, aiming to improve the accuracy and efficiency of geostatistical analysis, particularly in mineral exploration. The study combines GA and machine learning to optimise variogram parameters, including range, sill, and nugget, by minimising the root mean square error (RMSE) and maximising the coefficient of determination (R^(2)). The experimental variograms were computed and modelled using theoretical models, followed by optimisation via evolutionary algorithms. The method was applied to gravity data from the Ngoura-Batouri-Kette mining district in Eastern Cameroon, covering 141 data points. Sequential Gaussian Simulations (SGS) were employed for predictive mapping to validate simulated results against true values. Key findings show variograms with ranges between 24.71 km and 49.77 km, and optimised RMSE and R^(2) values of 11.21 mGal^(2) and 0.969, respectively, after 42 generations of GA optimisation. Predictive mapping using SGS demonstrated that simulated values closely matched true values, with a simulated mean of 21.75 mGal compared with the true mean of 25.16 mGal, and variances of 465.70 mGal^(2) and 555.28 mGal^(2), respectively. The results confirmed spatial variability and anisotropies in the N170-N210 directions, consistent with prior studies. This work presents a novel integration of GA and machine learning for variogram modelling, offering an automated, efficient approach to parameter estimation. The methodology significantly enhances predictive geostatistical models, contributing to the advancement of mineral exploration and improving the precision and speed of decision-making in the petroleum and mining industries.
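The GA-based fitting of (nugget, sill, range) described above can be sketched with the standard spherical variogram model and a tiny real-coded GA minimizing RMSE against an empirical variogram. The parameter bounds, GA operators, and synthetic data below are assumptions, not the paper's configuration.

```python
import numpy as np

def spherical(h, nugget, sill, rng_):
    """Spherical variogram model gamma(h) with range rng_."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)
    return np.where(h >= rng_, sill, g)

def fit_variogram_ga(h, gamma_emp, gens=60, pop=40, seed=3):
    """Tiny real-coded GA: truncation selection, blend crossover, Gaussian
    mutation, minimizing the RMSE between model and empirical variogram."""
    rng = np.random.default_rng(seed)
    lo = np.array([0.0, 0.1, 1.0])       # nugget, sill, range lower bounds
    hi = np.array([5.0, 50.0, 60.0])     # upper bounds (problem-specific guesses)
    P = rng.uniform(lo, hi, (pop, 3))
    rmse = lambda p: float(np.sqrt(np.mean((spherical(h, *p) - gamma_emp) ** 2)))
    for _ in range(gens):
        fit = np.array([rmse(p) for p in P])
        elite = P[np.argsort(fit)[: pop // 2]]          # truncation selection
        mates = elite[rng.permutation(len(elite))]
        alpha = rng.random((len(elite), 1))
        kids = alpha * elite + (1 - alpha) * mates      # blend crossover
        kids += rng.normal(0.0, 0.3, kids.shape)        # Gaussian mutation
        P = np.clip(np.vstack([elite, kids]), lo, hi)   # elitist replacement
    fit = np.array([rmse(p) for p in P])
    return P[np.argmin(fit)], float(fit.min())

# Synthetic empirical variogram from known parameters plus light noise.
h = np.linspace(1.0, 50.0, 25)
rng = np.random.default_rng(4)
gamma_emp = spherical(h, 1.0, 20.0, 30.0) + rng.normal(0.0, 0.3, h.size)
best, err = fit_variogram_ga(h, gamma_emp)
```

Elitism guarantees monotone improvement of the best RMSE, so on this noisy synthetic target the fitted sill lands close to the true value of 20.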
An adaptive filtering algorithm with a fixed projection order is unable to adjust its performance in response to changes in the external environment of airborne radars. To overcome this limitation, a new approach is introduced: the variable projection order Ekblom-norm-promoted adaptive algorithm (VPO-EPAA). The method begins by examining the mean squared deviation (MSD) of the EPAA and deriving a formula for it. Next, it compares the MSD of the EPAA at two different projection orders and selects the one that minimizes the MSD as the parameter for the current iteration. Furthermore, the algorithm's computational complexity is analyzed theoretically. Simulation results from system identification and self-interference cancellation show that the proposed algorithm performs exceptionally well in airborne radar signal self-interference cancellation, even under various noise intensities and types of interference.
In this paper, we propose a new full-Newton step feasible interior-point algorithm for special weighted linear complementarity problems. The proposed algorithm employs the technique of algebraic equivalent transformation to derive the search direction. It is shown that the proximity measure reduces quadratically at each iteration. Moreover, the iteration bound of the algorithm matches the best-known polynomial complexity for this class of problems. Furthermore, numerical results are presented to show the efficiency of the proposed algorithm.
Wind energy has emerged as a potential replacement for fossil fuel-based energy sources. To harness maximum wind energy, a crucial decision in the development of an efficient wind farm is the optimal layout design, which defines the specific locations of the turbines within the farm. Finding the optimal turbine locations in the presence of various technical and technological constraints makes the wind farm layout design problem a complex optimization problem, which has traditionally been solved with nature-inspired algorithms with promising results. The performance and convergence of nature-inspired algorithms depend on several parameters, among which the termination criterion plays a crucial role. Timely convergence is an important aspect of efficient algorithm design, because an inefficient algorithm results in wasted computational resources, unwarranted electricity consumption, and hardware stress. This study provides an in-depth analysis of several termination criteria, using the genetic algorithm as a test bench applied to the wind farm layout design problem under various wind scenarios. The performance of six termination criteria is empirically evaluated with respect to the quality of the solutions produced and the execution time involved. Due to the conflicting nature of these two attributes, fuzzy logic-based multi-attribute decision-making is employed in the decision process. Results for the fuzzy decision approach indicate that, among the criteria tested, the Phi criterion achieves an improvement in the range of 2.44% to 32.93% for wind scenario 1. For scenario 2, the Best-worst termination criterion performed well compared with the other criteria, with an improvement in the range of 1.2% to 9.64%. For scenario 3, Hitting bound was the best performer, with an improvement of 1.16% to 20.93%.
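The abstract names criteria such as Phi, Best-worst, and Hitting bound without defining them. The sketch below shows how pluggable termination criteria attach to a GA loop, using two common textbook forms: a population-collapse (best-worst gap) test and a fixed generation budget. The toy sphere objective stands in for the wind-farm power model, and the decaying mutation schedule is an assumption.

```python
import numpy as np

def run_ga(terminate, dim=8, pop=30, max_gens=500, seed=5):
    """Elitist GA loop with a pluggable termination criterion.
    `terminate(gen, fit)` returns True to stop; fit is the current
    population fitness vector (lower is better)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5.0, 5.0, (pop, dim))
    obj = lambda x: float((x ** 2).sum())      # placeholder objective
    fit = np.array([obj(x) for x in X])
    for gen in range(max_gens):
        if terminate(gen, fit):
            return gen, float(fit.min())
        elite = X[np.argsort(fit)[: pop // 2]]                 # keep the best half
        kids = elite + rng.normal(0.0, 0.2 * 0.9 ** gen, elite.shape)  # decaying mutation
        X = np.vstack([elite, kids])
        fit = np.array([obj(x) for x in X])
    return max_gens, float(fit.min())

# Best-worst style criterion: stop once the population has nearly collapsed.
best_worst = lambda gen, fit: float(fit.max() - fit.min()) < 1e-2
# Hitting-bound style criterion: stop at a fixed generation budget.
hitting_bound = lambda gen, fit: gen >= 100

g_bw, f_bw = run_ga(best_worst)
g_hb, f_hb = run_ga(hitting_bound)
```

The study's trade-off is visible even in this toy: the budget criterion has a fixed, predictable runtime, while the collapse criterion adapts its stopping generation to the actual convergence behavior.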
For autonomous Unmanned Aerial Vehicles (UAVs) flying in real-world scenarios, time for path planning is always limited, a challenge known as the anytime problem. Anytime planners address this by finding a collision-free path quickly and then improving it until time runs out, making UAVs more adaptable to different mission scenarios. However, current anytime algorithms based on A* have insufficient control over the suboptimality bounds of paths and tend to lose their anytime properties in environments with large concave obstacles. This paper proposes a novel anytime path planning algorithm, Anytime Radiation A* (ARa A*), which can generate a series of suboptimal paths with progressively improved bounds by decreasing the search step size, and can generate the optimal path when time is sufficient. ARa A* features two main innovations: an adaptive variable-step-size mechanism and waypoint-based elliptic constraints. The former enables fast path searching in various environments; the latter allows ARa A* to control the suboptimality bounds of paths and further enhance search efficiency. Simulation experiments show that ARa A* outperforms Anytime Repairing A* (ARA*) and Anytime D* (AD*) in controlling suboptimality bounds and planning time, especially in environments with large concave obstacles. Final flight experiments demonstrate that the paths planned by ARa A* ensure the safe flight of quadrotors.
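The suboptimality bound that anytime A* variants tighten comes from heuristic inflation: with an admissible heuristic and f = g + ε·h, the first solution found costs at most ε times the optimum. Below is a minimal grid sketch of that bound with a concave (U-shaped) obstacle; this illustrates the ARA*-style baseline, not the paper's ARa A* step-size mechanism, and the grid is an invented example.

```python
import heapq

def weighted_astar(grid, start, goal, eps):
    """Weighted A*: f = g + eps * h with an admissible Manhattan heuristic.
    The returned path cost is at most eps times the optimal cost, the bound
    anytime planners tighten by re-planning with decreasing eps."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    openq = [(eps * h(start), 0, start)]
    g = {start: 0}
    while openq:
        _, gc, cur = heapq.heappop(openq)
        if cur == goal:
            return gc
        if gc > g.get(cur, float("inf")):
            continue                         # stale queue entry
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = gc + 1
                if ng < g.get((nr, nc), float("inf")):
                    g[(nr, nc)] = ng
                    heapq.heappush(openq, (ng + eps * h((nr, nc)), ng, (nr, nc)))
    return None

# A concave (U-shaped) obstacle pocket between start and goal.
grid = [
    [0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 1, 0],
    [0, 1, 0, 0, 1, 0],
    [0, 1, 0, 0, 1, 0],
    [0, 0, 0, 0, 1, 0],
]
costs = [weighted_astar(grid, (2, 2), (0, 5), e) for e in (3.0, 1.5, 1.0)]
```

Running with decreasing ε mimics one anytime cycle: each solution respects its ε-bound, and ε = 1 recovers the true optimum (cost 13 here).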
BACKGROUND Eyelid reconstruction is an intricate process, addressing both aesthetic and functional aspects after trauma or oncological surgery. Aesthetic concerns and oncological radicality guide personalized approaches, and the complex anatomy, involving anterior and posterior lamellae, requires tailored reconstruction for optimal functionality. AIM To formulate an eyelid reconstruction algorithm through an extensive literature review and to validate it by juxtaposing surgical outcomes from Cattinara Hos-...in dry eye and tears, which may lead to long-term consequences such as chronic conjunctivitis, discomfort, or photophobia. To prevent this issue, scars should be oriented vertically or perpendicular to the free eyelid margin when the size of the tumor allows. When employing a malar flap to repair a lower eyelid defect, the malar incision must ascend diagonally; this facilitates flap advancement and mitigates ectropion by restricting vertical traction. Consequently, it is imperative that the generated tension remains consistently horizontal and never vertical [9]. Lagophthalmos, a disorder characterized by the inability to completely close the eyelids, leads to corneal exposure and an increased risk of keratitis or ulceration; it may arise following upper eyelid surgery. To avert this issue, it is essential to preserve a minimum of 1 cm of skin between the superior edge of the excision and the inferior boundary of the eyebrow. Epiphora may occur in cancers involving the lacrimal puncta, requiring their removal. As previously stated, when employing a glabellar flap to correct medial canthal abnormalities, it is essential to prevent a trapdoor effect or thickening of the flap relative to the eyelid skin to which it is affixed. Constraints of our proposed algorithm encompass limited sample sizes and possible publication biases in existing studies. Subsequent investigations ought to examine long-term results to further refine the algorithm, and future research should evaluate it across varied populations and examine the impact of novel graft materials on reconstructive outcomes. CONCLUSION Eyelid reconstruction remains one of the most intriguing challenges for a plastic surgeon today. The most fascinating aspect of this discipline is the need to restore the functionality of such an essential structure while maintaining its aesthetics. In our opinion, decision-making algorithms can facilitate this goal by allowing individualization of the reconstructive path while minimizing the incidence of complications. The decrease in the incidence of severe complications indicates that the work is moving in the right direction. The absence of any need for reintervention, whether for reconstructive issues or for inadequate oncological radicality, signifies greater patient satisfaction, as patients do not have to undergo the stress of new surgeries. Even the minor complications recorded are in line with those reported in the literature and, even more importantly for patients, are of limited duration. In our experience, after a year of application, we can say that the objective has been achieved, but much more can still be done. Behind every work, a scientific basis must be continually renewed and refreshed to maintain high-quality standards. Therefore, searching for possible alternative solutions to include in one's surgical armamentarium is fundamental to providing the patient with a fully personalized option.
The distillation process is an important chemical process, and the application of data-driven modelling approaches has the potential to reduce model complexity compared to mechanistic modelling, thus improving the efficiency of process optimization or monitoring studies. However, the distillation process is highly nonlinear and has multiple uncertainty perturbation intervals, which brings challenges to accurate data-driven modelling of distillation processes. This paper proposes a systematic data-driven modelling framework to solve these problems. Firstly, data segment variance was introduced into the K-means algorithm to form K-means data interval (KMDI) clustering, in order to cluster the data into perturbed and steady-state intervals for steady-state data extraction. Secondly, the maximal information coefficient (MIC) was employed to calculate the nonlinear correlation between variables for removing redundant features. Finally, extreme gradient boosting (XGBoost) was integrated as the base learner into adaptive boosting (AdaBoost), with an error threshold (ET) set to improve the weight-update strategy, constructing a new ensemble learning algorithm, XGBoost-AdaBoost-ET. The superiority of the proposed framework is verified by applying it to a real industrial propylene distillation process.
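The KMDI step described above, clustering fixed-length data segments by statistics that include the segment variance and then keeping the low-variance cluster as steady-state data, can be sketched in plain Python. The segment length, the (mean, variance) feature pair, and k = 2 are illustrative assumptions, not the paper's exact formulation:

```python
import random
import statistics

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans2(points, iters=50, seed=0):
    # Plain 2-means over feature tuples (a stand-in for the paper's KMDI variant).
    rnd = random.Random(seed)
    centers = rnd.sample(points, 2)
    labels = [0] * len(points)
    for _ in range(iters):
        labels = [0 if dist2(p, centers[0]) <= dist2(p, centers[1]) else 1
                  for p in points]
        for k in (0, 1):
            members = [p for p, l in zip(points, labels) if l == k]
            if members:
                centers[k] = tuple(statistics.mean(c) for c in zip(*members))
    return labels, centers

def extract_steady(series, seg_len=10):
    # Describe each segment by (mean, variance); keep the low-variance cluster.
    segs = [series[i:i + seg_len]
            for i in range(0, len(series) - seg_len + 1, seg_len)]
    feats = [(statistics.mean(s), statistics.pvariance(s)) for s in segs]
    labels, centers = kmeans2(feats)
    steady_label = min((0, 1), key=lambda k: centers[k][1])
    return [s for s, l in zip(segs, labels) if l == steady_label]

rnd = random.Random(1)
steady = [50.0 + rnd.gauss(0, 0.05) for _ in range(50)]     # quiet operation
disturbed = [50.0 + rnd.gauss(0, 5.0) for _ in range(50)]   # perturbed interval
steady_segs = extract_steady(steady + disturbed)
```

The segments returned all come from the quiet half of the synthetic series, which is the behaviour the steady-state extraction step relies on.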
Funding: Supported by the Major Science and Technology Project of Jilin Province (20220301010GX) and the International Scientific and Technological Cooperation Project (20240402071GH).
Abstract: The liquid cooling system (LCS) of fuel cells is challenged by significant time delays, model uncertainties, pump and fan coupling, and frequent disturbances, leading to overshoot and control oscillations that degrade temperature regulation performance. To address these challenges, we propose a composite control scheme combining fuzzy logic and a variable-gain generalized super-twisting algorithm (VG-GSTA). Firstly, a one-dimensional (1D) fuzzy logic controller (FLC) for the pump ensures stable coolant flow, while a two-dimensional (2D) FLC for the fan regulates the stack temperature near the reference value. The VG-GSTA is then introduced to eliminate steady-state errors, offering resistance to disturbances and minimizing control oscillations. The equilibrium optimizer is used to fine-tune the VG-GSTA parameters. Co-simulation verifies the effectiveness of our method, demonstrating its advantages in terms of disturbance immunity, overshoot suppression, tracking accuracy, and response speed.
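For readers unfamiliar with the super-twisting family, a minimal discrete-time simulation of the standard super-twisting algorithm (not the paper's variable-gain generalization) shows how the continuous integral term rejects a smooth disturbance without high-frequency chattering; the gains, plant, and disturbance below are illustrative assumptions:

```python
import math

def sign(x):
    return (x > 0) - (x < 0)

def simulate_sta(k1=2.0, k2=1.5, dt=0.001, steps=20000):
    # Toy plant: de/dt = u + d(t), with a bounded, smooth disturbance d.
    # Standard super-twisting control: u = -k1*sqrt(|e|)*sign(e) + v,
    # dv/dt = -k2*sign(e); the discontinuity sits inside an integrator,
    # which is what suppresses chattering in the applied control.
    e, v, t = 1.0, 0.0, 0.0
    for _ in range(steps):
        d = 0.3 * math.sin(t)
        u = -k1 * math.sqrt(abs(e)) * sign(e) + v
        v += -k2 * sign(e) * dt          # Euler step of the integral term
        e += (u + d) * dt                # Euler step of the plant
        t += dt
    return e

final_e = simulate_sta()
```

Despite the persistent sinusoidal disturbance, the tracking error converges to a neighbourhood of zero, which is the steady-state-error elimination the abstract attributes to the (variable-gain) super-twisting layer.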
Funding: Funded by the General Program of the National Natural Science Foundation of China, grant number 62277022.
Abstract: Algorithms are the primary component of Artificial Intelligence (AI). An algorithm is the process in AI that imitates the human mind to solve problems. Currently, the performance of AI is evaluated by scoring AI algorithms with metrics on data sets. However, the evaluation of algorithms in AI is challenging because the same type of algorithm can be evaluated on many data sets with many evaluation metrics. Different algorithms may have individual strengths and weaknesses in metric scores on separate data sets, so such evaluations lack credibility and validity. Moreover, evaluating algorithms requires repeated experiments on different data sets, which diverts researchers' attention from the algorithms themselves. Crucially, this approach of comparing metric scores does not take into account an algorithm's ability to solve problems. Classical evaluation based on time and space complexity is also unsuitable for AI algorithms, because the input to a classical algorithm is unbounded numeric data, whereas the input to an AI algorithm is a data set, which is limited and varied. Given that current AI algorithm evaluation does not reflect problem-solving capability, this paper summarizes the features of AI algorithm evaluation and proposes an AI evaluation method that incorporates the problem-solving capabilities of algorithms.
Funding: Financed by the Ministry of Science and Technology (MOST), Bangladesh, under a Special Research Grant for FY 2023-24 (SRG 232410). The authors also extend their appreciation to the Deanship of Scientific Research at Northern Border University, Arar, Saudi Arabia, for funding this research work through project number "NBU-FFR-2025-3623-05".
Abstract: In this research work, localized generation from renewable resources and the distribution of energy to agricultural loads (a local microgrid concept) have been considered, and their feasibility has been assessed. Two dispatch algorithms, named Cycle Charging and Load Following, are implemented to find the optimal solution (i.e., net cost, operation cost, carbon emission, energy cost, component sizing, etc.) of the hybrid system. The microgrid is also modeled in the DIgSILENT PowerFactory platform, and the respective power system responses are then evaluated. The development of dispatch algorithms specifically tailored for agricultural applications has enabled dynamic management of energy flows, responding to fluctuating demands and resource availability in real time. Through careful consideration of factors such as seasonal variations and irrigation requirements, these algorithms have enhanced the resilience and adaptability of the microgrid to dynamic operational conditions. However, it is revealed that both approaches produce the same techno-economic results, showing no significant difference. This illustrates that the considered microgrid can be implemented with either strategy without significant variation in performance. The study has shown that harmful gas emissions are limited to only 17,928 kg/year of CO2 and 77.7 kg/year of sulfur dioxide. For the proposed microgrid and a load profile of 165.29 kWh/day, the net present cost is USD 718,279 and the cost of energy is USD 0.0463 with a renewable fraction of 97.6%. The optimal sizes for the PV, biomass, grid, electrolyzer, and converter components are 1494, 500, 999,999, 500, and 495 kW, respectively. For the hydrogen tank (HTank), the optimal size is found to be 350 kg. This research work provides critical insights into the techno-economic feasibility and environmental impact of integrating biomass-PV-hydrogen storage-grid hybrid renewable microgrids into agricultural settings.
Abstract: Cluster-based models have numerous application scenarios in vehicular ad-hoc networks (VANETs) and can greatly help improve the communication performance of VANETs. However, the frequent movement of vehicles can often lead to changes in the network topology, thereby reducing cluster stability in urban scenarios. To address this issue, we propose a clustering model based on the density peak clustering (DPC) method and the sparrow search algorithm (SSA), named SDPC. First, the model constructs a fitness function based on the parameters obtained from the DPC method and deploys the SSA for iterative optimization to select cluster heads (CHs). Then, the vehicles that have not been selected as CHs are assigned to appropriate clusters by comprehensively considering the distance parameter and the link-reliability parameter. Finally, cluster maintenance strategies are considered to tackle changes in the clusters' organizational structure. To verify the performance of the model, we conducted a simulation on a real-world scenario for multiple metrics related to cluster stability. The results show that, compared with APROVE and GAPC, SDPC showed clear performance advantages, indicating that SDPC can effectively ensure VANET cluster stability in urban scenarios.
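The DPC quantities that SDPC's fitness function builds on, a local density ρ and a separation δ (distance to the nearest denser point), can be sketched as follows; the Gaussian kernel, the cutoff distance, and the toy vehicle coordinates are assumptions for illustration:

```python
import math

def density_peaks(points, dc):
    # For each point: local density rho (Gaussian kernel with cutoff dc) and
    # delta, the distance to the nearest point of strictly higher density.
    n = len(points)
    d = [[math.dist(points[i], points[j]) for j in range(n)] for i in range(n)]
    rho = [sum(math.exp(-(d[i][j] / dc) ** 2) for j in range(n) if j != i)
           for i in range(n)]
    delta = []
    for i in range(n):
        denser = [d[i][j] for j in range(n) if rho[j] > rho[i]]
        delta.append(min(denser) if denser else max(d[i]))
    # Cluster-head candidates are points where both rho and delta are large.
    gamma = [r * dl for r, dl in zip(rho, delta)]
    return rho, delta, gamma

# Two tight groups of "vehicles" on a 2-D plane; indices 4 and 9 are the
# geometric centres of their groups.
pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1), (0.05, 0.05),
       (5.0, 5.0), (5.1, 5.0), (5.0, 5.1), (5.1, 5.1), (5.05, 5.05)]
rho, delta, gamma = density_peaks(pts, dc=0.3)
heads = sorted(range(len(pts)), key=lambda i: gamma[i], reverse=True)[:2]
```

The two points with the largest ρ·δ product are the group centres, which is exactly the property SDPC exploits when the SSA searches for cluster heads.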
Funding: Supported by the Ministry of Science and Technology SKA Special Project (2020SKA0110202); the Special Project on Building a Science and Technology Innovation Center for South and Southeast Asia - International Joint Innovation Platform in Yunnan Province: "Yunnan Sino-Malaysian International Joint Laboratory of HF-VHF Advanced Radio Astronomy Technology" (202303AP140003); the National Natural Science Foundation of China (NSFC) Joint Fund for Astronomy (JFA) incubator program (U2031133); the International Partnership Program of the International Cooperation Bureau of the Chinese Academy of Sciences: "Belt and Road" Cooperation (114A11KYSB20200001); the Kunming Foreign (International) Cooperation Base Program: "Yunnan Observatory of the Chinese Academy of Sciences - University of Malaya Joint R&D Cooperation Base for Advanced Radio Astronomy Technology" (GHJD-2021022); the China-Malaysia Collaborative Research on Space Remote Sensing and Radio Astronomy Observation of Space Weather at Low and Middle Latitudes under the Key Special Project of the State Key R&D Program of the Ministry of Science and Technology for International Cooperation in Science, Technology and Innovation among Governments (2022YFE0140000); and the High-precision Calibration Method for Low-frequency Radio Interferometric Arrays for the SKA Project of the Ministry of Science and Technology (2020SKA0110300).
Abstract: The radio environment plays an important role in radio astronomy observations. Further analysis is needed on the time and intensity distributions of interference signals for long-term radio environment monitoring. Sample variance is an important estimate for the interference-signal decision threshold. Here, we propose an improved algorithm for calculating the data sample variance, relying on four established statistical methods: the variance of the trimmed data, the winsorized sample variance, the median absolute deviation, and the median of the trimmed-data pairwise averaged squares. The variance and decision threshold in the protected section of the radio astronomy L-band are calculated. Among the four methods, the improved median of the trimmed-data pairwise averaged squares algorithm has higher accuracy, but in a comparison of overall experimental results, the cleanliness rate of all algorithms is above 96%. In a comparison between the improved algorithm and the four methods, the cleanliness rate of the improved algorithm is above 98%, verifying its feasibility. The time-intensity interference distribution in the radio protection band is also obtained. Finally, we use comprehensive monitoring data of radio astronomy protection bands, radio interference bands, and interfered frequency bands to establish a comprehensive evaluation system for radio observatory sites, including the observable time proportion in the radio astronomy protection band, the occasional time-intensity distribution in the radio interference frequency band, and the intensity distribution of the interfered frequency band.
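Three of the four estimators named above (trimmed variance, winsorized variance, and the median absolute deviation) are standard robust statistics and can be sketched directly. The synthetic spike data and the 5-sigma threshold rule below are illustrative assumptions, not the paper's exact procedure:

```python
import statistics

def trimmed_variance(x, frac=0.1):
    # Variance after discarding the lowest and highest frac of samples.
    xs = sorted(x)
    k = int(len(xs) * frac)
    return statistics.pvariance(xs[k:len(xs) - k])

def winsorized_variance(x, frac=0.1):
    # Variance after clamping extremes to the trim boundaries instead
    # of discarding them.
    xs = sorted(x)
    k = int(len(xs) * frac)
    w = [xs[k]] * k + xs[k:len(xs) - k] + [xs[-k - 1]] * k
    return statistics.pvariance(w)

def mad_sigma(x):
    # Scale estimate from the median absolute deviation; the 1.4826 factor
    # makes it consistent with the standard deviation for Gaussian data.
    med = statistics.median(x)
    return 1.4826 * statistics.median(abs(v - med) for v in x)

clean = [float(i % 7) for i in range(100)]   # quiet-band samples
data = clean + [1000.0] * 5                  # plus strong RFI spikes
threshold = statistics.median(data) + 5 * mad_sigma(data)
```

Because the spikes barely move the robust estimates, the resulting decision threshold sits above the quiet-band signal but far below the interference, so the spikes are flagged; the classical sample variance, by contrast, is inflated by orders of magnitude.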
Funding: Supported by the Shandong Provincial Natural Science Foundation for Quantum Science under Grant No. ZR2021LLZ002 and the Fundamental Research Funds for the Central Universities under Grant No. 22CX03005A.
Abstract: To solve the Poisson equation, it is usually possible to discretize it into solving the corresponding linear system Ax = b. Variational quantum algorithms (VQAs) for the discretized Poisson equation have been studied before. We present a VQA based on banded Toeplitz systems for solving the Poisson equation with respect to the structural features of the matrix A. In detail, we decompose the matrices A and A² into a linear combination of the corresponding banded Toeplitz matrix and sparse matrices with only a few non-zero elements. For the one-dimensional Poisson equation with different boundary conditions and the d-dimensional Poisson equation with Dirichlet boundary conditions, the number of decomposition terms is less than that reported in [Phys. Rev. A 108, 032418 (2023)]. Based on the decomposition of the matrix, we design quantum circuits that efficiently evaluate the cost function. Additionally, numerical simulation verifies the feasibility of the proposed algorithm. Finally, VQAs for linear systems of equations and matrix-vector multiplications with a K-banded Toeplitz matrix T_n^K are given, where T_n^K ∈ R^(n×n) and K ∈ O(polylog n).
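The decomposition claim can be checked classically for the 1-D Dirichlet case: A = tridiag(-1, 2, -1) is itself a banded Toeplitz matrix, and A² equals a banded Toeplitz matrix (interior stencil [1, -4, 6, -4, 1]) plus a sparse correction with only two non-zero entries. The sketch below verifies this for n = 8; the helper names are ours:

```python
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def toeplitz(first_col, first_row, n):
    # Banded Toeplitz matrix: the entry depends only on the offset j - i.
    band = {-k: v for k, v in enumerate(first_col)}
    band.update({k: v for k, v in enumerate(first_row)})
    return [[band.get(j - i, 0) for j in range(n)] for i in range(n)]

n = 8
A = toeplitz([2, -1], [2, -1], n)          # 1-D Poisson matrix, Dirichlet BCs
A2 = matmul(A, A)
T = toeplitz([6, -4, 1], [6, -4, 1], n)    # interior stencil of A^2
S = [[A2[i][j] - T[i][j] for j in range(n)] for i in range(n)]
nonzeros = [(i, j, S[i][j]) for i in range(n) for j in range(n) if S[i][j] != 0]
```

The sparse part S carries only the two corner corrections (the diagonal entries of A² are 5 at the boundaries instead of 6), matching the abstract's point that the correction matrices have very few non-zero elements.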
Funding: Supported by the Research on the Key Technology of Damage Identification Method of Dam Concrete Structure Based on Transformer Image Processing (242102521031); the project Research on Situational Awareness and Behavior Anomaly Prediction of Social Media Based on Multimodal Time Series Graph (232102520004); and the Key Scientific Research Project of Higher Education Institutions in Henan Province (25B520019).
Abstract: Image enhancement utilizes intensity transformation functions to maximize the information content of enhanced images. This paper approaches the topic as an optimization problem and uses the bald eagle search (BES) algorithm to achieve optimal results. In our proposed model, gamma correction and Retinex address color cast issues and enhance image edges and details. The final enhanced image is obtained through color balancing. The BES algorithm seeks the optimal solution through the selection, search, and swooping stages. However, it is prone to getting stuck in local optima and converges slowly. To overcome these limitations, we propose an improved BES algorithm (ABES) with enhanced population learning, position updates, and control parameters. ABES is employed to optimize the core parameters of gamma correction and Retinex to improve image quality, and the maximization of information entropy is utilized as the objective function. Real benchmark images are collected to validate its performance. Experimental results demonstrate that ABES outperforms existing image enhancement methods, including the flower pollination algorithm, the chimp optimization algorithm, particle swarm optimization, and BES, in terms of information entropy, peak signal-to-noise ratio (PSNR), structural similarity index (SSIM), and patch-based contrast quality index (PCQI). ABES demonstrates superior performance both qualitatively and quantitatively, and it helps enhance prominent features and contrast in the images while maintaining the natural appearance of the original images.
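The core objective, choosing a gamma that maximizes the entropy of the corrected image, can be sketched with a simple grid search standing in for BES/ABES; the toy image, the bin count, and the candidate range are assumptions, and the Retinex and color-balancing stages are omitted:

```python
import math

def gamma_correct(img, g):
    # Per-pixel power-law intensity transform on [0, 255].
    return [((p / 255.0) ** g) * 255.0 for p in img]

def entropy(img, bins=32):
    # Shannon entropy of the intensity histogram (the objective function).
    hist = [0] * bins
    for p in img:
        hist[min(bins - 1, int(p / 256.0 * bins))] += 1
    n = len(img)
    return -sum((h / n) * math.log2(h / n) for h in hist if h)

# A dark, low-contrast "image": all pixels crowded into the low intensities.
img = [10 + (i % 40) for i in range(1000)]

# Grid search stands in for the metaheuristic optimizer here.
best_g = max((g / 10.0 for g in range(2, 21)),
             key=lambda g: entropy(gamma_correct(img, g)))
```

For this dark input the search settles on a gamma below 1, which brightens and spreads the low intensities and raises the histogram entropy, the same behaviour the paper's optimizer targets with a richer parameter set.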
Funding: Supported by the National Natural Science Foundation of China (under Grant Nos. 12105090 and 12074107); the Program of Outstanding Young and Middle-aged Scientific and Technological Innovation Team of Colleges and Universities in Hubei Province of China (under Grant No. T2020001); and the Innovation Group Project of the Natural Science Foundation of Hubei Province of China (under Grant No. 2022CFA012).
Abstract: Since the concept of quantum information masking was proposed by Modi et al (2018 Phys. Rev. Lett. 120, 230501), many interesting and significant results have been reported, both theoretically and experimentally. However, designing a quantum information masker is not an easy task, especially for larger systems. In this paper, we propose a variational quantum algorithm to resolve this problem. Specifically, our algorithm is a hybrid quantum-classical model, where a quantum device with adjustable parameters tries to mask quantum information and a classical device evaluates the performance of the quantum device and optimizes its parameters. After optimization, the quantum device behaves as an optimal masker. The loss value during optimization can be used to characterize the performance of the masker. In particular, if the loss value converges to zero, we obtain a perfect masker that completely masks the quantum information generated by the quantum information source; otherwise, a perfect masker does not exist and the subsystems always contain the original information. Nevertheless, the resulting maskers are still optimal. Quantum parallelism is utilized to reduce quantum state preparations and measurements. Our study paves the way for wide application of quantum information masking, and some of the techniques used in this study may have potential applications in quantum information processing.
Funding: Supported by Research Incentive Grant 23200 of Zayed University, United Arab Emirates.
Abstract: Cardiovascular disease prediction is a significant area of research in healthcare management systems (HMS). We will only be able to reduce the number of deaths if we anticipate cardiac problems in advance. Existing heart disease detection systems using machine learning have not yet produced sufficient results due to their reliance on the available data. We present Clustered Butterfly Optimization Techniques (Rough K-means + BOA) as a new hybrid method for predicting heart disease. This method comprises two phases: clustering data using Rough K-means (RKM) and data analysis using the butterfly optimization algorithm (BOA). The benchmark dataset from the UCI repository is used for our experiments. The experiments are divided into three sets: the first set involves the RKM clustering technique, the next set evaluates the classification outcomes, and the last set validates the performance of the proposed hybrid model. The proposed Rough K-means + BOA has achieved a reasonable accuracy of 97.03% and a minimal error rate of 2.97%. This result is comparatively better than other combinations of optimization techniques. In addition, this approach effectively enhances data segmentation, optimization, and classification performance.
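The RKM phase can be illustrated with a minimal Lingras-style rough k-means in one dimension, where ambiguous points fall into a boundary region shared by both clusters; the threshold ζ and the lower/boundary weights are common defaults assumed here, and the BOA phase is omitted:

```python
def rough_kmeans_1d(xs, c0, c1, zeta=1.5, wl=0.7, wu=0.3, iters=20):
    # Rough k-means, two clusters: a point goes to a cluster's lower
    # approximation only if it is clearly closer to that centre; otherwise
    # it lands in the shared boundary (both upper approximations).
    lower, boundary = {0: [], 1: []}, []
    for _ in range(iters):
        lower, boundary = {0: [], 1: []}, []
        for x in xs:
            d = [abs(x - c0), abs(x - c1)]
            near = 0 if d[0] <= d[1] else 1
            if d[near] == 0 or d[1 - near] / d[near] > zeta:
                lower[near].append(x)      # unambiguous membership
            else:
                boundary.append(x)         # ambiguous: boundary region
        def centre(lo):
            if lo and boundary:
                return wl * sum(lo) / len(lo) + wu * sum(boundary) / len(boundary)
            pool = lo or boundary
            return sum(pool) / len(pool)
        c0, c1 = centre(lower[0]), centre(lower[1])
    return c0, c1, lower, boundary

xs = [1.0, 1.2, 0.8, 5.0, 5.3, 4.7, 3.0]   # 3.0 sits between the two groups
c0, c1, lower, boundary = rough_kmeans_1d(xs, 0.0, 6.0)
```

The in-between point ends up in the boundary rather than being forced into either cluster, and each centre is pulled partly toward that uncertain region, which is the behaviour that makes RKM attractive for noisy clinical records.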
Funding: Supported by the National Natural Science Foundation of China (82225049, 72104155); the Sichuan Provincial Central Government Guides Local Science and Technology Development Special Project (2022ZYD0127); and the 1·3·5 Project for Disciplines of Excellence, West China Hospital, Sichuan University (ZYGD23004).
Abstract: Background: In recent years, there has been a growing trend in the utilization of observational studies that make use of routinely collected healthcare data (RCD). These studies rely on algorithms to identify specific health conditions (e.g., diabetes or sepsis) for statistical analyses. However, there has been substantial variation in algorithm development and validation, leading to frequently suboptimal performance and posing a significant threat to the validity of study findings. Unfortunately, these issues are often overlooked. Methods: We systematically developed guidance for the development, validation, and evaluation of algorithms designed to identify health status (DEVELOP-RCD). Our initial efforts involved conducting both a narrative review and a systematic review of published studies on the concepts and methodological issues related to algorithm development, validation, and evaluation. Subsequently, we conducted an empirical study of an algorithm for identifying sepsis. Based on these findings, we formulated a specific workflow and recommendations for algorithm development, validation, and evaluation within the guidance. Finally, the guidance underwent independent review by a panel of 20 external experts, who then convened a consensus meeting to finalize it. Results: A standardized workflow for algorithm development, validation, and evaluation was established. Guided by specific health status considerations, the workflow comprises four integrated steps: assessing an existing algorithm's suitability for the target health status; developing a new algorithm using recommended methods; validating the algorithm using prescribed performance measures; and evaluating the impact of the algorithm on study results. Additionally, 13 good practice recommendations were formulated with detailed explanations. Furthermore, a practical study on sepsis identification was included to demonstrate the application of this guidance. Conclusions: The establishment of this guidance is intended to aid researchers and clinicians in the appropriate and accurate development and application of algorithms for identifying health status from RCD. This guidance has the potential to enhance the credibility of findings from observational studies involving RCD.
Abstract: Autism Spectrum Disorder (ASD) is a complex neurodevelopmental condition that causes multiple challenges in behavioral and communication activities. In this research, security measures for ASD-related medical data are integrated responsibly and effectively to develop the Mobile Neuron Attention Stage-by-Stage Network (MNASNet) model, which integrates the Mobile Network (MobileNet) with Neuron Attention Stage-by-Stage components. The steps followed to detect ASD with privacy-preserved data are data normalization, data augmentation, and K-anonymization. The clinical data of individuals are taken initially and preprocessed using Z-score normalization. Then, data augmentation is performed using an oversampling technique. Subsequently, K-anonymization is effectuated by utilizing the Black-winged Kite Algorithm to ensure the privacy of the medical data, where the best fitness solution is based on data utility and privacy. Finally, after improving data privacy, the developed MNASNet approach is implemented for ASD detection, achieving highly accurate results compared with traditional methods for detecting autism behavior. The final results illustrate that the proposed MNASNet achieves an accuracy of 92.9%, a TPR of 95.9%, and a TNR of 90.9% at the k-sample value of 8.
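The k-anonymity property being enforced (every combination of quasi-identifier values must occur at least k times) can be sketched independently of the Black-winged Kite optimizer, which the paper uses to choose the generalization; the toy records and the age-banding step are assumptions:

```python
from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    # Every quasi-identifier combination must appear at least k times.
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(count >= k for count in groups.values())

def generalize_age(records, width=10):
    # Coarsen exact ages into bands, a typical generalization step.
    return [{**r, "age": (r["age"] // width) * width} for r in records]

records = [{"age": 23, "zip": "751", "score": 0.9},
           {"age": 27, "zip": "751", "score": 0.4},
           {"age": 41, "zip": "752", "score": 0.7},
           {"age": 44, "zip": "752", "score": 0.2}]
before = is_k_anonymous(records, ("age", "zip"), 2)
after = is_k_anonymous(generalize_age(records), ("age", "zip"), 2)
```

Exact ages make every record unique (not 2-anonymous); after banding, each (age band, zip) group contains two records, so 2-anonymity holds. An optimizer's job, as in the paper, is to pick the cheapest generalization that achieves this while preserving data utility.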
Funding: Supported by an Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. RS-2024-00337489, Development of Data Drift Management Technology to Overcome Performance Degradation of AI Analysis Models).
Abstract: As vehicular networks grow increasingly complex due to high node mobility and dynamic traffic conditions, efficient clustering mechanisms are vital to ensure stable and scalable communication. Recent studies have emphasized the need for adaptive clustering strategies to improve performance in Intelligent Transportation Systems (ITS). This paper presents the Grasshopper Optimization Algorithm for Vehicular Network Clustering (GOA-VNET), an innovative approach to optimal vehicular clustering in Vehicular Ad-Hoc Networks (VANETs) that leverages the Grasshopper Optimization Algorithm (GOA) to address the critical challenges of traffic congestion and communication inefficiencies in ITS. The proposed GOA-VNET employs an iterative and interactive optimization mechanism to dynamically adjust node positions and cluster configurations, ensuring robust adaptability to varying vehicular densities and transmission ranges. Key features of GOA-VNET include the utilization of attraction-zone, repulsion-zone, and comfort-zone parameters, which collectively enhance clustering efficiency and minimize congestion within Regions of Interest (ROI). By managing cluster configurations and node densities effectively, GOA-VNET ensures balanced load distribution and seamless data transmission, even in scenarios with high vehicular densities and varying transmission ranges. Comparative evaluations against the Whale Optimization Algorithm (WOA) and Grey Wolf Optimization (GWO) demonstrate that GOA-VNET consistently outperforms these methods by achieving superior clustering efficiency, reducing the number of clusters by up to 10% in high-density scenarios, and improving data transmission reliability. Simulation results reveal that under a 100-600 m transmission range, GOA-VNET achieves an average reduction of 8%-15% in the number of clusters and maintains a 5%-10% improvement in packet delivery ratio (PDR) compared with baseline algorithms. Additionally, the algorithm incorporates a heat-transfer-inspired load-balancing mechanism, ensuring equitable distribution of nodes among cluster leaders (CLs) and maintaining a stable network environment. These results validate GOA-VNET as a reliable and scalable solution for VANETs, with significant potential to support next-generation ITS. Future research could further enhance the algorithm by integrating multi-objective optimization techniques and exploring broader applications in complex traffic scenarios.
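The attraction/repulsion/comfort-zone behaviour mentioned above comes from the standard GOA social-interaction function s(r) = f*exp(-r/l) - exp(-r): repulsive at short range, attractive at long range, and zero at the comfort distance. With the commonly used defaults f = 0.5 and l = 1.5 (an assumption here, not necessarily the paper's values), that comfort distance is 3*ln(2) ≈ 2.079:

```python
import math

def social_force(r, f=0.5, l=1.5):
    # GOA interaction between two grasshoppers at distance r:
    # attraction term (slow decay) minus repulsion term (fast decay).
    return f * math.exp(-r / l) - math.exp(-r)

# Locate the comfort distance (sign change of the force) by bisection.
lo, hi = 0.1, 4.0
for _ in range(60):
    mid = (lo + hi) / 2.0
    if social_force(mid) < 0.0:   # still in the repulsion zone
        lo = mid
    else:
        hi = mid
comfort = (lo + hi) / 2.0
```

Nodes inside the comfort distance are pushed apart and distant nodes are drawn together, which is the mechanism a GOA-based clusterer uses to settle vehicles into well-spaced clusters.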
Abstract: The widespread adoption of cloud computing has underscored the critical importance of efficient resource allocation and management, particularly in task scheduling, which involves assigning tasks to computing resources for optimized resource utilization. Several meta-heuristic algorithms have shown effectiveness in task scheduling, among which the relatively recent Willow Catkin Optimization (WCO) algorithm has demonstrated potential, albeit with apparent needs for enhanced global search capability and convergence speed. To address these limitations of WCO in cloud computing task scheduling, this paper introduces an improved version termed the Advanced Willow Catkin Optimization (AWCO) algorithm. AWCO enhances the algorithm's performance by augmenting its global search capability through a quasi-opposition-based learning strategy and accelerating its convergence speed via sinusoidal mapping. A comprehensive evaluation utilizing the CEC2014 benchmark suite, comprising 30 test functions, demonstrates that AWCO achieves superior optimization outcomes, surpassing conventional WCO and a range of established meta-heuristics. The proposed algorithm also considers trade-offs among the cost, makespan, and load balancing objectives. Experimental results of AWCO are compared with those obtained using the other meta-heuristics, illustrating that the proposed algorithm provides superior performance in task scheduling. The method offers a robust foundation for enhancing the utilization of resources in the domain of task scheduling within a cloud computing environment.
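The two ingredients AWCO adds to WCO can be sketched in isolation: quasi-opposition-based learning samples a candidate between the interval centre and the opposite point, and a sinusoidal chaotic map varies a control parameter inside (0, 1). The coefficient s = 2.3 and the ranges below are common choices, assumed here rather than taken from the paper:

```python
import math
import random

def quasi_opposite(x, a, b, rnd):
    # Quasi-opposition-based learning: sample uniformly between the
    # interval centre (a + b) / 2 and the opposite point a + b - x.
    centre, opposite = (a + b) / 2.0, a + b - x
    lo, hi = min(centre, opposite), max(centre, opposite)
    return rnd.uniform(lo, hi)

def sinusoidal_map(x, s=2.3):
    # Chaotic sinusoidal map; for s = 2.3 the orbit stays inside (0, 1).
    return s * x * x * math.sin(math.pi * x)

rnd = random.Random(0)
qo = [quasi_opposite(0.9, 0.0, 1.0, rnd) for _ in range(100)]

seq = [0.7]
for _ in range(50):
    seq.append(sinusoidal_map(seq[-1]))
```

Quasi-opposite candidates of x = 0.9 on [0, 1] all land in [0.1, 0.5], diversifying the population toward the unexplored side of the interval, while the chaotic sequence supplies a non-repeating parameter schedule in place of a fixed constant.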
Abstract: The objective of this study is to develop an advanced approach to variogram modelling by integrating genetic algorithms (GA) with machine-learning-based linear regression, aiming to improve the accuracy and efficiency of geostatistical analysis, particularly in mineral exploration. The study combines GA and machine learning to optimise variogram parameters, including range, sill, and nugget, by minimising the root mean square error (RMSE) and maximising the coefficient of determination (R²). The experimental variograms were computed and modelled using theoretical models, followed by optimisation via evolutionary algorithms. The method was applied to gravity data from the Ngoura-Batouri-Kette mining district in Eastern Cameroon, covering 141 data points. Sequential Gaussian Simulations (SGS) were employed for predictive mapping to validate simulated results against true values. Key findings show variograms with ranges between 24.71 km and 49.77 km, with optimised RMSE and R² values of 11.21 mGal² and 0.969, respectively, after 42 generations of GA optimisation. Predictive mapping using SGS demonstrated that simulated values closely matched true values, with the simulated mean at 21.75 mGal compared to the true mean of 25.16 mGal, and variances of 465.70 mGal² and 555.28 mGal², respectively. The results confirmed spatial variability and anisotropies in the N170-N210 directions, consistent with prior studies. This work presents a novel integration of GA and machine learning for variogram modelling, offering an automated, efficient approach to parameter estimation. The methodology significantly enhances predictive geostatistical models, contributing to the advancement of mineral exploration and improving the precision and speed of decision-making in the petroleum and mining industries.
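The fitting loop described above, choosing nugget, sill, and range of a theoretical variogram model to minimize RMSE against the experimental variogram, can be sketched with a spherical model and a random search standing in for the GA; the synthetic data and parameter bounds are assumptions:

```python
import math
import random

def spherical(h, nugget, sill, rng):
    # Spherical variogram model: rises to nugget + sill at the range rng.
    if h >= rng:
        return nugget + sill
    return nugget + sill * (1.5 * h / rng - 0.5 * (h / rng) ** 3)

def rmse(params, lags, gammas):
    nugget, sill, rng = params
    return math.sqrt(sum((spherical(h, nugget, sill, rng) - g) ** 2
                         for h, g in zip(lags, gammas)) / len(lags))

# Synthetic "experimental variogram" generated from known parameters.
true_params = (2.0, 10.0, 30.0)
lags = [2.0 * i for i in range(1, 20)]
gammas = [spherical(h, *true_params) for h in lags]

# Random search stands in for the paper's genetic algorithm.
rnd = random.Random(42)
best, best_err = None, float("inf")
for _ in range(20000):
    cand = (rnd.uniform(0, 5), rnd.uniform(5, 15), rnd.uniform(10, 50))
    err = rmse(cand, lags, gammas)
    if err < best_err:
        best, best_err = cand, err
```

The search recovers parameters close to the generating ones; a GA replaces the blind sampling with selection, crossover, and mutation over the same fitness function.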
Funding: Supported by the Shandong Provincial Natural Science Foundation (No. ZR2022MF314).
Abstract: An adaptive filtering algorithm with a fixed projection order is unable to adjust its performance in response to changes in the external environment of airborne radars. To overcome this limitation, a new approach is introduced: the variable projection order Ekblom-norm-promoted adaptive algorithm (VPO-EPAA). The method begins by examining the mean squared deviation (MSD) of the EPAA, deriving a formula for its MSD. Next, it compares the MSD of the EPAA at two different projection orders and selects the one that minimizes the MSD as the parameter for the current iteration. Furthermore, the algorithm's computational complexity is analyzed theoretically. Simulation results from system identification and self-interference cancellation show that the proposed algorithm performs exceptionally well in airborne radar signal self-interference cancellation, even under various noise intensities and types of interference.
Funding: Supported by the Optimisation Theory and Algorithm Research Team (Grant No. 23kytdzd004); the University Science Research Project of Anhui Province (Grant No. 2024AH050631); and the General Programs for Young Teacher Cultivation of the Educational Commission of Anhui Province (Grant No. YQYB2023090).
Abstract: In this paper, we propose a new full-Newton step feasible interior-point algorithm for special weighted linear complementarity problems. The proposed algorithm employs the technique of algebraic equivalent transformation to derive the search direction. It is shown that the proximity measure reduces quadratically at each iteration. Moreover, the iteration bound of the algorithm is as good as the best-known polynomial complexity for these types of problems. Furthermore, numerical results are presented to show the efficiency of the proposed algorithm.
Funding: Funded by King Fahd University of Petroleum & Minerals, Saudi Arabia, under IRC-SES grant #INRE 2217.
Abstract: Wind energy has emerged as a potential replacement for fossil-fuel-based energy sources. To harness maximum wind energy, a crucial decision in the development of an efficient wind farm is the optimal layout design, which defines the specific locations of the turbines within the wind farm. The process of finding the optimal locations of turbines, in the presence of various technical and technological constraints, makes the wind farm layout design problem a complex optimization problem. This problem has traditionally been solved with nature-inspired algorithms, with promising results. The performance and convergence of nature-inspired algorithms depend on several parameters, among which the termination criterion plays a crucial role. Timely convergence is an important aspect of efficient algorithm design, because an inefficient algorithm results in wasted computational resources, unwarranted electricity consumption, and hardware stress. This study provides an in-depth analysis of several termination criteria, using the genetic algorithm as a test bench, with application to the wind farm layout design problem under various wind scenarios. The performance of six termination criteria is empirically evaluated with respect to the quality of the solutions produced and the execution time involved. Due to the conflicting nature of these two attributes, fuzzy logic-based multi-attribute decision-making is employed in the decision process. Results for the fuzzy decision approach indicate that, among the various criteria tested, the criterion Phi achieves an improvement in the range of 2.44% to 32.93% for wind scenario 1. For scenario 2, the Best-worst termination criterion performed well compared with the other criteria evaluated, with an improvement in the range of 1.2% to 9.64%. For scenario 3, Hitting bound was the best performer, with an improvement of 1.16% to 20.93%.
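Two common termination-criterion families for evolutionary searches can be sketched generically: a best-worst population-spread test and a windowed no-improvement test, here driving a toy hill climber. These are generic interpretations for illustration, not the paper's exact definitions of Phi, Best-worst, or Hitting bound:

```python
import random

def best_worst_met(pop_fitness, eps=1e-3):
    # Best-worst style: stop once the population has effectively collapsed.
    return max(pop_fitness) - min(pop_fitness) < eps

def improvement_met(history, window=20, eps=1e-6):
    # No-improvement style: stop once the best fitness has improved by less
    # than eps over the last `window` generations.
    return len(history) > window and history[-1 - window] - history[-1] < eps

# Toy search: minimize f(x) = x^2 with random perturbations, terminating
# via the no-improvement criterion instead of a fixed iteration budget.
rnd = random.Random(7)
x, best = 5.0, 25.0
history = [best]
iters = 0
while iters < 100000 and not improvement_met(history):
    cand = x + rnd.gauss(0.0, 0.5)
    if cand * cand < best:
        x, best = cand, cand * cand
    history.append(best)
    iters += 1
```

The search stops well before the iteration guard fires, after progress has stalled near the optimum, which is the compute-saving behaviour whose solution-quality cost the study quantifies across criteria.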
Funding: Supported by the National Natural Science Foundation of China (No. 52272382), the Aeronautical Science Foundation of China (No. 20200017051001), and the Fundamental Research Funds for the Central Universities, China.
Abstract: For autonomous Unmanned Aerial Vehicles (UAVs) flying in real-world scenarios, time for path planning is always limited, a challenge known as the anytime problem. Anytime planners address this by quickly finding a collision-free path and then improving it until time runs out, making UAVs more adaptable to different mission scenarios. However, current anytime algorithms based on A* offer insufficient control over the suboptimality bounds of paths and tend to lose their anytime properties in environments with large concave obstacles. This paper proposes a novel anytime path planning algorithm, Anytime Radiation A* (ARa A*), which generates a series of suboptimal paths with progressively improved bounds by decreasing the search step size, and produces the optimal path when time is sufficient. ARa A* features two main innovations: an adaptive variable-step-size mechanism and waypoint-based elliptic constraints. The former enables fast path searching in various environments; the latter allows ARa A* to control the suboptimality bounds of paths and further enhance search efficiency. Simulation experiments show that ARa A* outperforms Anytime Repairing A* (ARA*) and Anytime D* (AD*) in controlling suboptimality bounds and planning time, especially in environments with large concave obstacles. Flight experiments demonstrate that the paths planned by ARa A* ensure the safe flight of quadrotors.
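The general anytime principle, though not ARa A*'s specific step-size and elliptic-constraint mechanisms, can be sketched in the spirit of ARA*: repeated weighted A* searches with a shrinking heuristic inflation factor eps, where each pass returns a path whose cost is at most eps times optimal. The grid, costs, and eps schedule below are illustrative assumptions.

```python
import heapq

# Anytime loop in the spirit of ARA*: repeated weighted A* passes with a
# shrinking inflation factor eps; each pass yields cost <= eps * optimal.
# ARa A*'s variable step sizes and elliptic constraints are NOT reproduced.
def weighted_astar(grid, start, goal, eps):
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    g = {start: 0}
    parent = {start: None}
    heap = [(eps * h(start), 0, start)]
    while heap:
        _, gc, cur = heapq.heappop(heap)
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = parent[cur]
            return path[::-1], gc
        if gc > g.get(cur, float("inf")):
            continue                      # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                if gc + 1 < g.get(nxt, float("inf")):
                    g[nxt] = gc + 1
                    parent[nxt] = cur
                    heapq.heappush(heap, (gc + 1 + eps * h(nxt), gc + 1, nxt))
    return None, float("inf")

grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],      # 1 = obstacle
        [0, 1, 0, 0],
        [0, 0, 0, 0]]
costs = []
for eps in (3.0, 2.0, 1.0):               # tighten the suboptimality bound each pass
    _, cost = weighted_astar(grid, (0, 0), (3, 3), eps)
    costs.append(cost)
print(costs)                               # the final pass (eps = 1) is optimal
```

With eps = 1 the heuristic is admissible and the pass is exact, which is the "generate the optimal path when time is sufficient" endpoint of the anytime loop.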
Abstract: BACKGROUND: Eyelid reconstruction is an intricate process addressing both aesthetic and functional aspects after trauma or oncological surgery. Aesthetic concerns and oncological radicality guide personalized approaches. The complex anatomy, involving anterior and posterior lamellae, requires tailored reconstruction for optimal functionality. AIM: To formulate an eyelid reconstruction algorithm through an extensive literature review and to validate it by juxtaposing surgical outcomes from Cattinara Hospital…in dry eye and tears, which may lead to long-term consequences such as chronic conjunctivitis, discomfort, or photophobia. To prevent this issue, scars should be oriented vertically or perpendicularly to the free eyelid margin when the size of the tumor allows. In employing a malar flap to repair a lower eyelid defect, the malar incision must ascend diagonally; this facilitates enhanced flap advancement and mitigates ectropion by restricting vertical traction. Consequently, it is imperative that the generated tension remains consistently horizontal and never vertical [9]. Lagophthalmos is a disorder characterized by the inability to completely close the eyelids, leading to corneal exposure and an increased risk of keratitis or ulceration; it may arise following upper eyelid surgery. To avert this issue, it is essential to preserve a minimum of 1 cm of skin between the superior edge of the excision and the inferior boundary of the eyebrow. Epiphora may occur in cancers involving the lacrimal puncta, requiring their removal. As previously stated, when employing a glabellar flap to rectify medial canthal abnormalities, it is essential to prevent a trapdoor effect or thickening of the flap relative to the eyelid skin to which it is affixed. Constraints on our proposed algorithm encompass limited sample sizes and possible publication biases in existing studies. Subsequent investigations ought to examine long-term results to further refine the algorithm. Future research should evaluate the algorithm across varied populations and examine the impact of novel graft materials on enhancing reconstructive outcomes. CONCLUSION: Eyelid reconstruction remains one of the most intriguing challenges for a plastic surgeon today. The most fascinating aspect of this discipline is the need to restore the functionality of such an essential structure while maintaining its aesthetics. In our opinion, creating decision-making algorithms can facilitate reaching this goal by allowing individualization of the reconstructive path while minimizing the incidence of complications. The decreased incidence of severe complications indicates that the work is moving in the right direction. The absence of any need for reintervention, whether for reconstructive issues or for inadequate oncological radicality, signifies greater overall patient satisfaction, as patients do not have to undergo the stress of new surgeries. Even the minor complications recorded are in line with those reported in the literature and, even more importantly for patients, are of limited duration. In our experience, after a year of application, we can say that the objective has been achieved, but much more can still be done. Behind every work, a scientific basis must be continually renewed and refreshed to maintain high-quality standards. Therefore, searching for possible alternative solutions to include in one's surgical armamentarium is fundamental to providing the patient with a fully personalized option.
Funding: Supported by the National Key Research and Development Program of China (2023YFB3307801), the National Natural Science Foundation of China (62394343, 62373155, 62073142), the Major Science and Technology Project of Xinjiang (No. 2022A01006-4), the Programme of Introducing Talents of Discipline to Universities (the 111 Project) under Grant B17017, the Fundamental Research Funds for the Central Universities, Science Foundation of China University of Petroleum, Beijing (No. 2462024YJRC011), and the Open Research Project of the State Key Laboratory of Industrial Control Technology, China (Grant No. ICT2024B70).
Abstract: The distillation process is an important chemical process, and data-driven modelling has the potential to reduce model complexity compared with mechanistic modelling, thus improving the efficiency of process optimization and monitoring studies. However, the distillation process is highly nonlinear and has multiple uncertainty perturbation intervals, which challenges accurate data-driven modelling. This paper proposes a systematic data-driven modelling framework to address these problems. Firstly, data-segment variance is introduced into the K-means algorithm to form K-means data interval (KMDI) clustering, which clusters the data into perturbed and steady-state intervals for steady-state data extraction. Secondly, the maximal information coefficient (MIC) is employed to calculate the nonlinear correlation between variables and remove redundant features. Finally, extreme gradient boosting (XGBoost) is integrated as the base learner into adaptive boosting (AdaBoost), with an error threshold (ET) set to improve the weight-update strategy, yielding the new ensemble learning algorithm XGBoost-AdaBoost-ET. The superiority of the proposed framework is verified by applying it to a real industrial propylene distillation process.
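The variance-based clustering step can be illustrated with a hedged sketch: K-means on per-segment log-variance splits a synthetic signal into steady-state and perturbed intervals. The synthetic signal, segment length, log-variance feature, and deterministic two-cluster initialization are illustrative assumptions; the paper's actual KMDI formulation, the MIC filtering, and the XGBoost-AdaBoost-ET stages are not reproduced here.

```python
import numpy as np

# Synthetic process signal: steady operation, a high-variance disturbance
# interval, then steady operation again (all values are made up).
rng = np.random.default_rng(0)
steady1 = rng.normal(10.0, 0.05, 200)
perturbed = rng.normal(10.0, 1.0, 200)
steady2 = rng.normal(10.0, 0.05, 200)
signal = np.concatenate([steady1, perturbed, steady2])

seg_len = 20
segs = signal.reshape(-1, seg_len)          # 30 segments of 20 samples each
feat = np.log(segs.var(axis=1))             # log-variance separates the regimes

def kmeans_1d(x, iters=50):
    centers = np.array([x.min(), x.max()])  # deterministic, well-separated init
    labels = np.zeros(len(x), dtype=int)
    for _ in range(iters):
        labels = np.abs(x[:, None] - centers).argmin(axis=1)
        centers = np.array([x[labels == j].mean() for j in range(2)])
    return labels

labels = kmeans_1d(feat)
steady_mask = labels == labels[0]           # first segment is known steady-state
steady_data = segs[steady_mask]             # extracted steady-state intervals
print(steady_mask.astype(int))
```

The extracted `steady_data` plays the role of the steady-state training set that the later feature-selection and boosting stages of the framework would consume.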