Funding: This research work was funded by the Deanship of Scientific Research, Imam Mohammad Ibn Saud Islamic University (IMSIU), Saudi Arabia, through Grant No. 221412020.
Abstract: The generalized travelling salesman problem (GTSP), a generalization of the well-known travelling salesman problem (TSP), is considered in this study. Since the GTSP is NP-hard and very complex, finding exact solutions is highly expensive, so we develop genetic algorithms (GAs) to obtain heuristic solutions to the problem. Because crossover is a very important process in GAs, crossover methods proposed for the traditional TSP can be adapted to the GTSP. The sequential constructive crossover (SCX) and three other operators are adapted for use in GAs to solve the GTSP. The effectiveness of the GA using SCX is first verified on some GTSP Library (GTSPLIB) instances and then compared against GAs using the other crossover methods. The computational results show the success of the GA using SCX on this problem. Our proposed GA using SCX and swap mutation finds average solutions whose average percentage of excess over the best-known solutions is between 0.00 and 14.07 for the investigated instances.
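For readers unfamiliar with SCX, the following Python sketch illustrates the sequential-constructive idea for a plain TSP tour with cities labelled 0..n-1. The GTSP's cluster handling, the paper's exact tie-breaking rules, and the helper names are not taken from the source; this is only an assumed simplification of how such an operator can be coded.

```python
# Minimal sketch of a sequential-constructive-style crossover for a tour
# encoded as a list of city indices 0..n-1 (an illustrative assumption,
# not the authors' exact GTSP operator).
def scx(parent_a, parent_b, dist):
    n = len(parent_a)
    visited = {parent_a[0]}
    child = [parent_a[0]]

    def next_legitimate(parent, current):
        # First unvisited city appearing after `current` in this parent;
        # fall back to the first unvisited city in index order.
        i = parent.index(current)
        for city in parent[i + 1:]:
            if city not in visited:
                return city
        return next(c for c in range(n) if c not in visited)

    while len(child) < n:
        cur = child[-1]
        a = next_legitimate(parent_a, cur)
        b = next_legitimate(parent_b, cur)
        nxt = a if dist[cur][a] <= dist[cur][b] else b  # keep the cheaper edge
        child.append(nxt)
        visited.add(nxt)
    return child
```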
Abstract: This paper presents a novel method for reconstructing a highly accurate 3D model of the human nose from 2D images and pre-marked landmarks using algorithmic methods. The study focuses on the reconstruction of a 3D nose model tailored for applications in healthcare and cosmetic surgery. The approach leverages advanced image processing techniques, 3D Morphable Models (3DMM), and deformation techniques to overcome the limitations of deep learning models, particularly addressing the interpretability issues commonly encountered in medical applications. The proposed method estimates the 3D coordinates of landmark points using a 3D structure estimation algorithm. Sub-landmarks are extracted through image processing techniques and interpolation. The initial surface is generated using a 3DMM, though its accuracy remains limited. To enhance precision, deformation techniques are applied, utilizing the coordinates of 76 identified landmarks and sub-landmarks. The resulting 3D nose model is constructed based on algorithmic methods and pre-marked landmarks. Evaluation of the 3D model is conducted by comparing landmark distances and shape similarity with expert-determined ground truth on 30 Vietnamese volunteers aged 18 to 47, all of whom were either preparing for or required nasal surgery. Experimental results demonstrate a strong agreement between the reconstructed 3D model and the ground truth. The method achieved a mean landmark distance error of 0.631 mm and a shape error of 1.738 mm, demonstrating its potential for medical applications.
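The 0.631 mm figure is a mean landmark distance; a minimal sketch of how such an error could be computed is given below, assuming 76 reconstructed and ground-truth landmarks stored as (x, y, z) coordinates in millimetres (the array names and shapes are our assumptions).

```python
import numpy as np

# Hedged sketch: mean Euclidean distance between reconstructed and
# expert-annotated 3D landmarks, e.g. arrays of shape (76, 3) in mm.
def mean_landmark_error(pred, gt):
    pred, gt = np.asarray(pred), np.asarray(gt)
    return float(np.linalg.norm(pred - gt, axis=1).mean())
```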
Abstract: BACKGROUND Eyelid reconstruction is an intricate process, addressing both aesthetic and functional aspects after trauma or oncological surgery. Aesthetic concerns and oncological radicality guide personalized approaches. The complex anatomy, involving anterior and posterior lamellae, requires tailored reconstruction for optimal functionality. AIM To formulate an eyelid reconstruction algorithm through an extensive literature review and to validate it by juxtaposing surgical outcomes from Cattinara Hos- […] in dry eye and tears, which may lead to long-term consequences such as chronic conjunctivitis, discomfort, or photophobia. To prevent this issue, scars should be oriented vertically or perpendicularly to the free eyelid margin when the size of the tumor allows. When employing a malar flap to repair a lower eyelid defect, the malar incision must ascend diagonally; this facilitates enhanced flap advancement and mitigates ectropion by restricting vertical traction. Consequently, it is imperative to ensure that the generated tension remains consistently horizontal and never vertical[9]. Lagophthalmos is a disorder characterized by the inability to completely close the eyelids, leading to corneal exposure and an increased risk of keratitis or ulceration; it may arise following upper eyelid surgery. To avert this issue, it is essential to preserve a minimum of 1 cm of skin between the superior edge of the excision and the inferior boundary of the eyebrow. Epiphora may occur in cancers involving the lacrimal puncta, requiring their removal. As previously stated, when employing a glabellar flap to rectify medial canthal abnormalities, it is essential to prevent a trapdoor effect or thickening of the flap relative to the eyelid skin to which it is affixed. Constraints of our proposed algorithm encompass limited sample sizes and possible publication biases in existing studies. Subsequent investigations ought to examine long-term results to further refine the algorithm. Future research should evaluate the algorithm across varied populations and examine the impact of novel graft materials on enhancing reconstructive outcomes. CONCLUSION Eyelid reconstruction remains one of the most intriguing challenges for a plastic surgeon today. The most fascinating aspect of this discipline is the need to restore the functionality of such an essential structure while maintaining its aesthetics. In our opinion, creating decision-making algorithms can facilitate reaching this goal by allowing for the individualization of the reconstructive path while minimizing the incidence of complications. The fact that we have decreased the incidence of severe complications is a sign that the work is moving in the right direction. The fact that there has been no need for reintervention, neither for reconstructive issues nor for inadequate oncological radicality, overall signifies greater patient satisfaction, as patients do not have to undergo the stress of new surgeries. Even the minor complications recorded are in line with those reported in the literature and, even more importantly for patients, are of limited duration. In our experience, after a year of application, we can say that the objective has been achieved, but much more can still be done. Behind every work, a scientific basis must be continually renewed and refreshed to maintain high-quality standards. Therefore, searching for possible alternative solutions to be included in one's surgical armamentarium is fundamental to providing the patient with a fully personalized option.
Abstract: This study investigates how artificial intelligence (AI) algorithms enable mainstream media to achieve precise emotional matching and improve communication efficiency through reconstructed communication logic. As digital intelligence technology rapidly evolves, mainstream media organizations are increasingly leveraging AI-driven empathy algorithms to enhance audience engagement and optimize content delivery. This research employs a mixed-methods approach, combining quantitative analysis of algorithmic performance metrics with qualitative examination of media communication patterns. Through a systematic review of 150 academic papers and analysis of data from 12 major media platforms, this study reveals that algorithmic empathy systems can improve emotional resonance by 34.7% and increase audience engagement by 28.3% compared to traditional communication methods. The findings demonstrate that AI algorithms reconstruct media communication logic through three primary pathways: emotional pattern recognition, personalized content curation, and real-time sentiment adaptation. However, the study also identifies significant challenges, including algorithmic bias, emotional authenticity concerns, and the ethical implications of automated empathy. The research contributes to understanding how mainstream media can leverage AI technology to build high-quality empathetic communication while maintaining journalistic integrity and social responsibility.
Abstract: Background: The nasal alar defect in Asians remains a challenging issue, as do clear classification and algorithm guidance, despite numerous previously described surgical techniques. The aim of this study is to propose a surgical algorithm that addresses the appropriate surgical procedures for different types of nasal alar defects in Asian patients. Methods: A retrospective case note review was conducted on 32 patients with nasal alar defects who underwent reconstruction between 2008 and 2022. Based on careful analysis and our clinical experience, we proposed a classification system for nasal alar defects and presented a reconstructive algorithm. Patient data, including age, sex, diagnosis, surgical options, and complications, were assessed. The extent of surgical scar formation was evaluated using standard photography based on a 4-grade scar scale. Results: Among the 32 patients, there were 20 males and 12 females with nasal alar defects. The predominant cause of trauma in China was industrial factors. The majority of alar defects were classified as type ⅠC (n = 8, 25%), comprising 18 cases (56.2%); there were 5 cases (15.6%) of type Ⅱ defects, 7 (21.9%) of type Ⅲ defects, and 2 (6.3%) of type Ⅳ defects. The most common surgical option was the auricular composite graft (n = 8, 25%), followed by the bilobed flap (n = 6, 18.8%), free auricular composite flap (n = 4, 12.5%), and primary closure (n = 3, 9.4%). Satisfactory improvements were observed postoperatively. Conclusion: Factors contributing to the classifications were analyzed and defined, providing a framework for the proposed classification system. The reconstructive algorithm offers surgeons appropriate procedures for treating nasal alar defects in Asians.
Funding: The study was registered with the Chinese Clinical Trial Registry (No. ChiCTR2000040109) and approved by the Hospital Ethics Committee (No. 20210130017).
Abstract: BACKGROUND Difficulty of colonoscopy insertion (DCI) significantly affects colonoscopy effectiveness and serves as a key quality indicator. Predicting and evaluating DCI risk preoperatively is crucial for optimizing intraoperative strategies. AIM To evaluate the predictive performance of machine learning (ML) algorithms for DCI by comparing three modeling approaches, to identify factors influencing DCI, and to develop a preoperative prediction model using ML algorithms to enhance colonoscopy quality and efficiency. METHODS This cross-sectional study enrolled 712 patients who underwent colonoscopy at a tertiary hospital between June 2020 and May 2021. Demographic data, past medical history, medication use, and psychological status were collected. The endoscopist assessed DCI using a visual analogue scale. After univariate screening, predictive models were developed using multivariable logistic regression, least absolute shrinkage and selection operator (LASSO) regression, and random forest (RF) algorithms. Model performance was evaluated based on discrimination, calibration, and decision curve analysis (DCA), and the results were visualized using nomograms. RESULTS A total of 712 patients (53.8% male; mean age 54.5 ± 12.9 years) were included. Logistic regression analysis identified constipation [odds ratio (OR) = 2.254, 95% confidence interval (CI): 1.289-3.931], abdominal circumference (AC) (77.5-91.9 cm: OR = 1.895, 95% CI: 1.065-3.350; AC ≥ 92 cm: OR = 1.271, 95% CI: 0.730-2.188), and anxiety (OR = 1.071, 95% CI: 1.044-1.100) as predictive factors for DCI, validated by the LASSO and RF methods. Model performance revealed training/validation sensitivities of 0.826/0.925, 0.924/0.868, and 1.000/0.981; specificities of 0.602/0.511, 0.510/0.562, and 0.977/0.526; and corresponding areas under the receiver operating characteristic curve (AUCs) of 0.780 (0.737-0.823)/0.726 (0.654-0.799), 0.754 (0.710-0.798)/0.723 (0.656-0.791), and 1.000 (1.000-1.000)/0.754 (0.688-0.820), respectively. DCA indicated optimal net benefit within probability thresholds of 0-0.9 and 0.05-0.37. The RF model demonstrated superior diagnostic accuracy, reflected by perfect training sensitivity (1.000) and the highest validation AUC (0.754), outperforming the other methods in clinical applicability. CONCLUSION The RF-based model exhibited superior predictive accuracy for DCI compared with the multivariable logistic and LASSO regression models. This approach supports individualized preoperative optimization, enhancing colonoscopy quality through targeted risk stratification.
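As a point of reference, the random-forest branch of such a workflow can be sketched as follows; the placeholder data, the 70/30 split, and the 0.5 decision threshold are illustrative assumptions rather than the study's actual protocol.

```python
# Hedged sketch: fit a random forest on a training split and report
# sensitivity, specificity, and AUC on the validation split.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score, confusion_matrix
from sklearn.model_selection import train_test_split

# 712 patients, 3 illustrative predictors (constipation, AC group, anxiety score)
X, y = np.random.rand(712, 3), np.random.randint(0, 2, 712)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
prob = rf.predict_proba(X_va)[:, 1]
pred = (prob >= 0.5).astype(int)
tn, fp, fn, tp = confusion_matrix(y_va, pred).ravel()
print("sensitivity", tp / (tp + fn),
      "specificity", tn / (tn + fp),
      "AUC", roc_auc_score(y_va, prob))
```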
Funding: Supported by the Basic Science Center Program for Multiphase Evolution in Hyper Gravity of the National Natural Science Foundation of China (No. 51988101), the National Natural Science Foundation of China (No. 52178306), and the Zhejiang Provincial Natural Science Foundation of China (No. LR19E080002).
Abstract: During construction, the shield linings of tunnels often face the problem of local or overall upward movement after leaving the shield tail in soft soil areas or in some large-diameter shield projects. Differential floating increases the initial stress on the segments and bolts, which is harmful to the service performance of the tunnel. In this study we used a random forest (RF) algorithm combined with particle swarm optimization (PSO) and 5-fold cross-validation (5-fold CV) to predict the maximum upward displacement of tunnel linings induced by shield tunnel excavation. The mechanisms and factors causing upward movement of the tunnel lining are comprehensively summarized. Twelve input variables were selected according to the results of the analysis of influencing factors. The prediction performance of two models, PSO-RF and RF (default), was compared. The Gini value was obtained to represent the relative importance of the influencing factors to the upward displacement of the linings. The PSO-RF model successfully predicted the maximum upward displacement of the tunnel linings with a low error (mean absolute error (MAE) = 4.04 mm, root mean square error (RMSE) = 5.67 mm) and high correlation (R² = 0.915). The thrust and depth of the tunnel were the most important factors in the prediction model influencing the upward displacement of the tunnel linings.
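A minimal sketch of the PSO-RF idea, assuming a plain particle swarm searching over two random-forest hyperparameters scored by 5-fold cross-validated RMSE, is given below; the search ranges, swarm settings, and synthetic data are ours, not the paper's configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X, y = rng.random((200, 12)), rng.random(200)          # 12 influencing factors

def rmse_cv(params):
    # Score one particle: 5-fold CV RMSE of an RF with these hyperparameters.
    rf = RandomForestRegressor(n_estimators=int(params[0]),
                               max_depth=int(params[1]), random_state=0)
    mse = -cross_val_score(rf, X, y, cv=5, scoring="neg_mean_squared_error").mean()
    return np.sqrt(mse)

lo, hi = np.array([50, 2]), np.array([300, 20])         # assumed search bounds
pos = rng.uniform(lo, hi, size=(10, 2))
vel = np.zeros_like(pos)
pbest, pcost = pos.copy(), np.array([rmse_cv(p) for p in pos])
gbest = pbest[pcost.argmin()]
for _ in range(20):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    cost = np.array([rmse_cv(p) for p in pos])
    better = cost < pcost
    pbest[better], pcost[better] = pos[better], cost[better]
    gbest = pbest[pcost.argmin()]
print("best (n_estimators, max_depth):", gbest.astype(int))
```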
Abstract: Thank you to the Asian Journal of Urology (AJU) for the honor of allowing me to be the VIP editor for this special focus section on robotic urinary tract reconstruction. This topic has been a large focus of my career; in my pursuit of knowledge in this new sub-field of urology, I have been fortunate to meet so many talented surgeons who share a similar passion. The urinary tract spans a large anatomical region, and due to the large variety of conditions that affect it, an endless variety of functional and structural urologic problems can arise. Urologists have always been adept surgeons capable of operating in various anatomical spaces and have embraced technological innovation. Historically, the trend has moved from open surgery to endoscopic treatment; however, many patients with reconstructive needs remain untreated or sub-optimally managed.
Abstract: To fully leverage the advantages of mechanization and informatization in tunnel boring machine (TBM) operations, the authors aim to promote the advancement of tunnel construction technology toward intelligent development. This involved exploring the deep integration of next-generation artificial intelligence technologies, such as sensing technology, automatic control technology, big data technology, deep learning, and machine vision, with key operational processes, including TBM excavation, direction adjustment, step changes, inverted arch block assembly, material transportation, and operation status assurance. The results of this integration are summarized as follows. (1) A TBM key excavation parameter prediction algorithm was developed with an accuracy rate exceeding 90%. The TBM intelligent step-change control algorithm, based on machine vision, achieved an image segmentation accuracy rate of 95% and a gripper shoe positioning error of ±5 mm. (2) An automatic positioning system for inverted arch blocks was developed, enabling real-time perception of the spatial position and deviation during the assembly process. The system maintains an elevation positioning deviation within ±3 mm and a horizontal positioning deviation within ±10 mm, reducing the number of surveyors in each work team. (3) A TBM intelligent rail transportation system was designed that achieves real-time human-machine positioning, automatic switch opening and closing, automatic obstacle avoidance, intelligent transportation planning, and integrated scheduling and command. Each locomotive formation requires one fewer shunter, and comprehensive transportation efficiency is improved by more than 20%. (4) Intelligent analysis and prediction algorithms were developed to monitor and predict the trends of the hydraulic and gear oil parameters in real time, enhancing proactive maintenance and system reliability.
Funding: Supported by the National Natural Science Foundation of China (Nos. U21A20447 and 61971079).
Abstract: The polar codes produced by the traditional Gaussian approximation (GA) construction have a small minimum Hamming weight (MHW), and their performance is therefore not good enough. To address this, an improved channel construction algorithm for polar codes based on frozen bits is proposed: the construction of the Reed-Muller (RM) code is combined with the GA construction to effectively increase the MHW, and the correcting and checking functions of the frozen bits in successive cancellation list (SCL) decoding are analyzed.
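One common way to blend RM-style row-weight information into a reliability-based polar construction is sketched below; the reliability vector (e.g., from a Gaussian approximation) and the row-weight threshold are placeholders, and this is only an assumed simplification of such a hybrid construction, not the authors' algorithm.

```python
# Hedged sketch: choose information positions by reliability, but prefer
# positions whose generator-matrix row weight (2**popcount(i)) meets a
# minimum, which is the RM-flavoured way to raise the minimum Hamming weight.
def construct(N, K, reliability, min_row_weight):
    row_weight = [2 ** bin(i).count("1") for i in range(N)]
    by_reliability = sorted(range(N), key=lambda i: reliability[i], reverse=True)
    info = [i for i in by_reliability if row_weight[i] >= min_row_weight][:K]
    if len(info) < K:                      # relax the constraint if needed
        extra = [i for i in by_reliability if i not in info]
        info += extra[:K - len(info)]
    frozen = sorted(set(range(N)) - set(info))
    return sorted(info), frozen
```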
Funding: Project supported by the National Natural Science Foundation of China (Grant No. 12027812) and the Guangdong Basic and Applied Basic Research Foundation of Guangdong Province, China (Grant No. 2021A1515111031).
Abstract: The low-density imaging performance of a zone-plate-based nano-resolution hard X-ray computed tomography (CT) system can be significantly improved by incorporating a grating-based Lau interferometer. Due to diffraction, however, the acquired nano-resolution phase signal may suffer from a splitting problem, which impedes the direct reconstruction of nano-resolution phase contrast CT (nPCT) images. To overcome this, a new model-driven nPCT image reconstruction algorithm is developed in this study. In it, the diffraction procedure is mathematically modeled as a matrix B, from which projections without signal splitting can be generated by inversion. Furthermore, a penalized weighted least-squares model with total variation (PWLS-TV) is employed to denoise these projections, from which nPCT images with high accuracy are directly reconstructed. Numerical experiments demonstrate that this new algorithm is able to work with phase projections having any splitting distance. Moreover, the results also reveal that nPCT images of higher signal-to-noise ratio (SNR) can be reconstructed from projections having larger splitting distances. In summary, a novel model-driven nPCT image reconstruction algorithm with high accuracy and robustness is verified for Lau interferometer-based hard X-ray nano-resolution phase contrast imaging.
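For orientation, a generic penalized weighted least-squares objective with total-variation regularization takes the following form (our notation; the paper's exact formulation may differ), where y is the measured split projection, B the diffraction model, W a statistical weighting matrix, and λ the regularization strength:

```latex
\hat{p} = \arg\min_{p}\;(y - Bp)^{\mathsf{T}} W (y - Bp) + \lambda\,\mathrm{TV}(p)
```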
Abstract: In the data-driven era of the internet and business environments, constructing accurate user profiles is paramount for personalized user understanding and classification. The traditional TF-IDF algorithm has some limitations when evaluating the impact of words on classification results. Consequently, an improved TF-IDF-K algorithm, which includes an equalization factor, was introduced in this study, aimed at constructing user profiles by processing and analyzing user search records. Through the training and prediction capabilities of a Support Vector Machine (SVM), it enabled the prediction of user demographic attributes. The experimental results demonstrated that the TF-IDF-K algorithm achieves a significant improvement in classification accuracy and reliability.
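A minimal sketch of such a profile-classification pipeline, using standard TF-IDF features and a linear SVM, is shown below; the equalization factor that distinguishes TF-IDF-K is paper-specific and is not reproduced here, and the example records and labels are invented for illustration.

```python
# Hedged sketch: TF-IDF features from user search records feeding a linear SVM
# to predict a demographic attribute. Replace the plain TF-IDF weighting with
# the TF-IDF-K equalization factor to match the paper's method.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

records = ["laptop gpu benchmark", "baby stroller reviews", "gpu driver update"]
labels = ["tech", "parenting", "tech"]          # illustrative demographic tags

pipeline = make_pipeline(TfidfVectorizer(), LinearSVC())
pipeline.fit(records, labels)
print(pipeline.predict(["best gpu for deep learning"]))
```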
Funding: Supported by the National Natural Science Foundation of China (NSFC) under Grant No. 51677058.
Abstract: Precisely estimating the state of health (SOH) of lithium-ion batteries is essential for battery management systems (BMS), as it plays a key role in ensuring the safe and reliable operation of battery systems. However, current SOH estimation methods often overlook the valuable temperature information that can effectively characterize battery aging during capacity degradation. Additionally, the Elman neural network, which is commonly employed for SOH estimation, exhibits several drawbacks, including slow training speed, a tendency to become trapped in local minima, and the initialization of weights and thresholds with pseudo-random numbers, leading to unstable model performance. To address these issues, this study proposes a method for estimating the SOH of lithium-ion batteries based on differential thermal voltammetry (DTV) and an SSA-Elman neural network. First, two health features (HFs) considering temperature factors and battery voltage are extracted from the differential thermal voltammetry curves and incremental capacity curves. Next, the Sparrow Search Algorithm (SSA) is employed to optimize the initial weights and thresholds of the Elman neural network, forming the SSA-Elman neural network model. To validate the performance, various neural networks, including the proposed SSA-Elman network, are tested using the Oxford battery aging dataset. The experimental results demonstrate that the method developed in this study achieves superior accuracy and robustness, with a mean absolute error (MAE) of less than 0.9% and a root mean square error (RMSE) below 1.4%.
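A DTV curve is commonly taken as the derivative of cell temperature with respect to terminal voltage during discharge; a hedged sketch of extracting such a curve is given below, with the smoothing window and input conventions assumed rather than taken from the paper.

```python
import numpy as np

# Hedged sketch: differential thermal voltammetry as dT/dV over a discharge
# segment. `voltage` and `temperature` are 1-D arrays sampled at the same
# instants; a simple moving average suppresses sensor noise before the
# derivative is taken.
def dtv_curve(voltage, temperature, window=5):
    v = np.convolve(voltage, np.ones(window) / window, mode="valid")
    t = np.convolve(temperature, np.ones(window) / window, mode="valid")
    return v, np.gradient(t, v)          # (voltage grid, dT/dV)
```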
Funding: National Natural Science Foundation of China (11971211, 12171388).
Abstract: Complex network models are frequently employed for simulating and studying diverse real-world complex systems. Among these models, scale-free networks typically exhibit greater fragility to malicious attacks. Consequently, enhancing the robustness of scale-free networks has become a pressing issue. To address this problem, this paper proposes a Multi-Granularity Integration Algorithm (MGIA), which aims to improve the robustness of scale-free networks while keeping the initial degree of each node unchanged, ensuring network connectivity, and avoiding the generation of multiple edges. The algorithm generates a multi-granularity structure from the initial network to be optimized, then uses different optimization strategies to optimize the networks at the various granular layers in this structure, and finally realizes information exchange between different granular layers, thereby further enhancing the optimization effect. We propose new network refresh, crossover, and mutation operators to ensure that the optimized network satisfies the given constraints. Meanwhile, we propose new network similarity and network dissimilarity evaluation metrics to improve the effectiveness of the optimization operators in the algorithm. In the experiments, the MGIA enhances the robustness of the scale-free network by 67.6%. This improvement is approximately 17.2% higher than the optimization effects achieved by eight existing complex network robustness optimization algorithms.
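The degree, connectivity, and multi-edge constraints mentioned above can be enforced with degree-preserving double-edge swaps; the sketch below shows only that constraint mechanism on a synthetic scale-free graph, not the MGIA optimization itself.

```python
import random
import networkx as nx

# Hedged sketch: rewire a scale-free network with degree-preserving
# double-edge swaps, rejecting any swap that would create parallel edges or
# self-loops and undoing any swap that disconnects the graph.
def constrained_rewire(G, n_swaps, seed=0):
    rng = random.Random(seed)
    G = G.copy()
    for _ in range(n_swaps):
        (a, b), (c, d) = rng.sample(list(G.edges()), 2)
        if len({a, b, c, d}) < 4 or G.has_edge(a, d) or G.has_edge(c, b):
            continue
        G.remove_edges_from([(a, b), (c, d)])
        G.add_edges_from([(a, d), (c, b)])
        if not nx.is_connected(G):                 # undo if connectivity breaks
            G.remove_edges_from([(a, d), (c, b)])
            G.add_edges_from([(a, b), (c, d)])
    return G

G0 = nx.barabasi_albert_graph(200, 2, seed=1)
G1 = constrained_rewire(G0, 500)
assert sorted(dict(G0.degree()).values()) == sorted(dict(G1.degree()).values())
```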
Funding: Supported by the Yunnan Provincial Basic Research Project (202401AT070344, 202301AT070443), the National Natural Science Foundation of China (62263014, 52207105), the Yunnan Lancang-Mekong International Electric Power Technology Joint Laboratory (202203AP140001), and the Major Science and Technology Projects in Yunnan Province (202402AG050006).
Abstract: Accurate short-term wind power forecasting plays a crucial role in maintaining the safety and economic efficiency of smart grids. Although numerous studies have employed various methods to forecast wind power, there remains a research gap in leveraging swarm intelligence algorithms to optimize the hyperparameters of the Transformer model for wind power prediction. To improve the accuracy of short-term wind power forecasts, this paper proposes a hybrid short-term wind power forecasting approach named STL-IAOA-iTransformer, which is based on seasonal and trend decomposition using LOESS (STL) and an iTransformer model optimized by an improved arithmetic optimization algorithm (IAOA). First, to fully extract the power data features, STL is used to decompose the original data into components with less redundant information. The extracted components, as well as the weather data, are then input into iTransformer for short-term wind power forecasting. The final predicted short-term wind power curve is obtained by combining the predicted components. To improve the model accuracy, IAOA is employed to optimize the hyperparameters of iTransformer. The proposed approach is validated using real generation data from different seasons and different power stations in Northwest China, and ablation experiments have been conducted. Furthermore, to validate the superiority of the proposed approach under different wind characteristics, real power generation data from Southwest China are utilized for experiments. The comparative results with six other state-of-the-art prediction models show that the proposed model fits the true generation series well and achieves high prediction accuracy.
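The STL step can be reproduced with standard tooling; the sketch below assumes a 15-minute-resolution power series with a daily seasonal period, both of which are illustrative choices rather than the paper's settings.

```python
# Hedged sketch of the STL step: decompose a wind-power series into trend,
# seasonal, and residual components before feeding them (plus weather data)
# to the forecaster.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

idx = pd.date_range("2024-01-01", periods=96 * 30, freq="15min")
power = pd.Series(np.random.rand(len(idx)), index=idx)   # placeholder data

res = STL(power, period=96).fit()          # 96 samples per day
components = pd.DataFrame({"trend": res.trend,
                           "seasonal": res.seasonal,
                           "resid": res.resid})
```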
Abstract: The intersection of the Industrial Internet of Things (IIoT) and artificial intelligence (AI) has garnered ever-increasing attention and research interest. Nevertheless, the dilemma between the strictly resource-constrained nature of IIoT devices and the extensive resource demands of AI has not yet been fully addressed with a comprehensive solution. Taking advantage of the lightweight constructive neural network (LightGCNet) for developing fast learner models for IIoT, a convex geometric constructive neural network with a low-complexity control strategy, namely ConGCNet, is proposed in this article via convex optimization and matrix theory; it enhances the convergence rate and reduces the computational consumption in comparison with LightGCNet. Firstly, a low-complexity control strategy is proposed to reduce the computational consumption during the training of the hidden parameters. Secondly, a novel output-weight evaluation method based on convex optimization is proposed to guarantee the convergence rate. Finally, the universal approximation property of ConGCNet is proved under the low-complexity control strategy and the convex output-weight evaluation method. Simulation results, including four benchmark datasets and a real-world ore grinding process, demonstrate that ConGCNet effectively reduces computational consumption in the modelling process and improves the model's convergence rate.
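As a rough illustration of the constructive idea (not ConGCNet's convex geometric construction itself), the sketch below adds random sigmoid hidden nodes one at a time and re-solves the output weights by least squares at each step, which is itself a convex subproblem; all parameter choices are assumptions.

```python
import numpy as np

# Hedged sketch of an incremental (constructive) network: hidden nodes are
# appended until the training RMSE drops below `tol`, and the output weights
# are re-evaluated by a least-squares solve after every addition.
def constructive_fit(X, y, max_nodes=50, tol=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    W, b = [], []
    for _ in range(max_nodes):
        W.append(rng.normal(size=X.shape[1]))
        b.append(rng.normal())
        H = 1.0 / (1.0 + np.exp(-(X @ np.array(W).T + np.array(b))))
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # convex output-weight step
        if np.sqrt(np.mean((H @ beta - y) ** 2)) < tol:
            break
    return np.array(W), np.array(b), beta

X = np.random.rand(100, 4)
y = np.sin(X.sum(axis=1))
W, b, beta = constructive_fit(X, y)
```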
Abstract: The scheduling of construction equipment is a means to realize network planning. With the large-scale and low-cost requirements of engineering construction, cooperation among members of the engineering supply chain has become very important, and effective coordination of project plans at all levels to optimize the resource management and scheduling of a project helps reduce project duration and cost. In this paper, under milestone constraint conditions, the scheduling problems of multiple construction devices in the same sequence of operations were described and formulated mathematically, and scheduling models for multiple pieces of equipment were established. The Palmer algorithm, the CDS algorithm, and the Gupta algorithm were used to solve the optimal scheduling of construction equipment and thereby optimize the construction period. The optimal scheduling of a single construction device and of multiple construction devices was solved using sequencing theory under milestone constraints; these methods obtain reasonable results, which has important guiding significance for the scheduling of construction equipment.
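Of the three heuristics, Palmer's slope-index rule is the simplest to illustrate; the sketch below orders jobs by slope index and evaluates the resulting permutation's makespan, with the processing-time data and stage count being illustrative assumptions.

```python
# Hedged sketch of Palmer's slope-index heuristic for a permutation flowshop
# (jobs = units of work, machines = construction stages). p[j][k] is the
# processing time of job j on stage k.
def palmer_order(p):
    m = len(p[0])
    slope = lambda job: sum((2 * (k + 1) - m - 1) * t for k, t in enumerate(job))
    return sorted(range(len(p)), key=lambda j: slope(p[j]), reverse=True)

def makespan(p, order):
    m = len(p[0])
    finish = [0] * m                          # completion time on each stage
    for j in order:
        for k in range(m):
            finish[k] = max(finish[k], finish[k - 1] if k else 0) + p[j][k]
    return finish[-1]

p = [[5, 9, 8], [9, 3, 10], [9, 4, 5], [4, 8, 8]]    # illustrative times
order = palmer_order(p)
print(order, makespan(p, order))
```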
Funding: Shanghai Municipal Natural Science Foundation of China (No. 10ZR1431700).
Abstract: Aiming at the flexible flowshop group scheduling problem, and taking sequence-dependent setup times and machine skipping into account, a mathematical model for minimizing makespan is established, and a hybrid differential evolution (HDE) algorithm based on a greedy constructive procedure (GCP) is proposed, which combines differential evolution (DE) with tabu search (TS). DE is applied to generate the elite individuals of the population, while TS is used to find the optimal value by perturbing selected elite individuals. A lower-bounding technique is developed to evaluate the quality of the proposed algorithm. Experimental results verify the effectiveness and feasibility of the proposed algorithm.
Funding: Supported by the National Natural Science Foundation of China (No. 62373027).
Abstract: In disaster relief operations, multiple UAVs can be used to search for trapped people. In recent years, many researchers have proposed machine learning-based algorithms, sampling-based algorithms, and heuristic algorithms to solve the problem of multi-UAV path planning. Among these, the Dung Beetle Optimization (DBO) algorithm has been widely applied due to its diverse search patterns. However, the update strategies for the rolling and thieving dung beetles of the DBO algorithm are overly simplistic, potentially leading to an inability to fully explore the search space and a tendency to converge to local optima, and thereby not guaranteeing the discovery of the optimal path. To address these issues, we propose an improved DBO algorithm guided by the Landmark Operator (LODBO). Specifically, we first use tent mapping in the population initialization strategy, which enables the algorithm to generate initial solutions with enhanced diversity within the search space. Second, we expand the search range of the rolling dung beetle by using the landmark factor. Finally, by using an adaptive factor that changes with the number of iterations, we improve the global search ability of the stealing dung beetle, making it more likely to escape from local optima. To verify the effectiveness of the proposed method, extensive simulation experiments are conducted, and the results show that the LODBO algorithm obtains the optimal path in the shortest time compared with the Genetic Algorithm (GA), the Gray Wolf Optimizer (GWO), the Whale Optimization Algorithm (WOA), and the original DBO algorithm on the disaster search and rescue task set.
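The tent-map initialization mentioned first can be sketched as follows; the tent parameter, iteration count, and search bounds are illustrative assumptions rather than the paper's settings.

```python
import numpy as np

# Hedged sketch of tent-map population initialization: a chaotic tent sequence
# replaces plain uniform random numbers so the initial population spreads more
# evenly over the search bounds.
def tent_map_population(pop_size, dim, lower, upper, mu=0.7, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.random((pop_size, dim))
    for _ in range(20):                          # iterate the tent map
        x = np.where(x < mu, x / mu, (1.0 - x) / (1.0 - mu))
    return lower + x * (upper - lower)

pop = tent_map_population(30, 3, np.array([0, 0, 0]), np.array([100, 100, 50]))
```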
Funding: Supported by the Natural Science Foundation of Chongqing (General Program, No. CSTB2022NSCQ-MSX0884) and the Discipline Teaching Special Project of Yangtze Normal University (csxkjx14).
Abstract: In this paper, we prove that Euclid's algorithm, Bezout's equation, and the Division algorithm are equivalent to each other. Our result shows that Euclid had preliminarily established the theory of divisibility and the greatest common divisor. We further provide several suggestions for teaching.
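A compact illustration of the equivalence is the extended Euclidean algorithm, which runs Euclid's algorithm and recovers Bezout coefficients along the way:

```python
# Euclid's algorithm extended to return x, y with a*x + b*y = gcd(a, b),
# i.e. a constructive witness of Bezout's equation.
def extended_gcd(a, b):
    if b == 0:
        return a, 1, 0
    g, x, y = extended_gcd(b, a % b)
    return g, y, x - (a // b) * y

g, x, y = extended_gcd(240, 46)
assert g == 240 * x + 46 * y        # Bezout's equation
print(g, x, y)                      # 2, -9, 47
```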