This paper considers an ant colony optimization algorithm based on an AND/OR graph for integrated process planning and scheduling (IPPS). Generally, process planning and scheduling are studied separately. Given the complexity of manufacturing systems, IPPS, which combines both process planning and scheduling, can better depict the real situation of a manufacturing system. The IPPS problem is represented on an AND/OR graph consisting of nodes and undirected and directed arcs. The nodes denote operations of jobs, and the undirected/directed arcs denote possible visiting paths among the nodes. The ant colony traverses the necessary nodes on the graph from the starting node to the end node to obtain the optimal solution, with the objective of minimizing makespan. To avoid premature convergence and slow convergence speed, improved strategies are incorporated into the standard ant colony optimization algorithm. Extensive computational experiments are carried out to study the influence of various parameters on system performance.
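The two core ant-colony steps described above — choosing the next operation node by pheromone-weighted roulette-wheel selection, then evaporating and depositing pheromone in proportion to tour quality — can be sketched generically. This is a minimal illustration under assumed names and parameter values, not the authors' implementation:

```python
import random

def choose_next(current, candidates, pheromone, heuristic, alpha=1.0, beta=2.0):
    # Score each candidate node by pheromone^alpha * heuristic^beta,
    # then pick one by roulette-wheel selection.
    weights = [(pheromone[(current, j)] ** alpha) * (heuristic[(current, j)] ** beta)
               for j in candidates]
    r = random.uniform(0.0, sum(weights))
    acc = 0.0
    for j, w in zip(candidates, weights):
        acc += w
        if acc >= r:
            return j
    return candidates[-1]

def update_pheromone(pheromone, tour_edges, makespan, rho=0.1, q=1.0):
    # Evaporate on every edge, then deposit on the edges of the finished tour;
    # tours with a shorter makespan deposit more pheromone.
    for e in pheromone:
        pheromone[e] *= 1.0 - rho
    for e in tour_edges:
        pheromone[e] += q / makespan
```

With a makespan objective, the deposit term `q / makespan` is what steers the colony toward shorter schedules over successive iterations.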
Owing to the constraints of depth sensing technology, images acquired by depth cameras are inevitably mixed with various noises. For depth maps presented in gray values, this research proposes a novel denoising model, termed DGLR-GBT, which combines the graph-based transform (GBT) with dual graph Laplacian regularization (DGLR). This model aims to remove Gaussian white noise by capitalizing on the nonlocal self-similarity (NSS) and piecewise smoothness properties intrinsic to depth maps. Within the group sparse coding (GSC) framework, a combination of GBT and DGLR is implemented. Firstly, within each group, the graph is constructed using estimates of the true values of the averaged blocks instead of the observations. Secondly, graph Laplacian regularization terms are constructed from the rows and columns of similar block groups, respectively. Lastly, the solution is obtained efficiently by combining the alternating direction method of multipliers (ADMM) with weighted thresholding in the GBT domain.
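The graph Laplacian regularizer at the heart of DGLR can be illustrated on a small graph. The sketch below builds the combinatorial Laplacian L = D − W from edge weights and evaluates the smoothness term xᵀLx, which is small when the signal varies little across edges; the Gaussian edge-weight choice is a common assumption for illustration, not necessarily the paper's exact construction:

```python
import math

def graph_laplacian(weights, n):
    # weights: dict {(i, j): w} on an undirected graph with nodes 0..n-1.
    # Returns the combinatorial Laplacian L = D - W as a dense list of lists.
    L = [[0.0] * n for _ in range(n)]
    for (i, j), w in weights.items():
        L[i][i] += w
        L[j][j] += w
        L[i][j] -= w
        L[j][i] -= w
    return L

def glr(x, L):
    # Graph Laplacian regularizer x^T L x = sum over edges of w_ij (x_i - x_j)^2:
    # zero for a constant signal, larger the less smooth x is over the graph.
    n = len(x)
    return sum(x[i] * L[i][j] * x[j] for i in range(n) for j in range(n))

def gaussian_weight(a, b, sigma=1.0):
    # Edge weight from the similarity of two block estimates (illustrative choice).
    return math.exp(-((a - b) ** 2) / (2.0 * sigma ** 2))
```

In a denoising objective, glr(x, L) would be added as a penalty so that the recovered depth patch stays piecewise smooth with respect to the constructed graph.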
In this study, we used the multi-resolution graph-based clustering (MRGC) method for determining the electrofacies (EF) and lithofacies (LF) from well log data obtained from the intraplatform bank gas fields located in the Amu Darya Basin. The MRGC could automatically determine the optimal number of clusters without prior knowledge about the structure or cluster numbers of the analyzed data set, and it allowed users to control the level of detail actually needed to define the EF. Based on the LF identification and successful EF calibration using core data, an MRGC EF partition model including five clusters and a quantitative LF interpretation chart were constructed. The EF clusters 1 to 5 were interpreted as lagoon, anhydrite flat, interbank, low-energy bank, and high-energy bank, and the coincidence rate in the cored interval could reach 85%. We concluded that the MRGC could be accurately applied to predict the LF in non-cored but logged wells. Therefore, continuous EF clusters were partitioned, the corresponding LF were interpreted, and the distribution and petrophysical characteristics of different LF were analyzed in the framework of sequence stratigraphy.
A better understanding of the relationship between the structure and functions of urban and suburban spaces is one of the avenues of research still open for geographical information science. The research presented in this paper develops several graph-based metrics whose objective is to characterize local and global structural properties that reflect the way the overall building layout can be cross-related to that of the road layout. Such structural properties are modeled as an aggregation of parcels, buildings, and road networks. We introduce several computational measures (Ratio Minimum Distance, Minimum Ratio Minimum Distance, and Metric Compactness) that evaluate the capability of a given road to be connected with the whole road network. These measures reveal emerging sub-network structures and point out differences between less-connective and more-connective parts of the network. Based on these local and global properties derived from the topological and graph-based representation, and on building density metrics, this paper proposes an analysis of road and building layouts at different levels of granularity. The metrics developed are applied to a case study in which the derived properties reveal coherent as well as incoherent neighborhoods, illustrating the potential of the approach and the way buildings and roads can be relatively connected in a given urban environment. Overall, by integrating the parcel and building layouts, this approach complements previous related works that mainly retain the configurational structure of the urban network, as well as morphological studies whose focus is generally limited to the analysis of the building layout.
Simultaneous localization and mapping (SLAM) is widely used in many robot applications to acquire a map of the unknown environment and the robot's location. Graph-based SLAM has been demonstrated to be effective in large-scale scenarios, and it intuitively formulates SLAM as a pose graph. However, because of the high data overlap rate, traditional graph-based SLAM is not efficient in some respects, such as real-time performance and memory usage. To reduce the data overlap rate, a graph-based SLAM with a distributed submap strategy (DSS) is presented. In its front-end, submap-based scan matching is processed and loop-closing detection is conducted; in its back-end, the pose graph is updated for global optimization and submap merging. A series of experiments demonstrates that graph-based SLAM with DSS reduces the data overlap rate by 51.79%, runtime by 39.70%, and memory usage by 24.60%. Its advantages over other low-overlap-rate methods in runtime, memory usage, accuracy, and robustness are also proved.
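A pose graph reduces SLAM back-end optimization to minimizing the squared residuals of odometry and loop-closure constraints over the poses. The toy 1-D sketch below (an illustration of pose-graph optimization in general, not the paper's DSS pipeline; all names are hypothetical) shows how a loop closure redistributes accumulated odometry error:

```python
def pose_graph_optimize_1d(z_odo, z_loop, iters=2000, lr=0.1):
    # Poses x_0..x_n on a line, x_0 fixed to remove the gauge freedom.
    # Constraints: odometry x_{i+1} - x_i = z_odo[i]; loop closure x_n - x_0 = z_loop.
    n = len(z_odo)
    # Initialise by chaining the odometry measurements.
    x = [0.0]
    for z in z_odo:
        x.append(x[-1] + z)
    # Gradient descent on the sum of squared residuals.
    for _ in range(iters):
        g = [0.0] * (n + 1)
        for i, z in enumerate(z_odo):
            r = x[i + 1] - x[i] - z          # odometry residual
            g[i + 1] += r
            g[i] -= r
        r = x[n] - x[0] - z_loop             # loop-closure residual
        g[n] += r
        g[0] -= r
        for i in range(1, n + 1):            # x_0 stays fixed
            x[i] -= lr * g[i]
    return x
```

With odometry steps of 1.0 each but a loop closure reporting 2.4 instead of 3.0, the optimizer spreads the 0.6 discrepancy evenly across the chain, shrinking each step to 0.85.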
The number of botnet malware attacks on Internet devices has grown at an equivalent rate to the number of devices connected to the Internet. Bot detection using machine learning (ML) with flow-based features has been extensively studied in the literature. Existing flow-based detection methods involve significant computational overhead and do not completely capture the network communication patterns that might reveal other features of malicious hosts. Recently, graph-based bot detection methods using ML have gained attention as a way to overcome these limitations, as graphs provide a real representation of network communications. The purpose of this study is to build a botnet malware detection system utilizing centrality measures for graph-based botnet detection and ML. We propose BotSward, a graph-based bot detection system based on ML. We apply efficient centrality measures, namely Closeness Centrality (CC), Degree Centrality (DC), and PageRank (PR), and compare them with others used in the state of the art. The efficiency of the proposed method is verified on the publicly available Czech Technical University 13 dataset (CTU-13). The CTU-13 dataset contains 13 real botnet traffic scenarios connected to a command-and-control (C&C) channel and causing malicious actions such as phishing, distributed denial-of-service (DDoS) attacks, spam attacks, etc. BotSward is robust to zero-day attacks, suitable for large-scale datasets, and intended to produce better accuracy than state-of-the-art techniques. The proposed BotSward solution achieved 99% accuracy in botnet attack detection with a false positive rate as low as 0.0001%.
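For reference, the two simplest centrality measures named above can be computed directly from a communication graph's adjacency list; the plain-Python sketch below uses standard textbook definitions (the graph and function names are illustrative, not BotSward's implementation):

```python
from collections import deque

def degree_centrality(adj):
    # Fraction of the other nodes each node communicates with directly.
    n = len(adj)
    return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

def closeness_centrality(adj):
    # (reachable - 1) / sum of shortest-path distances, via BFS on the
    # unweighted graph; high for hosts that are "close" to everyone.
    out = {}
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total = sum(dist.values())
        out[src] = (len(dist) - 1) / total if total else 0.0
    return out
```

In a detection pipeline, per-host values like these become feature columns for the ML classifier; a C&C host often stands out with unusually high centrality.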
Maximizing network lifetime is regarded as the primary issue in Mobile Ad-hoc Networks (MANETs). In geographic-routing-based models, packet transmission is more appropriate in dense circumstances. Applying a heuristic model directly is not appropriate for offering an effectual solution, as the problem becomes NP-hard; therefore, investigators concentrate on meta-heuristic approaches. Dragonfly Optimization (DFO) is an effective meta-heuristic approach for resolving these problems by providing optimal solutions. However, meta-heuristic approaches such as DFO tend to converge slowly and require considerable computational time as the network size expands. Thus, DFO is adaptively improved as Adaptive Dragonfly Optimization (ADFO) to fit this model and re-formulated using graph-based m-connection establishment (G-mCE) to overcome DFO's computational-time and convergence problems, considerably enhancing its performance. In G-mCE, a Connectivity Zone (CZ) is chosen between source and destination, optimality is sought within these connected regions, and ADFO is used for effective route establishment in the CZ instead of the complete network model. To exploit the complementary features of ADFO and G-mCE, a hybridization of DFO and G-mCE is proposed for dense circumstances, reducing energy consumption and delay to enhance network lifetime. The simulation was performed in the MATLAB environment.
Active learning in semi-supervised classification involves introducing additional labels for unlabelled data to improve the accuracy of the underlying classifier. A challenge is to identify which points to label to best improve performance while limiting the number of new labels. "Model Change" active learning quantifies the change incurred in the classifier by introducing the additional label(s). We pair this idea with graph-based semi-supervised learning (SSL) methods that use the spectrum of the graph Laplacian matrix, which can be truncated to avoid prohibitively large computational and storage costs. We consider a family of convex loss functions for which the acquisition function can be efficiently approximated using the Laplace approximation of the posterior distribution. We show a variety of multiclass examples that illustrate improved performance over the prior state of the art.
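The spectral truncation mentioned above keeps only the eigenpairs of the graph Laplacian with the smallest eigenvalues, which carry the smooth structure relevant to SSL. The sketch below computes the eigendecomposition of a small symmetric matrix with Jacobi rotations and then truncates it; this is an illustrative stand-in for the large-scale eigensolvers such work typically relies on, and all names are hypothetical:

```python
import math

def jacobi_eig(A, sweeps=50):
    # Eigen-decomposition of a small symmetric matrix by cyclic Jacobi rotations.
    n = len(A)
    A = [row[:] for row in A]
    V = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for _ in range(sweeps):
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(A[p][q]) < 1e-12:
                    continue
                # Rotation angle that zeroes the (p, q) off-diagonal entry.
                theta = 0.5 * math.atan2(2 * A[p][q], A[q][q] - A[p][p])
                c, s = math.cos(theta), math.sin(theta)
                for k in range(n):        # rows: A <- R A
                    apk, aqk = A[p][k], A[q][k]
                    A[p][k] = c * apk - s * aqk
                    A[q][k] = s * apk + c * aqk
                for k in range(n):        # columns: A <- A R^T
                    akp, akq = A[k][p], A[k][q]
                    A[k][p] = c * akp - s * akq
                    A[k][q] = s * akp + c * akq
                for k in range(n):        # accumulate eigenvectors as columns of V
                    vkp, vkq = V[k][p], V[k][q]
                    V[k][p] = c * vkp - s * vkq
                    V[k][q] = s * vkp + c * vkq
    return [A[i][i] for i in range(n)], V

def truncated_spectrum(L, k):
    # Keep only the k smallest eigenpairs -- the truncation that bounds
    # computation and storage in graph-based SSL.
    vals, V = jacobi_eig(L)
    order = sorted(range(len(vals)), key=lambda i: vals[i])[:k]
    return [vals[i] for i in order], [[V[r][i] for i in order] for r in range(len(vals))]
```

On a path-graph Laplacian the retained low eigenvalues (0, 1, ...) correspond to the smoothest graph modes, which is exactly the subspace the classifier operates in.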
Many cutting-edge methods are now possible in real-time commercial settings and are growing in popularity on cloud platforms. By incorporating new, cutting-edge technologies to a larger extent without using more infrastructure, the information technology platform is anticipating a completely new level of development. This research paper proposes the following concepts: 1) a reliable authentication method; 2) optimised data replication; and 3) graph-based data encryption and packing colouring in Redundant Array of Independent Disks (RAID) storage. At the data centre, data is encrypted using crypto keys called key streams. These keys are produced using the packing colouring method on the jump graph of the web graph. To achieve space efficiency, replication is carried out on optimised servers employing packing colours. More connections are expected to provide better authentication. This study provides an innovative architecture with robust security, enhanced authentication, and low cost.
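A packing colouring assigns colour i so that any two vertices sharing colour i lie at pairwise distance greater than i. The greedy sketch below illustrates the constraint on a generic graph; the paper applies packing colouring to the jump graph of a web graph for key-stream generation, whereas this toy version and its names are assumptions for illustration only:

```python
from collections import deque

def bfs_dist(adj, src):
    # Unweighted shortest-path distances from src via breadth-first search.
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def greedy_packing_coloring(adj):
    # Packing colouring rule: two vertices with colour i must be at distance > i.
    # Greedy: give each vertex (in insertion order) the smallest feasible colour.
    dist = {v: bfs_dist(adj, v) for v in adj}
    color = {}
    for v in adj:
        c = 1
        while any(color.get(u) == c and dist[v].get(u, float("inf")) <= c
                  for u in color):
            c += 1
        color[v] = c
    return color
```

On a 4-vertex path the greedy pass yields colours 1, 2, 1, 3: the two colour-1 vertices sit at distance 2 (> 1), while colour 2 and colour 3 each appear once.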
Ants rank among the most ecologically dominant and evolutionarily remarkable insects on the planet, capturing the imagination of both curious children and thoughtful scholars alike. Aristotle, impressed by their division of labor and cooperative behavior, described them as "political animals". In Aesop's Fables, they are celebrated for their foresight and diligence in preparing for hardship. Traditional Chinese narratives similarly portray ants as modest creatures that, through collective effort, achieve extraordinary power and influence.
The ability of queens and males of most ant species to disperse by flight has fundamentally contributed to the group's evolutionary and ecological success and is a determining factor to take into account for biogeographic studies (Wagner and Liebherr 1992; Peeters and Ito 2001; Helms 2018).
Hierarchical Task Network (HTN) planning is a powerful technique in artificial intelligence for handling complex problems by decomposing them into hierarchical task structures. However, achieving optimal solutions in HTN planning remains a challenge, especially in scenarios where traditional search algorithms struggle to navigate the vast solution space efficiently. This research proposes a novel technique to enhance HTN planning by integrating the Ant Colony Optimization (ACO) algorithm into the refinement process. The Ant System algorithm, inspired by the foraging behavior of ants, is well suited to optimization problems because it explores solution spaces efficiently. By incorporating ACO into the refinement phase of HTN planning, the authors aim to leverage its adaptive nature and decentralized decision-making to improve plan generation. This paper develops a hybrid strategy called ACO-HTN, which combines HTN planning with ACO-based plan selection. This technique enables the system to adaptively refine plans by guiding the search towards optimal solutions. To evaluate the effectiveness of the proposed technique, empirical experiments are conducted on various domains and benchmark datasets. The results demonstrate that the ACO-HTN strategy enhances the efficiency and effectiveness of HTN planning, outperforming traditional methods in terms of solution quality and computational performance.
Ant colony optimization (ACO) is a random search algorithm based on probability calculation. However, its uninformed search strategy has a slow convergence speed. The Bayesian algorithm uses the historical information of already-searched points to determine the next search point, reducing the uncertainty in the random search process. Exploiting this ability to reduce uncertainty, a Bayesian ACO algorithm is proposed in this paper to increase the convergence speed of the conventional ACO algorithm for image edge detection. In addition, this paper introduces two innovations to the classical algorithm: first, random perturbations are added after the pheromone update; second, adaptive pheromone heuristics are used. Experimental results illustrate that the proposed Bayesian ACO algorithm has faster convergence and higher precision and recall than the traditional ant colony algorithm, owing to the improved pheromone utilization rate. Moreover, the Bayesian ACO algorithm outperforms the other comparative methods in the edge detection task.
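A standard ingredient of ACO-based edge detection is a per-pixel heuristic value derived from local intensity variation, which attracts ants toward likely edges. The generic sketch below uses simple forward differences and global normalisation; the exact neighbourhood and normalisation are assumptions for illustration, not the paper's adaptive pheromone heuristic:

```python
def edge_heuristic(img):
    # img: grayscale image as a list of rows of floats.
    # Heuristic eta[i][j]: local intensity variation around pixel (i, j),
    # computed from forward differences to the right and downward neighbours.
    h, w = len(img), len(img[0])
    eta = [[0.0] * w for _ in range(h)]
    vmax = 1e-12
    for i in range(h):
        for j in range(w):
            v = 0.0
            if j + 1 < w:
                v += abs(img[i][j + 1] - img[i][j])
            if i + 1 < h:
                v += abs(img[i + 1][j] - img[i][j])
            eta[i][j] = v
            vmax = max(vmax, v)
    # Normalise to [0, 1] so eta can be combined with pheromone values
    # in the ants' transition probabilities.
    return [[v / vmax for v in row] for row in eta]
```

On an image with a sharp vertical edge, the pixels just left of the edge receive the maximal heuristic value, so ants concentrate their moves (and pheromone) along it.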
This paper reviews the historical application, primary species, and efficacy of predatory ants in pest management, and systematically elucidates their core mechanisms, including direct predation, non-lethal deterrence, and ecological regulation, as well as two application models: forest land introduction and farmland conservation. Meanwhile, the negative impacts on other arthropods, population diffusion, and management challenges in current applications are analyzed. Future research directions, such as precise assessment and risk control, integrated pest management (IPM) strategies integrating multiple technologies, and applications of molecular ecology, are clarified. The aim is to provide a reference for the in-depth research, promotion, and application of this type of natural enemy insect in sustainable IPM of agriculture and forestry.
Funding: Supported by the Fundamental Research Funds for the Central Universities (13MS100), the Hebei Province Research Foundation of Natural Science (E2011502024), and the National Natural Science Foundation of China (51177046).
Funding: National Natural Science Foundation of China (No. 62372100).
Funding: Supported by the National Science and Technology Major Project of China (No. 2011ZX05029-003) and the CNPC Science Research and Technology Development Project, China (No. 2013D-0904).
Funding: The Project Fund for Key Discipline of the Shanghai Municipal Education Commission (No. J50104) and the Major State Basic Research Development Program of China (No. 2017YFB0403500).
Funding: Supported by the DOD National Defense Science and Engineering Graduate (NDSEG) Research Fellowship and by the NGA under Contract No. HM04762110003.
Funding: Supported by the National Natural Science Foundation of China (32388102 to G.Z., 32370668 to W.L.) and the Yunnan Provincial Science and Technology Department, Yunnan Fundamental Research Projects (202201AT070129 and 202401BC070017 to W.L.).
Funding: Funded by the "Departments of Excellence" program of the Italian Ministry for University and Research (MIUR, 2018-2022 and MUR, 2023-2027).
Funding: Supported by the Ministry of Science and Higher Education of the Russian Federation under grant 075-15-2022-1137 and by Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2025R323), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Funding: Supported by the National Natural Science Foundation of China (62276055).
Funding: Supported by the National Innovation and Entrepreneurship Training Program for College Students (202410580010); Key Projects of the Second Round Project of High-quality Development in Hundred Counties, Thousand Towns and Ten Thousand Villages for Rural Science and Technology Special Commissioners Dispatched by the Guangdong Provincial Department of Science and Technology (KTP20240684); the 2025 Science and Technology Innovation Guidance Project of Zhaoqing (241223100090425); and the Doctoral Scientific Research Initiation Fund Project of Zhaoqing University (611/230009).