Journal Articles
280,874 articles found
1. GSLDWOA: A Feature Selection Algorithm for Intrusion Detection Systems in IIoT
Authors: Wanwei Huang, Huicong Yu, Jiawei Ren, Kun Wang, Yanbu Guo, Lifeng Jin. Computers, Materials & Continua, 2026(1): 2006-2029 (24 pages).
Existing feature selection methods for intrusion detection systems (IDS) in the Industrial Internet of Things (IIoT) often suffer from local optimality and high computational complexity. These challenges hinder traditional IDS from effectively extracting features while maintaining detection accuracy. This paper proposes an IIoT intrusion detection feature selection algorithm based on an improved whale optimization algorithm (GSLDWOA). The aim is to address the problems to which feature selection algorithms are prone under high-dimensional data, such as local optimality, long detection time, and reduced accuracy. First, the initial population's diversity is increased using a Gaussian mutation mechanism. Then, a non-linear shrinking factor balances global exploration and local exploitation, avoiding premature convergence. Lastly, a variable-step Levy flight operator and a dynamic differential evolution strategy are introduced to improve the algorithm's search efficiency and convergence accuracy in high-dimensional feature space. Experiments on the NSL-KDD and WUSTL-IIoT-2021 datasets demonstrate that the feature subset selected by GSLDWOA significantly improves detection performance. Compared with the traditional WOA, the detection rate and F1-score increased by 3.68% and 4.12%, respectively. On the WUSTL-IIoT-2021 dataset, accuracy, recall, and F1-score all exceed 99.9%.
Keywords: Industrial Internet of Things; intrusion detection system; feature selection; whale optimization algorithm; Gaussian mutation
2. Research on the Application of Graph Spectra and the Kuhn-Munkres Algorithm to Graph Matching (cited 9 times)
Authors: 李昌华, 李智杰, 高阳. 《计算机工程与科学》 (CSCD, Peking University Core), 2017(10): 1896-1900 (5 pages).
To effectively match structured data in graph databases, a Kuhn-Munkres algorithm based on global structural similarity and node-position similarity is proposed. Global and node-position matrices are first constructed for the graph data: the global similarity matrix is built from the Laplacian spectral features of the adjacency matrix, while the position similarity matrix is built by first normalizing the relative positions of nodes with a Gaussian kernel function and then using its spectral features. Node-position similarity describes the relative positions among all nodes of a graph, compensating for the fact that global structural similarity only characterizes the overall structure. Finally, the Kuhn-Munkres algorithm is applied for graph matching, yielding a maximum-weight matching of the bipartite graph. Experiments show that the improved Kuhn-Munkres algorithm effectively raises the matching accuracy between nodes.
Keywords: Kuhn-Munkres algorithm; similarity matrix; bipartite graph; maximum-weight matching
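Several entries in this list rely on the core task of the Kuhn-Munkres (KM) algorithm: maximum-weight matching in a bipartite graph. As a minimal illustration, assuming a small made-up similarity matrix (not from any paper above), an exhaustive search over permutations recovers the matching KM would return; KM computes the same optimum in O(n^3), while this brute force is O(n!) and only practical for tiny inputs:

```python
from itertools import permutations

# Toy 3x3 node-similarity matrix: weight[i][j] is the similarity
# between node i of one graph and node j of the other (illustrative values).
weight = [
    [0.9, 0.2, 0.1],
    [0.3, 0.8, 0.4],
    [0.2, 0.5, 0.7],
]

def max_weight_matching(w):
    """Exhaustively find the assignment maximizing total weight.

    This brute force defines the problem KM solves; it is not the
    KM algorithm itself.
    """
    n = len(w)
    best_total, best_perm = float("-inf"), None
    for perm in permutations(range(n)):
        total = sum(w[i][perm[i]] for i in range(n))
        if total > best_total:
            best_total, best_perm = total, perm
    return best_total, best_perm

total, assignment = max_weight_matching(weight)
print(assignment)  # prints: (0, 1, 2) -- the diagonal matching wins here
```

In practice, `scipy.optimize.linear_sum_assignment` provides an efficient solver for the same assignment problem.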
3. Multi-sensor Tracking Resource Allocation Based on the Modified Riccati Equation and the Kuhn-Munkres Algorithm (cited 10 times)
Authors: 童俊, 单甘霖. 《控制与决策》 (EI, CSCD, Peking University Core), 2012(5): 747-751 (5 pages).
Multi-sensor management is the process of automatically or semi-automatically controlling a group of sensors or measurement devices to optimize overall performance and use resources effectively. On the basis of a general mathematical model for sensor resource allocation in multi-sensor management, this paper studies multi-sensor tracking resource allocation that combines the modified Riccati equation with the Kuhn-Munkres algorithm, and gives the solution procedure for the optimal target-sensor assignment. Simulation results demonstrate the feasibility of the method.
Keywords: multi-sensor management; multi-sensor tracking resource allocation; modified Riccati equation; value function; Kuhn-Munkres algorithm
4. An Extended Kuhn-Munkres Algorithm for Constrained Matching Search (cited 5 times)
Authors: 王方洋, 刘玉铭. 《北京师范大学学报(自然科学版)》 (CAS, CSCD, Peking University Core), 2021(2): 167-172 (6 pages).
An extended Kuhn-Munkres algorithm is proposed to solve the existence problem of partial matchings with a lower-bound constraint, i.e., to search a given subset of the full matching set for a bipartite matching whose edge-weight sum exceeds a given threshold. The extended algorithm builds a search tree whose nodes are intermediate states of the Kuhn-Munkres algorithm and, by exploiting search priorities and pruning, reduces the time complexity to a polynomial function of the size of the difference between the full matching set and the given subset.
Keywords: bipartite graph; optimal matching; Kuhn-Munkres algorithm
5. A Kuhn-Munkres Matching-Based Deployment Optimization Method for Micro-Nano Satellite Constellations (cited 2 times)
Authors: 刘思阳, 蒙涛, 雷家坤, 金仲和. 《宇航学报》 (CSCD, Peking University Core), 2021(7): 895-906 (12 pages).
For the constellation deployment problem of placing a group of low-orbit micro-nano satellites with different semi-major axes, right ascensions of ascending node (RAAN), and arguments of latitude into different target phases on the same orbital plane, a constellation deployment optimization method based on Kuhn-Munkres (KM) matching is proposed. The KM algorithm optimally matches satellites to target arguments of latitude and fully exploits the J2 perturbation so that the RAAN is synchronously corrected through the deployment of the semi-major axis and argument of latitude, thereby saving fuel. Simulation results show that, under the same constraints, the optimized method reduces the average fuel consumption of each satellite and improves the balance of fuel consumption among satellites compared with traditional deployment methods. It remedies the shortcomings of traditional same-orbit constellation deployment, which simplifies each satellite's initial position to a single point in space and ignores RAAN drift during deployment. Orbital maneuvers are performed with finite constant thrust, making the method suitable for micro-nano satellites equipped with micro-thrust propulsion systems.
Keywords: micro-nano satellite constellation; constellation deployment; phase separation; J2 perturbation; Kuhn-Munkres (KM) algorithm; fuel optimization
6. A Dynamic Pickup-Point Recommendation Algorithm Based on Multimodal Deep Forest and Iterative Kuhn-Munkres (cited 1 time)
Authors: 郭羽含, 朱茹施. 《计算机应用研究》 (CSCD, Peking University Core), 2024(12): 3634-3644 (11 pages).
To address the bottlenecks of existing dynamic pickup-point assignment models in global optimality and solution efficiency on large-scale instances, a model is built on four key factors: passenger walking distance, passenger walking time, road conditions at the pickup point, and the cost of reaching the passenger's destination. A multimodal deep-forest-based dynamic pickup-point prediction algorithm and an iterative Kuhn-Munkres pickup-point assignment algorithm are proposed. The prediction algorithm fuses multimodal decision-tree structures with deep learning to improve prediction accuracy; the assignment algorithm adaptively adjusts edge weights through a multi-scenario mechanism and selects optimal augmenting edges to obtain the optimal assignment of all passengers to pickup points. Experiments show that, compared with other mainstream prediction models, the prediction algorithm lowers mean absolute error by 2.705 and mean squared error by 5.915, and raises the coefficient of determination by 0.214 and explained variance by 0.195; under passenger-dominant conditions, the assignment algorithm improves average dispatch performance by 2.04% over the other schemes tested. The prediction and assignment algorithms are thus highly practical, and the assignment algorithm has a clear advantage on large-scale instances.
Keywords: pickup-point recommendation; multimodal deep forest; iterative Kuhn-Munkres algorithm; ride-hailing; urban transportation
7. Dynamic Spectrum Allocation Guaranteeing Cognitive-User QoS Based on the Kuhn-Munkres Algorithm (cited 4 times)
Authors: 叶培青, 李莉, 周小平, 陈小丹. 《上海师范大学学报(自然科学版)》, 2013(2): 137-142 (6 pages).
This work uses graph theory to solve the dynamic spectrum allocation (DSA) problem in cognitive radio networks. First, priorities are assigned to cognitive users and idle channels according to each user's quality of service (QoS) and the state of each channel. A new formula is then proposed to estimate the bandwidth benefit a cognitive user can obtain from a channel. Finally, the prioritized users and channels form a bipartite graph with bandwidth benefit as edge weights, and the Kuhn-Munkres algorithm assigns channels to users while jointly considering bandwidth benefit and spectrum utilization. Simulation results show that the algorithm optimizes bandwidth benefit and spectrum utilization simultaneously and also achieves good QoS in terms of the time users wait for channel assignment.
Keywords: dynamic spectrum allocation; Kuhn-Munkres algorithm; priority
8. Track Association of Shipborne HF Surface-Wave Radar and AIS Vessel Targets Based on the Kuhn-Munkres Algorithm
Authors: 吴孝贤, 纪永刚. 《中国海洋大学学报(自然科学版)》 (CAS, CSCD, Peking University Core), 2024(11): 142-150 (9 pages).
Shipborne high-frequency surface-wave radar can exploit the mobility of its platform to extend the detection area; associating its tracks with AIS vessel tracks supports studies such as detection-performance evaluation and multi-sensor fusion detection of vessel targets. Nearest-neighbor association in a fixed coordinate system judges track association only by distance at a single instant and is prone to association errors when the platform is moving, degrading continuous target tracking. To address the low association accuracy caused by ignoring platform motion, this paper establishes a moving coordinate system centered on the shipborne platform, constructs a track similarity matrix from track parameters via statistical analysis of measured data, and applies the Kuhn-Munkres (KM) algorithm to shipborne surface-wave radar track association for the first time. By iteratively updating association weights, the method obtains the maximum-weight matching between tracks and thus a globally optimal association result, effectively overcoming the drawback of local association based on distance at a single instant. Validation with simulated and measured shipborne radar data shows that KM-based association in the moving coordinate system associates radar and AIS vessel tracks effectively and is more accurate than the traditional nearest-neighbor algorithm.
Keywords: track association; high-frequency surface-wave radar; moving coordinate system; Kuhn-Munkres algorithm
9. Method for Estimating the State of Health of Lithium-ion Batteries Based on Differential Thermal Voltammetry and Sparrow Search Algorithm-Elman Neural Network (cited 1 time)
Authors: Yu Zhang, Daoyu Zhang, Tiezhou Wu. Energy Engineering (EI), 2025(1): 203-220 (18 pages).
Precisely estimating the state of health (SOH) of lithium-ion batteries is essential for battery management systems (BMS), as it plays a key role in ensuring the safe and reliable operation of battery systems. However, current SOH estimation methods often overlook the valuable temperature information that can effectively characterize battery aging during capacity degradation. Additionally, the Elman neural network, which is commonly employed for SOH estimation, exhibits several drawbacks, including slow training speed, a tendency to become trapped in local minima, and the initialization of weights and thresholds with pseudo-random numbers, leading to unstable model performance. To address these issues, this study proposes a method for estimating the SOH of lithium-ion batteries based on differential thermal voltammetry (DTV) and an SSA-Elman neural network. First, two health features (HFs) considering temperature factors and battery voltage are extracted from the differential thermal voltammetry curves and incremental capacity curves. Next, the Sparrow Search Algorithm (SSA) is employed to optimize the initial weights and thresholds of the Elman neural network, forming the SSA-Elman model. To validate the performance, various neural networks, including the proposed SSA-Elman network, are tested on the Oxford battery aging dataset. The experimental results demonstrate that the method achieves superior accuracy and robustness, with a mean absolute error (MAE) below 0.9% and a root mean square error (RMSE) below 1.4%.
Keywords: lithium-ion battery; state of health; differential thermal voltammetry; Sparrow Search Algorithm
10. Robustness Optimization Algorithm with Multi-Granularity Integration for Scale-Free Networks Against Malicious Attacks (cited 1 time)
Authors: ZHANG Yiheng, LI Jinhai. 《昆明理工大学学报(自然科学版)》 (Peking University Core), 2025(1): 54-71 (18 pages).
Complex network models are frequently employed to simulate and study diverse real-world complex systems. Among these models, scale-free networks typically exhibit greater fragility to malicious attacks, so enhancing their robustness has become a pressing issue. To address this problem, this paper proposes a Multi-Granularity Integration Algorithm (MGIA), which aims to improve the robustness of scale-free networks while keeping the initial degree of each node unchanged, ensuring network connectivity, and avoiding the generation of multiple edges. The algorithm generates a multi-granularity structure from the initial network to be optimized, applies different optimization strategies to the networks at the various granular layers of this structure, and finally realizes information exchange between layers, further enhancing the optimization effect. New network refresh, crossover, and mutation operators ensure that the optimized network satisfies the given constraints, and new network similarity and dissimilarity evaluation metrics improve the effectiveness of the optimization operators. In the experiments, MGIA enhances the robustness of the scale-free network by 67.6%, approximately 17.2% higher than the optimization effects achieved by eight existing complex-network robustness optimization algorithms.
Keywords: complex network model; multi-granularity; scale-free networks; robustness; algorithm integration
11. Short-Term Wind Power Forecast Based on STL-IAOA-iTransformer Algorithm: A Case Study in Northwest China (cited 2 times)
Authors: Zhaowei Yang, Bo Yang, Wenqi Liu, Miwei Li, Jiarong Wang, Lin Jiang, Yiyan Sang, Zhenning Pan. Energy Engineering, 2025(2): 405-430 (26 pages).
An accurate short-term wind power forecast technique plays a crucial role in maintaining the safety and economic efficiency of smart grids. Although numerous studies have employed various methods to forecast wind power, there remains a research gap in leveraging swarm intelligence algorithms to optimize the hyperparameters of the Transformer model for wind power prediction. To improve the accuracy of short-term wind power forecasts, this paper proposes a hybrid approach named STL-IAOA-iTransformer, based on seasonal and trend decomposition using LOESS (STL) and an iTransformer model optimized by an improved arithmetic optimization algorithm (IAOA). First, to fully extract the power data features, STL decomposes the original data into components with less redundant information. The extracted components, together with weather data, are then fed into iTransformer for short-term wind power forecasting, and the final predicted power curve is obtained by combining the predicted components. To improve model accuracy, IAOA is employed to optimize the hyperparameters of iTransformer. The proposed approach is validated using real generation data from different seasons and different power stations in Northwest China, and ablation experiments have been conducted. Furthermore, to validate its superiority under different wind characteristics, real power generation data from Southwest China are also used. Comparative results against six state-of-the-art prediction models show that the proposed model fits the true generation series well and achieves high prediction accuracy.
Keywords: short-term wind power forecast; improved arithmetic optimization algorithm; iTransformer algorithm; SimuNPS
12. A LODBO Algorithm for Multi-UAV Search and Rescue Path Planning in Disaster Areas (cited 1 time)
Authors: Liman Yang, Xiangyu Zhang, Zhiping Li, Lei Li, Yan Shi. Chinese Journal of Aeronautics, 2025(2): 200-213 (14 pages).
In disaster relief operations, multiple UAVs can be used to search for trapped people. In recent years, researchers have proposed machine learning-based, sampling-based, and heuristic algorithms to solve the multi-UAV path planning problem. Among these, the Dung Beetle Optimization (DBO) algorithm has been widely applied owing to its diverse search patterns. However, the update strategies for the rolling and thieving dung beetles in DBO are overly simplistic, potentially leaving the search space under-explored and the algorithm prone to local optima, so the discovery of the optimal path is not guaranteed. To address these issues, we propose an improved DBO algorithm guided by the landmark operator (LODBO). Specifically, we first use tent mapping to initialize the population, generating initial solutions with greater diversity within the search space. Second, we expand the search range of the rolling dung beetle using the landmark factor. Finally, an adaptive factor that changes with the number of iterations improves the global search ability of the thieving dung beetle, making it more likely to escape local optima. Extensive simulation experiments show that LODBO obtains the optimal path in the shortest time compared with the Genetic Algorithm (GA), the Grey Wolf Optimizer (GWO), the Whale Optimization Algorithm (WOA), and the original DBO on the disaster search and rescue task set.
Keywords: unmanned aerial vehicle; path planning; metaheuristic algorithm; DBO algorithm; NP-hard problems
13. Research on Euclidean Algorithm and Reflection on Its Teaching
Authors: ZHANG Shaohua. 《应用数学》 (Peking University Core), 2025(1): 308-310 (3 pages).
In this paper, we prove that Euclid's algorithm, Bezout's equation, and the division algorithm are equivalent to each other. Our result shows that Euclid had preliminarily established the theory of divisibility and the greatest common divisor. We further provide several suggestions for teaching.
Keywords: Euclid's algorithm; division algorithm; Bezout's equation
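The equivalence this entry discusses can be illustrated concretely: running Euclid's algorithm while tracking the quotients from the division algorithm yields the Bezout coefficients by back-substitution. A minimal sketch of this standard textbook construction (not code from the paper):

```python
def extended_gcd(a, b):
    """Return (g, x, y) with g = gcd(a, b) and a*x + b*y == g.

    Each recursive step applies the division algorithm (a = q*b + r);
    back-substitution turns the quotients into Bezout's equation,
    showing how the three notions interlock.
    """
    if b == 0:
        return a, 1, 0
    q, r = divmod(a, b)
    g, x, y = extended_gcd(b, r)
    # g = b*x + r*y and r = a - q*b  =>  g = a*y + b*(x - q*y)
    return g, y, x - q * y

g, x, y = extended_gcd(240, 46)
print(g, x, y)  # prints: 2 -9 47, since 240*(-9) + 46*47 == 2
```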
14. DDoS Attack Autonomous Detection Model Based on Multi-Strategy Integrated Zebra Optimization Algorithm
Authors: Chunhui Li, Xiaoying Wang, Qingjie Zhang, Jiaye Liang, Aijing Zhang. Computers, Materials & Continua (SCIE, EI), 2025(1): 645-674 (30 pages).
Previous studies have shown that deep learning is very effective in detecting known attacks. However, when facing unknown attacks, models such as Deep Neural Networks (DNN) combined with Long Short-Term Memory (LSTM), or Convolutional Neural Networks (CNN) combined with LSTM, are built by simple stacking and suffer from feature loss, low efficiency, and low accuracy. This paper therefore proposes an autonomous detection model for Distributed Denial of Service attacks, Multi-Scale Convolutional Neural Network-Bidirectional Gated Recurrent Units-Single Headed Attention (MSCNN-BiGRU-SHA), based on a Multi-strategy Integrated Zebra Optimization Algorithm (MI-ZOA). The model is trained and tested on the CICDDoS2019 dataset, and its performance is further evaluated on a new GINKS2023 dataset. The hyperparameters Conv_filter and GRU_unit are optimized using MI-ZOA. Experimental results show that the test accuracy of the MI-ZOA-based MSCNN-BiGRU-SHA model reaches 0.9971 on CICDDoS2019 and 0.9386 on the newly created GINKS2023 dataset. Compared with the MSCNN-BiGRU-SHA model based on the plain Zebra Optimization Algorithm (ZOA), detection accuracy on GINKS2023 improves by 5.81%, precision by 1.35%, recall by 9%, and F1 score by 5.55%. Compared with MSCNN-BiGRU-SHA models tuned by grid search, random search, and Bayesian optimization, the MI-ZOA-optimized model performs better in terms of accuracy, precision, recall, and F1 score.
Keywords: distributed denial of service attack; intrusion detection; deep learning; zebra optimization algorithm; multi-strategy integrated zebra optimization algorithm
15. Bearing Capacity Prediction of Open Caissons in Two-Layered Clays Using Five Tree-Based Machine Learning Algorithms (cited 1 time)
Authors: Rungroad Suppakul, Kongtawan Sangjinda, Wittaya Jitchaijaroen, Natakorn Phuksuksakul, Suraparb Keawsawasvong, Peem Nuaklong. Intelligent Geoengineering, 2025(2): 55-65 (11 pages).
Open caissons are widely used in foundation engineering because of their load-bearing efficiency and adaptability to diverse soil conditions. However, accurately predicting their undrained bearing capacity in layered soils remains a complex challenge. This study presents a novel application of five ensemble machine learning (ML) algorithms, namely random forest (RF), gradient boosting machine (GBM), extreme gradient boosting (XGBoost), adaptive boosting (AdaBoost), and categorical boosting (CatBoost), to predict the undrained bearing capacity factor (Nc) of circular open caissons embedded in two-layered clay on the basis of results from finite element limit analysis (FELA). The input dataset consists of 1188 numerical simulations using the Tresca failure criterion, varying geometrical and soil parameters. The FELA was performed with OptumG2 software using adaptive meshing techniques and verified against existing benchmark studies. The ML models were trained on 70% of the dataset and tested on the remaining 30%. Their performance was evaluated using six statistical metrics: coefficient of determination (R²), mean absolute error (MAE), root mean squared error (RMSE), index of scatter (IOS), RMSE-to-standard-deviation ratio (RSR), and variance explained factor (VAF). The results indicate that all the models achieved high accuracy, with R² values exceeding 97.6% and RMSE values below 0.02. Among them, AdaBoost and CatBoost consistently outperformed the other methods on both the training and testing datasets, demonstrating superior generalizability and robustness. The proposed ML framework offers an efficient, accurate, data-driven alternative to traditional methods for estimating caisson capacity in stratified soils and can help reduce computational costs while improving reliability in the early stages of foundation design.
Keywords: two-layered clay; open caisson; tree-based algorithms; FELA; machine learning
16. Path Planning for Thermal Power Plant Fan Inspection Robot Based on Improved A* Algorithm (cited 1 time)
Authors: Wei Zhang, Tingfeng Zhang. Journal of Electronic Research and Application, 2025(1): 233-239 (7 pages).
To improve the efficiency and accuracy of path planning for fan inspection tasks in thermal power plants, this paper proposes an intelligent inspection robot path planning scheme based on an improved A* algorithm. The inspection robot uses multiple sensors to monitor key parameters of the fans, such as vibration, noise, and bearing temperature, and uploads the data to the monitoring center. The robot's inspection path employs the improved A* algorithm, which incorporates obstacle penalty terms, path reconstruction, and smoothing optimization, thereby achieving optimal path planning in complex environments. Simulation results demonstrate that the improved A* algorithm significantly outperforms the traditional A* algorithm in total path distance, smoothness, and detour rate, effectively improving the execution efficiency of inspection tasks.
Keywords: power plant fans; inspection robot; path planning; improved A* algorithm
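The abstract above describes penalty terms and smoothing added on top of A*. The baseline A* search it builds on can be sketched on a small occupancy grid; this is a generic textbook version with a Manhattan heuristic and an assumed toy grid, not the paper's improved variant (its penalty and smoothing terms are not reproduced):

```python
import heapq

def astar(grid, start, goal):
    """Plain A* on a 4-connected grid; cells marked 1 are obstacles.

    Returns the list of cells from start to goal, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # Manhattan-distance heuristic (admissible for unit moves)
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(h(start), 0, start)]  # entries: (f = g + h, g, cell)
    came_from, g_cost = {}, {start: 0}
    while open_heap:
        _, g, cur = heapq.heappop(open_heap)
        if cur == goal:
            path = [cur]
            while cur in came_from:  # walk parents back to start
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        if g > g_cost[cur]:
            continue  # stale heap entry
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    came_from[(nr, nc)] = cur
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None

grid = [
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
]
path = astar(grid, (0, 0), (2, 0))
print(len(path) - 1)  # prints: 6 -- moves on the shortest path around the wall
```

The paper's obstacle penalty terms would enter through the edge cost `ng`, and smoothing would post-process the returned path.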
17. An Algorithm for Cloud-based Web Service Combination Optimization Through Plant Growth Simulation
Authors: Li Qiang, Qin Huawei, Qiao Bingqin, Wu Ruifang. 《系统仿真学报》 (Peking University Core), 2025(2): 462-473 (12 pages).
To improve the efficiency of cloud-based web services, an improved plant growth simulation algorithm scheduling model is proposed. The model first uses mathematical methods to describe the relationships between cloud-based web services and the constraints on system resources. A light-induced plant growth simulation algorithm is then established; its performance is compared across several plant types, and the best plant model is selected as the system setting. Experimental results show that when the number of tested cloud-based web services reaches 2048, the model is 2.14 times faster than PSO, 2.8 times faster than the ant colony algorithm, 2.9 times faster than the bee colony algorithm, and a remarkable 8.38 times faster than the genetic algorithm.
Keywords: cloud-based service; scheduling algorithm; resource constraint; load optimization; cloud computing; plant growth simulation algorithm
18. Improved Algorithm of Multi-Mainlobe Interference Suppression under Uncorrelated and Coherent Conditions (cited 1 time)
Authors: CAI Miaohong, CHENG Qiang, MENG Jinli, ZHAO Dehua. Journal of Southeast University (English Edition), 2025(1): 84-90 (7 pages).
A new method based on the iterative adaptive algorithm (IAA) and blocking matrix preprocessing (BMP) is proposed for the suppression of multi-mainlobe interference. The algorithm precisely estimates the spatial spectrum and the directions of arrival (DOA) of interferences, overcoming the drawbacks of conventional adaptive beamforming (ABF) methods. Mainlobe interferences are identified by calculating the correlation coefficients between direction steering vectors (SVs) and are rejected by the BMP pretreatment. IAA is subsequently employed to reconstruct a sidelobe interference-plus-noise covariance matrix for improved ABF and residual interference suppression. Simulation results demonstrate the superiority of the proposed method over standard methods based on BMP and eigen-projection matrix preprocessing (EMP) under both uncorrelated and coherent conditions.
Keywords: mainlobe interference suppression; adaptive beamforming; spatial spectrum estimation; iterative adaptive algorithm; blocking matrix preprocessing
19. Intelligent Sequential Multi-Impulse Collision Avoidance Method for Non-Cooperative Spacecraft Based on an Improved Search Tree Algorithm (cited 1 time)
Authors: Xuyang CAO, Xin NING, Zheng WANG, Suyi LIU, Fei CHENG, Wenlong LI, Xiaobin LIAN. Chinese Journal of Aeronautics, 2025(4): 378-393 (16 pages).
The problem of collision avoidance for non-cooperative targets has received significant attention from researchers in recent years. Non-cooperative targets exhibit uncertain states and unpredictable behaviors, making collision avoidance significantly more challenging than for space debris. Much existing research focuses on the continuous-thrust model, whereas the impulsive maneuver model is more appropriate for long-duration, long-distance avoidance missions. It is also important to minimize the impact on the original mission while avoiding non-cooperative targets. Moreover, existing avoidance algorithms are computationally complex and time-consuming, especially given the limited computing capability of on-board computers, which poses challenges for practical engineering applications. To overcome these difficulties, this paper makes the following key contributions: (A) a turn-based (sequential decision-making), limited-area, impulsive collision avoidance model that accounts for the time delay of precision orbit determination is established for the first time; (B) a novel Selection Probability Learning Adaptive Search-depth Search Tree (SPL-ASST) algorithm is proposed for non-cooperative target avoidance, improving decision-making efficiency by introducing an adaptive search-depth mechanism and a neural network into the traditional Monte Carlo Tree Search (MCTS). Numerical simulations confirm the effectiveness and efficiency of the proposed method.
Keywords: non-cooperative target; collision avoidance; limited motion area; impulsive maneuver model; search tree algorithm; neural networks
20. A Class of Parallel Algorithms for Solving Low-rank Tensor Completion
Authors: LIU Tingyan, WEN Ruiping. 《应用数学》 (Peking University Core), 2025(4): 1134-1144 (11 pages).
In this paper, we establish a class of parallel algorithms for solving the low-rank tensor completion problem. The main idea is that N singular value decompositions are carried out on N different processors, one for each slice matrix under the unfold operator, and the fold operator then forms the next iteration tensor, so that computing time is reduced. In theory, we analyze the global convergence of the algorithm. In numerical experiments on simulated data and real image inpainting, the parallel algorithm outperforms its original counterpart in CPU time at the same precision.
Keywords: tensor completion; low-rank; convergence; parallel algorithm