Journal Articles
280,359 articles found
1. An anti-cloud-and-fog interference method for multi-quadrant circumferential-scanning laser fuzes based on the PSAF-LMS algorithm
Authors: 查冰婷, 徐光博, 秦建新, 张合. 《兵工学报》 (Acta Armamentarii), 北大核心, 2025, Issue 2, pp. 275-286
The cloud-and-fog backscatter signal superimposed on the target echo is a major factor degrading the ranging accuracy of the circumferential-scanning laser fuze of air-to-air missiles. To address the poor adaptability and low processing efficiency of existing anti-cloud-interference methods, a Pauseable Spline Adaptive Filter-Least Mean Square (PSAF-LMS) algorithm is proposed, together with a joint implementation scheme on a Field-Programmable Gate Array (FPGA) and an ARM processor. The PSAF-LMS algorithm effectively reduces the filter's steady-state error and improves the fuze's timing-discrimination accuracy and anti-interference capability. Simulations were performed with target echo signals at different signal-to-noise ratios, and filtering experiments were conducted in a simulated cloud-and-fog environment. The results show that the proposed algorithm filters out backscatter within 34.85 μs while preserving the original trend of the target echo peak, raising the signal-to-noise ratio by an average of more than 25.15 dB.
Keywords: laser fuze; backscatter; adaptive filtering; spline least mean square algorithm
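The PSAF-LMS filter described above is a spline (nonlinear) LMS variant with a pause module; the adaptation loop it builds on is the classical LMS filter. A minimal sketch of that baseline (tap count and step size are illustrative, not the paper's values):

```python
def lms_filter(d, x, n_taps=4, mu=0.05):
    """Classical LMS adaptive filter.

    d -- desired (noisy) samples; x -- reference input samples.
    Returns (errors, weights): the error e[n] = d[n] - w.u[n] is the
    filter output, and the weights adapt to minimize the mean square error.
    """
    w = [0.0] * n_taps
    errors = []
    for n in range(len(d)):
        # most recent n_taps reference samples, zero-padded at the start
        u = [x[n - k] if n - k >= 0 else 0.0 for k in range(n_taps)]
        y = sum(wk * uk for wk, uk in zip(w, u))
        e = d[n] - y
        # LMS update: w <- w + mu * e * u
        w = [wk + mu * e * uk for wk, uk in zip(w, u)]
        errors.append(e)
    return errors, w
```

The paper's spline variant replaces the linear combiner with an adaptive spline nonlinearity and pauses adaptation around the target peak; only the general update-loop shape carries over from this baseline.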
2. A GWO-LMS-RSSD method for coupled-fault separation and feature enhancement in rotating machinery
Authors: 许文, 施卫华, 李红钢, 华如南, 刘厚林, 董亮. 《机电工程》 (Journal of Mechanical & Electrical Engineering), 北大核心, 2025, Issue 4, pp. 677-685
In coupled faults of rotating machinery, weaker faults are easily masked by stronger ones and by heavy noise. To address this, a coupled-fault feature separation and enhancement method is proposed that combines an adaptive-filtering least mean square (LMS) algorithm tuned by the Grey Wolf Optimizer (GWO) with resonance-based sparse signal decomposition (RSSD). First, the adaptive LMS filter is applied to the coupled-fault signal to give a preliminary enhancement of the fault features. Then, exploiting the different resonance properties of the coupled faults, RSSD decomposes the signal into high-resonance and low-resonance components, completing the fault separation. In particular, since the LMS parameters traditionally depend on manual experience and adapt poorly, a GWO-based parameter-optimization scheme is designed with an objective built from the signal-to-noise ratio and the mean square error. Finally, envelope demodulation is applied to the sparsely decomposed signals to complete the separation and feature enhancement, and the method is validated on both simulated and experimental signals. The results show that GWO-LMS-RSSD effectively suppresses noise, separates coupled faults of rotating machinery, and enhances their features, offering a new approach to feature separation and enhancement of coupled faults under strong noise.
Keywords: coupled fault diagnosis; rotating machinery; resonance-based sparse decomposition; adaptive-filtering LMS algorithm; Grey Wolf Optimizer; signal-to-noise ratio; mean square error
3. Underwater four-quadrant dual-beam circumferential scanning laser fuze using nonlinear adaptive backscatter filter based on pauseable SAF-LMS algorithm (cited by 3)
Authors: Guangbo Xu, Bingting Zha, Hailu Yuan, Zhen Zheng, He Zhang. 《Defence Technology (防务技术)》, SCIE/EI/CAS/CSCD, 2024, Issue 7, pp. 1-13
The phenomenon of a target echo peak overlapping with the backscattered echo peak significantly undermines the detection range and precision of underwater laser fuzes. To overcome this issue, we propose a four-quadrant dual-beam circumferential scanning laser fuze to distinguish various interference signals and provide more real-time data for the backscatter filtering algorithm, which enhances the algorithm loading capability of the fuze. To address the insufficient filtering capacity of existing linear backscatter filtering algorithms, we develop a nonlinear backscatter adaptive filter based on the spline adaptive filter least mean square (SAF-LMS) algorithm. We also design an algorithm pause module to retain the original trend of the target echo peak, improving the time discrimination accuracy and anti-interference capability of the fuze. Finally, experiments are conducted with varying signal-to-noise ratios of the original underwater target echo signals. The experimental results show that the average signal-to-noise ratio before and after filtering can be improved by more than 31 dB, with an increase of up to 76% in extreme detection distance.
Keywords: laser fuze; underwater laser detection; backscatter adaptive filter; spline least mean square algorithm; nonlinear filtering algorithm
4. Design of a CTLE with the SS-LMS adaptive equalization algorithm
Authors: 唐明华, 尤浩龙, 李刚, 赵珍阳, 陈建军. 《国防科技大学学报》 (Journal of National University of Defense Technology), 北大核心, 2025, Issue 1, pp. 190-197
As process technology advances, equalizers need higher compensation and lower power consumption to keep high-speed data transmission reliable and efficient. Based on a 12 nm CMOS process, a high-gain, low-power adaptive continuous-time linear equalizer (CTLE) is designed. The equalizer uses a two-stage cascaded structure to compensate for channel attenuation and improve the quality of the received signal. In addition, the adaptation module adopts the sign-sign least mean square (SS-LMS) algorithm to speed up the convergence of the tap coefficients. Simulation results show that at a data rate of 16 Gbit/s the equalizer compensates for -15.53 dB of channel attenuation at the half baud rate, the equalizer coefficients converge within 16×10^4 unit intervals, and the post-convergence bit error rate is below 10^-12.
Keywords: continuous-time linear equalizer; adaptation; sign-sign least mean square algorithm
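The sign-sign LMS rule referenced above replaces both the error and the data in the ordinary LMS update with their signs, so each tap update reduces to adding or subtracting a fixed step, which is what makes it cheap in hardware. A one-step sketch (step size illustrative):

```python
def sign(v):
    """Return -1, 0, or 1 according to the sign of v."""
    return (v > 0) - (v < 0)


def ss_lms_step(w, u, d, mu=0.01):
    """One sign-sign LMS update: w_k <- w_k + mu * sign(e) * sign(u_k),
    where e = d - w.u is the equalizer error at this symbol."""
    y = sum(wk * uk for wk, uk in zip(w, u))
    e = d - y
    w = [wk + mu * sign(e) * sign(uk) for wk, uk in zip(w, u)]
    return w, e
```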
5. An electric actuator system for helicopter active vibration suppression based on the x-LMS algorithm
Authors: 郝振洋, 杨健, 张嘉文, 曹鑫, 王涛. 《Transactions of Nanjing University of Aeronautics and Astronautics》, EI CSCD, 2024, Issue 1, pp. 53-65
During forward flight, the aerodynamic environment causes instantaneous asymmetry in the aerodynamic loads on blades at different azimuth angles; transmitted through the supporting structure, this produces large low-frequency vibration in the fuselage. To cancel vibratory forces whose amplitudes vary in multiple directions, an active vibration-suppression electric actuator system based on the x-LMS algorithm is designed on the principle of active structural response control, and vibration-reduction experiments are carried out. First, a scheme in which the two motors inside a single actuator rotate in the same direction is selected by comparison, and a mathematical model of the output force of two actuators used in combination is derived. Second, a cross-coupled control strategy based on the load phase difference is adopted to design the system control diagram. For the coupled outer phase loop, the return-difference-matrix eigenvalue method determines the parameter range satisfying the stability-margin requirement, and the optimum within this stable region is then selected according to the sensitivity function and input-tracking performance. Next, a helicopter active vibration control system based on the x-LMS algorithm is proposed and its vibration-reduction effect is verified by simulation. Finally, a prototype is built, and dynamic, steady-state, and vibration-reduction experiments confirm the system's practical effectiveness.
Keywords: active structural response control; vibration-suppression electric actuator; return-difference-matrix eigenvalue method; x-LMS algorithm
6. Raman spectroscopy de-noising based on EEMD combined with VS-LMS algorithm (cited by 3)
Authors: 俞潇, 许亮, 莫家庆, 吕小毅. 《Optoelectronics Letters》, EI, 2016, Issue 1, pp. 16-19
This paper proposes a novel de-noising algorithm based on ensemble empirical mode decomposition (EEMD) and the variable step size least mean square (VS-LMS) adaptive filter. The noise in the high-frequency part of the spectrum is removed by EEMD, and the VS-LMS algorithm is then applied for overall de-noising. EEMD combined with VS-LMS not only preserves the detail and envelope of the effective signal but also improves system stability. The method is evaluated on pure R6G, whose Raman spectrum has a signal-to-noise ratio (SNR) below 10 dB. The de-noising superiority of the proposed method for Raman spectra is verified by three evaluation criteria: SNR, root mean square error (RMSE), and the correlation coefficient ρ.
Keywords: algorithms; mean square error; Raman scattering; system stability
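The "variable step size" in VS-LMS means the adaptation gain itself is driven by the error: large errors enlarge the step for fast convergence, small errors shrink it for low steady-state misadjustment. One classical rule (the paper's exact rule may differ) is the Kwong-Johnston recursion:

```python
def vs_lms_step_size(mu, e, alpha=0.97, gamma=4e-4, mu_max=0.1, mu_min=1e-4):
    """Kwong-Johnston step-size update: mu(n+1) = alpha*mu(n) + gamma*e(n)^2,
    clipped to [mu_min, mu_max]. The constants here are illustrative."""
    return min(mu_max, max(mu_min, alpha * mu + gamma * e * e))
```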
7. Method for Estimating the State of Health of Lithium-ion Batteries Based on Differential Thermal Voltammetry and Sparrow Search Algorithm-Elman Neural Network (cited by 1)
Authors: Yu Zhang, Daoyu Zhang, Tiezhou Wu. 《Energy Engineering》, EI, 2025, Issue 1, pp. 203-220
Precisely estimating the state of health (SOH) of lithium-ion batteries is essential for battery management systems (BMS), as it plays a key role in ensuring the safe and reliable operation of battery systems. However, current SOH estimation methods often overlook the valuable temperature information that can effectively characterize battery aging during capacity degradation. Additionally, the Elman neural network, which is commonly employed for SOH estimation, exhibits several drawbacks: slow training, a tendency to become trapped in local minima, and initialization of weights and thresholds with pseudo-random numbers, which leads to unstable model performance. To address these issues, this study proposes a method for estimating the SOH of lithium-ion batteries based on differential thermal voltammetry (DTV) and an SSA-Elman neural network. First, two health features (HFs) accounting for temperature and battery voltage are extracted from the differential thermal voltammetry curves and incremental capacity curves. Next, the Sparrow Search Algorithm (SSA) is employed to optimize the initial weights and thresholds of the Elman neural network, forming the SSA-Elman model. To validate the performance, various neural networks, including the proposed SSA-Elman network, are tested on the Oxford battery aging dataset. The experimental results demonstrate that the method achieves superior accuracy and robustness, with a mean absolute error (MAE) below 0.9% and a root mean square error (RMSE) below 1.4%.
Keywords: lithium-ion battery; state of health; differential thermal voltammetry; Sparrow Search Algorithm
8. Robustness Optimization Algorithm with Multi-Granularity Integration for Scale-Free Networks Against Malicious Attacks (cited by 1)
Authors: ZHANG Yiheng, LI Jinhai. 《昆明理工大学学报(自然科学版)》 (Journal of Kunming University of Science and Technology, Natural Science Edition), 北大核心, 2025, Issue 1, pp. 54-71
Complex network models are frequently employed for simulating and studying diverse real-world complex systems. Among these models, scale-free networks are typically the most fragile under malicious attacks, so enhancing their robustness has become a pressing issue. To address this problem, this paper proposes a Multi-Granularity Integration Algorithm (MGIA), which improves the robustness of scale-free networks while keeping the initial degree of each node unchanged, ensuring network connectivity, and avoiding the generation of multiple edges. The algorithm builds a multi-granularity structure from the initial network, optimizes the networks at the various granular layers with different strategies, and finally exchanges information between layers to further enhance the optimization effect. New network refresh, crossover, and mutation operators ensure that the optimized network satisfies the given constraints, while new network-similarity and network-dissimilarity metrics improve the effectiveness of the optimization operators. In the experiments, the MGIA enhances the robustness of the scale-free network by 67.6%, approximately 17.2% higher than the optimization effects achieved by eight existing complex-network robustness optimization algorithms.
Keywords: complex network model; multi-granularity; scale-free networks; robustness; algorithm integration
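The MGIA's constraints (every node keeps its degree, no multi-edges, connectivity preserved) are the standard setting for robustness rewiring, usually enforced with degree-preserving edge swaps. A minimal sketch of one candidate swap (connectivity checking omitted; function name and rejection policy are ours, not the paper's operators):

```python
import random


def degree_preserving_swap(edges, rng=None):
    """Swap endpoints of two edges (a-b, c-d) -> (a-d, c-b): every node's
    degree is unchanged. Swaps that would create self-loops or duplicate
    edges are rejected, in which case the original edge set is returned."""
    rng = rng if rng is not None else random.Random(42)
    edge_list = list(edges)
    (a, b), (c, d) = rng.sample(edge_list, 2)
    norm = lambda e: (min(e), max(e))          # undirected-edge key
    existing = {norm(e) for e in edge_list}
    new1, new2 = (a, d), (c, b)
    if a == d or c == b or norm(new1) in existing or norm(new2) in existing:
        return set(edges)  # rejected: would violate simple-graph constraints
    return (set(edges) - {(a, b), (c, d)}) | {new1, new2}
```

A robustness optimizer would accept such a swap only when it improves an attack-robustness score of the rewired network.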
9. Short-Term Wind Power Forecast Based on STL-IAOA-iTransformer Algorithm: A Case Study in Northwest China (cited by 2)
Authors: Zhaowei Yang, Bo Yang, Wenqi Liu, Miwei Li, Jiarong Wang, Lin Jiang, Yiyan Sang, Zhenning Pan. 《Energy Engineering》, 2025, Issue 2, pp. 405-430
Accurate short-term wind power forecasting plays a crucial role in maintaining the safety and economic efficiency of smart grids. Although numerous studies have employed various methods to forecast wind power, there remains a research gap in leveraging swarm intelligence algorithms to optimize the hyperparameters of the Transformer model for wind power prediction. To improve the accuracy of short-term wind power forecasts, this paper proposes a hybrid approach named STL-IAOA-iTransformer, based on seasonal and trend decomposition using LOESS (STL) and an iTransformer model optimized by an improved arithmetic optimization algorithm (IAOA). First, to fully extract the power data features, STL decomposes the original data into components with less redundant information. The extracted components, together with weather data, are then fed into iTransformer for short-term forecasting, and the final predicted wind power curve is obtained by combining the predicted components. To improve accuracy, IAOA is employed to optimize the hyperparameters of iTransformer. The proposed approach is validated with real generation data from different seasons and different power stations in Northwest China, and ablation experiments are conducted. Furthermore, to validate its superiority under different wind characteristics, real generation data from Southwest China are also used. Comparisons with six state-of-the-art prediction models show that the proposed model fits the true generation series well and achieves high prediction accuracy.
Keywords: short-term wind power forecast; improved arithmetic optimization algorithm; iTransformer algorithm; SimuNPS
10. Cancellation for frequency offset in OFDM system based on TF-LMS algorithm (cited by 2)
Authors: 关庆阳, 赵洪林, 郭庆. 《Journal of Central South University》, SCIE EI CAS, 2010, Issue 6, pp. 1293-1299
In an orthogonal frequency division multiplexing (OFDM) system, a time and frequency domain least mean square (TF-LMS) algorithm was proposed to cancel the frequency offset (FO). The TF-LMS algorithm has two stages. First, a time domain least mean square (TD-LMS) scheme pre-cancels the frequency offset in the time domain; then the interference induced by the residual frequency offset is eliminated by a frequency domain least mean square (FD-LMS) scheme. Bit error rate (BER) results and quadrature phase shift keying (QPSK) constellation figures show that the proposed suppression algorithm performs excellently.
Keywords: orthogonal frequency division multiplexing (OFDM); frequency offset; least mean square algorithm; cancellation
11. A LODBO algorithm for multi-UAV search and rescue path planning in disaster areas (cited by 1)
Authors: Liman Yang, Xiangyu Zhang, Zhiping Li, Lei Li, Yan Shi. 《Chinese Journal of Aeronautics》, 2025, Issue 2, pp. 200-213
In disaster relief operations, multiple UAVs can be used to search for trapped people. In recent years, researchers have proposed machine learning-based, sampling-based, and heuristic algorithms for multi-UAV path planning. Among these, the Dung Beetle Optimization (DBO) algorithm has been widely applied thanks to its diverse search patterns. However, the update strategies for the rolling and thieving dung beetles in DBO are overly simplistic, potentially leaving the search space under-explored and the algorithm prone to local optima, so discovery of the optimal path is not guaranteed. To address these issues, we propose an improved DBO algorithm guided by the Landmark Operator (LODBO). Specifically, we first use tent mapping in the population update strategy, which generates initial solutions with greater diversity within the search space. Second, we expand the search range of the ball-rolling dung beetle using the landmark factor. Finally, using an adaptive factor that changes with the number of iterations, we improve the global search ability of the thieving dung beetle, making it more likely to escape local optima. Extensive simulation experiments show that, on the disaster search and rescue task set, the LODBO algorithm obtains the optimal path in the shortest time compared with the Genetic Algorithm (GA), the Grey Wolf Optimizer (GWO), the Whale Optimization Algorithm (WOA), and the original DBO algorithm.
Keywords: unmanned aerial vehicle; path planning; metaheuristic algorithm; DBO algorithm; NP-hard problems
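Tent mapping, used above to diversify the initial population, iterates a chaotic map on (0, 1) and scales each value into the search bounds; the chaotic sequence spreads points more evenly than short runs of uniform random draws. A sketch (map parameter and seed are illustrative, not the paper's values):

```python
def tent_population(n_agents, dim, lo, hi, x0=0.37, a=0.7):
    """Initialize a metaheuristic population from the tent chaotic map:
    x <- x/a if x < a else (1-x)/(1-a), then scale x from (0,1) to [lo,hi]."""
    pop, x = [], x0
    for _ in range(n_agents):
        agent = []
        for _ in range(dim):
            x = x / a if x < a else (1.0 - x) / (1.0 - a)
            agent.append(lo + x * (hi - lo))
        pop.append(agent)
    return pop
```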
12. Research on Euclidean Algorithm and Reflection on Its Teaching
Author: ZHANG Shaohua. 《应用数学》 (Mathematica Applicata), 北大核心, 2025, Issue 1, pp. 308-310
In this paper, we prove that Euclid's algorithm, Bezout's equation, and the Division algorithm are equivalent to each other. Our result shows that Euclid had preliminarily established the theory of divisibility and the greatest common divisor. We further provide several suggestions for teaching.
Keywords: Euclid's algorithm; Division algorithm; Bezout's equation
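The equivalence discussed above can also be seen computationally: running the division steps of Euclid's algorithm while tracking the quotients yields the Bezout coefficients directly. A standard extended-Euclid sketch:

```python
def extended_gcd(a, b):
    """Extended Euclidean algorithm: returns (g, x, y) such that
    g = gcd(a, b) and a*x + b*y == g (Bezout's equation)."""
    x0, y0, x1, y1 = 1, 0, 0, 1
    while b:
        q, r = divmod(a, b)          # one step of the Division algorithm
        a, b = b, r                  # one step of Euclid's algorithm
        x0, x1 = x1, x0 - q * x1     # carry the Bezout coefficients along
        y0, y1 = y1, y0 - q * y1
    return a, x0, y0
```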
13. DDoS Attack Autonomous Detection Model Based on Multi-Strategy Integrated Zebra Optimization Algorithm
Authors: Chunhui Li, Xiaoying Wang, Qingjie Zhang, Jiaye Liang, Aijing Zhang. 《Computers, Materials & Continua》, SCIE EI, 2025, Issue 1, pp. 645-674
Previous studies have shown that deep learning is very effective at detecting known attacks. However, when facing unknown attacks, models such as Deep Neural Networks (DNN) combined with Long Short-Term Memory (LSTM), or Convolutional Neural Networks (CNN) combined with LSTM, are built by simple stacking, which suffers from feature loss, low efficiency, and low accuracy. This paper therefore proposes an autonomous detection model for Distributed Denial of Service attacks, Multi-Scale Convolutional Neural Network-Bidirectional Gated Recurrent Units-Single Headed Attention (MSCNN-BiGRU-SHA), based on a Multi-strategy Integrated Zebra Optimization Algorithm (MI-ZOA). The model is trained and tested with the CICDDoS2019 dataset, and its performance is evaluated on a new GINKS2023 dataset; the hyperparameters Conv_filter and GRU_unit are optimized using the MI-ZOA. The experimental results show that the test accuracy of the MI-ZOA-based MSCNN-BiGRU-SHA model is as high as 0.9971 on the CICDDoS2019 dataset and 0.9386 on the new GINKS2023 dataset created in this paper. Compared to the MSCNN-BiGRU-SHA model based on the plain Zebra Optimization Algorithm (ZOA), detection accuracy on the GINKS2023 dataset improves by 5.81%, precision by 1.35%, recall by 9%, and F1 score by 5.55%. Compared to MSCNN-BiGRU-SHA models tuned with Grid Search, Random Search, and Bayesian Optimization, the MI-ZOA-optimized model performs better in accuracy, precision, recall, and F1 score.
Keywords: distributed denial of service attack; intrusion detection; deep learning; zebra optimization algorithm; multi-strategy integrated zebra optimization algorithm
14. Reliability prediction of rice transplanters based on IWOA-LMBP
Authors: 文昌俊, 陈凡, 康锁, 陈洋洋. 《湖北工业大学学报》 (Journal of Hubei University of Technology), 2025, Issue 2, pp. 1-10
To address the marked lag in reliability evaluation of rice transplanters, an innovative solution is proposed: a BP neural network reliability-prediction model trained with Levenberg-Marquardt (LMBP) and optimized by an improved whale optimization algorithm (IWOA). The design is as follows. First, a Chebyshev chaotic strategy is introduced to increase the diversity of the initial population. Second, "double-staircase" and "double-valley" nonlinear adaptive factors dynamically balance the algorithm's global search and local exploitation. Finally, an optimum-approaching lens-imaging opposition-based learning strategy updates individual positions, further improving solution quality and helping the algorithm escape local optima. Comparative optimization on six benchmark functions and Wilcoxon rank-sum tests show that IWOA has better search performance. An IWOA-LMBP model is then built from rice-transplanter failure data collected by field tracking. To evaluate the model comprehensively, MAE, RMSE, and R² are chosen as evaluation metrics, and the model is compared with five other models. The results show that the IWOA-LMBP model gives the best predictions.
Keywords: rice transplanter; improved whale optimization algorithm; optimum-approaching lens-imaging opposition-based learning; LMBP neural network; reliability prediction
15. Bearing capacity prediction of open caissons in two-layered clays using five tree-based machine learning algorithms (cited by 1)
Authors: Rungroad Suppakul, Kongtawan Sangjinda, Wittaya Jitchaijaroen, Natakorn Phuksuksakul, Suraparb Keawsawasvong, Peem Nuaklong. 《Intelligent Geoengineering》, 2025, Issue 2, pp. 55-65
Open caissons are widely used in foundation engineering because of their load-bearing efficiency and adaptability in diverse soil conditions. However, accurately predicting their undrained bearing capacity in layered soils remains a complex challenge. This study presents a novel application of five ensemble machine learning (ML) algorithms, namely random forest (RF), gradient boosting machine (GBM), extreme gradient boosting (XGBoost), adaptive boosting (AdaBoost), and categorical boosting (CatBoost), to predict the undrained bearing capacity factor (Nc) of circular open caissons embedded in two-layered clay on the basis of results from finite element limit analysis (FELA). The input dataset consists of 1188 numerical simulations using the Tresca failure criterion, with varying geometrical and soil parameters. The FELA was performed with OptumG2 software using adaptive meshing and verified against existing benchmark studies. The ML models were trained on 70% of the dataset and tested on the remaining 30%. Their performance was evaluated with six statistical metrics: coefficient of determination (R²), mean absolute error (MAE), root mean squared error (RMSE), index of scatter (IOS), RMSE-to-standard-deviation ratio (RSR), and variance explained factor (VAF). The results indicate that all the models achieved high accuracy, with R² values exceeding 97.6% and RMSE values below 0.02. Among them, AdaBoost and CatBoost consistently outperformed the other methods across both the training and testing datasets, demonstrating superior generalizability and robustness. The proposed ML framework offers an efficient, accurate, data-driven alternative to traditional methods for estimating caisson capacity in stratified soils, reducing computational costs while improving reliability in the early stages of foundation design.
Keywords: two-layered clay; open caisson; tree-based algorithms; FELA; machine learning
16. Path Planning for Thermal Power Plant Fan Inspection Robot Based on Improved A* Algorithm (cited by 1)
Authors: Wei Zhang, Tingfeng Zhang. 《Journal of Electronic Research and Application》, 2025, Issue 1, pp. 233-239
To improve the efficiency and accuracy of path planning for fan inspection tasks in thermal power plants, this paper proposes an intelligent inspection-robot path planning scheme based on an improved A* algorithm. The inspection robot uses multiple sensors to monitor key parameters of the fans, such as vibration, noise, and bearing temperature, and uploads the data to the monitoring center. The robot's inspection path employs the improved A* algorithm, which incorporates obstacle penalty terms, path reconstruction, and smoothing optimization, achieving optimal path planning for the inspection robot in complex environments. Simulation results demonstrate that the improved A* algorithm significantly outperforms the traditional A* algorithm in total path distance, smoothness, and detour rate, effectively improving the execution efficiency of inspection tasks.
Keywords: power plant fans; inspection robot; path planning; improved A* algorithm
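For reference, the baseline that such improvements start from is plain A* on an occupancy grid; the sketch below uses 4-connected moves and a Manhattan heuristic, without the paper's obstacle-penalty, path-reconstruction, or smoothing terms:

```python
import heapq


def astar(grid, start, goal):
    """Plain A* on a 4-connected grid; grid[r][c] == 1 marks an obstacle.
    Returns the path as a list of (row, col) cells, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # admissible
    open_heap = [(h(start), 0, start)]
    came, g = {}, {start: 0}
    while open_heap:
        _, cost, cur = heapq.heappop(open_heap)
        if cur == goal:
            path = [cur]
            while cur in came:
                cur = came[cur]
                path.append(cur)
            return path[::-1]
        if cost > g[cur]:
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0
                    and cost + 1 < g.get(nxt, float("inf"))):
                g[nxt] = cost + 1
                came[nxt] = cur
                heapq.heappush(open_heap, (cost + 1 + h(nxt), cost + 1, nxt))
    return None
```

An obstacle-penalty variant would replace the constant edge cost of 1 with a term that grows near obstacles, pushing paths away from walls before smoothing.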
17. An Algorithm for Cloud-based Web Service Combination Optimization Through Plant Growth Simulation
Authors: Li Qiang, Qin Huawei, Qiao Bingqin, Wu Ruifang. 《系统仿真学报》 (Journal of System Simulation), 北大核心, 2025, Issue 2, pp. 462-473
To improve the efficiency of cloud-based web services, a scheduling model based on an improved plant growth simulation algorithm is proposed. The model first describes mathematically the relationships between cloud-based web services and the constraints of system resources. A light-induced plant growth simulation algorithm is then established; its performance is compared across several plant types, and the best plant model is selected as the system setting. Experimental results show that when the number of test cloud-based web services reaches 2048, the model is 2.14 times faster than PSO, 2.8 times faster than the ant colony algorithm, 2.9 times faster than the bee colony algorithm, and 8.38 times faster than the genetic algorithm.
Keywords: cloud-based service; scheduling algorithm; resource constraint; load optimization; cloud computing; plant growth simulation algorithm
18. Improved algorithm of multi-mainlobe interference suppression under uncorrelated and coherent conditions (cited by 1)
Authors: CAI Miaohong, CHENG Qiang, MENG Jinli, ZHAO Dehua. 《Journal of Southeast University (English Edition)》, 2025, Issue 1, pp. 84-90
A new method based on the iterative adaptive algorithm (IAA) and blocking matrix preprocessing (BMP) is proposed for the suppression of multi-mainlobe interference. The algorithm precisely estimates the spatial spectrum and the directions of arrival (DOA) of interferences, overcoming the drawbacks of conventional adaptive beamforming (ABF) methods. The mainlobe interferences are identified by computing the correlation coefficients between direction steering vectors (SVs) and are rejected by the BMP pretreatment. IAA is subsequently employed to reconstruct a sidelobe interference-plus-noise covariance matrix for improved ABF and residual interference suppression. Simulation results demonstrate the superiority of the proposed method over standard methods based on BMP and eigen-projection matrix preprocessing (EMP) under both uncorrelated and coherent conditions.
Keywords: mainlobe interference suppression; adaptive beamforming; spatial spectral estimation; iterative adaptive algorithm; blocking matrix preprocessing
19. Intelligent sequential multi-impulse collision avoidance method for non-cooperative spacecraft based on an improved search tree algorithm (cited by 1)
Authors: Xuyang CAO, Xin NING, Zheng WANG, Suyi LIU, Fei CHENG, Wenlong LI, Xiaobin LIAN. 《Chinese Journal of Aeronautics》, 2025, Issue 4, pp. 378-393
The problem of collision avoidance for non-cooperative targets has received significant attention in recent years. Non-cooperative targets exhibit uncertain states and unpredictable behaviors, making collision avoidance significantly more challenging than for space debris. Much existing research focuses on the continuous-thrust model, whereas the impulsive maneuver model is more appropriate for long-duration, long-distance avoidance missions. It is also important to minimize the impact on the original mission while avoiding non-cooperative targets. Moreover, existing avoidance algorithms are computationally complex and time-consuming, especially given the limited computing capability of on-board computers, posing challenges for practical engineering applications. To conquer these difficulties, this paper makes the following key contributions: (A) a turn-based (sequential decision-making) limited-area impulsive collision avoidance model considering the time delay of precision orbit determination is established for the first time; (B) a novel Selection Probability Learning Adaptive Search-depth Search Tree (SPL-ASST) algorithm is proposed for non-cooperative target avoidance, which improves decision-making efficiency by introducing an adaptive-search-depth mechanism and a neural network into the traditional Monte Carlo Tree Search (MCTS). Numerical simulations confirm the effectiveness and efficiency of the proposed method.
Keywords: non-cooperative target; collision avoidance; limited motion area; impulsive maneuver model; search tree algorithm; neural networks
20. A Class of Parallel Algorithm for Solving Low-rank Tensor Completion
Authors: LIU Tingyan, WEN Ruiping. 《应用数学》 (Mathematica Applicata), 北大核心, 2025, Issue 4, pp. 1134-1144
In this paper, we establish a class of parallel algorithms for solving the low-rank tensor completion problem. The main idea is that N singular value decompositions are carried out on N different processors, one for each slice matrix under the unfold operator; the fold operator then assembles the next iterate, so that computing time is reduced. In theory, we analyze the global convergence of the algorithm. Numerical experiments on simulated data and real image inpainting show that the parallel algorithm outperforms the original algorithm in CPU time at the same precision.
Keywords: tensor completion; low rank; convergence; parallel algorithm
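The slice-wise structure described above (one SVD per slice matrix, then fold) is what makes the method parallel: each frontal slice can be shrunk independently. A minimal sequential sketch of a slice-wise singular-value-thresholding iteration (a generic SVT-style update, not necessarily the paper's exact scheme; function name and parameters are ours):

```python
import numpy as np


def slicewise_svt(T, mask, tau=0.5, n_iter=10):
    """Per-slice singular value thresholding for 3-way tensor completion.
    T    -- data tensor (entries valid where mask is True)
    mask -- boolean tensor of observed entries
    Each frontal slice X[:, :, k] is shrunk by an independent SVD; these
    slices are the units a parallel version would farm out to N processors."""
    X = np.where(mask, T, 0.0)
    for _ in range(n_iter):
        for k in range(X.shape[2]):
            U, s, Vt = np.linalg.svd(X[:, :, k], full_matrices=False)
            # soft-threshold the singular values of this slice
            X[:, :, k] = (U * np.maximum(s - tau, 0.0)) @ Vt
        X[mask] = T[mask]  # re-impose the observed entries
    return X
```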