Journal Articles
280,364 articles found
1. Research on Monitoring and Prediction of Oil and Gas Well Cementing Construction Parameters Based on CNN-BiLSTM-Attention (Cited: 1)
Authors: 田军政, 谢雄武, 钱坤, 刘长春, 马业. 《计算机测量与控制》, 2025, No. 2, pp. 54-62 (9 pages)
In deepwater, deep-well, and ultra-deep-well oil and gas exploration, well cementing operations face multiple challenges such as high operational risk and heavy labor intensity, which make it difficult to monitor construction parameters and predict construction progress. To address these problems, this study investigates the monitoring of key cementing construction parameters and progress prediction based on cloud-edge collaboration and deep learning. Through a cloud-edge collaborative network, cementing flow rate, pressure, temperature, and other data are collected and stored on site and transmitted remotely via the lightweight MQTT communication protocol. A mathematical model for predicting cementing construction progress based on a CNN-BiLSTM-Attention network is developed: the CNN extracts the key feature elements of cementing progress, the BiLSTM mines the relationships among these features, and the attention mechanism assigns weights to important features so that the most critical progress information can be extracted. Experimental tests show that the proposed method realizes parameter monitoring and prediction for oil and gas wells, offers a clear advantage in prediction accuracy, and that the cloud-edge collaborative platform can reflect the key parameters of the cementing process in real time.
Keywords: cloud-edge collaboration; CNN-BiLSTM-Attention; oil and gas well cementing; key construction parameter monitoring; construction progress prediction
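Several entries in this listing (1, 4, 6, 7, 8, 9, 14) build on the same CNN-BiLSTM-Attention backbone: a 1-D CNN for local feature extraction, a BiLSTM for temporal dependencies, and an attention layer that weights time steps. Below is a minimal sketch of that architecture, assuming PyTorch; the layer sizes, input shape, and additive attention formulation are illustrative assumptions, not the configuration of any cited paper.

```python
import torch
import torch.nn as nn

class CNNBiLSTMAttention(nn.Module):
    """Minimal CNN-BiLSTM-Attention regressor for multivariate time series.

    Input shape: (batch, seq_len, n_features); output: (batch, 1).
    All sizes are illustrative assumptions, not values from the cited papers.
    """
    def __init__(self, n_features: int, cnn_channels: int = 32, lstm_hidden: int = 64):
        super().__init__()
        # 1-D convolution over the time axis extracts local patterns.
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, cnn_channels, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # Bidirectional LSTM models temporal dependencies in both directions.
        self.bilstm = nn.LSTM(cnn_channels, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # Additive attention scores each time step; a weighted sum pools them.
        self.attn = nn.Linear(2 * lstm_hidden, 1)
        self.head = nn.Linear(2 * lstm_hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # (batch, seq, feat) -> (batch, feat, seq) for Conv1d, then back.
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)
        h, _ = self.bilstm(h)                         # (batch, seq, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)  # (batch, seq, 1)
        context = (weights * h).sum(dim=1)            # attention-weighted pooling
        return self.head(context)

if __name__ == "__main__":
    model = CNNBiLSTMAttention(n_features=5)
    dummy = torch.randn(8, 24, 5)  # 8 samples, 24 time steps, 5 sensor channels
    print(model(dummy).shape)      # torch.Size([8, 1])
```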
2. A CNN-BiLSTM-Attention Prediction Model for the Time-Varying Stability Safety Factor of Gravity Dams
Authors: 曹宇鑫, 张瀚, 尹金超, 李亚楠. 《人民珠江》, 2025, No. 4, pp. 1-8 (8 pages)
Under complex operating conditions such as high water pressure and high seepage pressure, accurately capturing the time-varying behavior of the gravity dam safety factor and predicting it effectively is essential for the scientific management of dam operating states. To this end, a coupled prediction model based on the deep-learning CNN-BiLSTM-Attention method is proposed, taking the upstream water level, the along-river crest displacement, and a time-effect term as independent variables and the anti-sliding stability safety factor as the dependent variable. Applied to a gravity dam project with a dam height of 148.0 m, the model achieves a fitting mean absolute error (MAE) of 1.12×10⁻³ and a root mean square error (RMSE) of 1.66×10⁻³, and prediction errors of MAE = 3.08×10⁻³ and RMSE = 3.53×10⁻³. Compared with a traditional statistical regression method, prediction accuracy improves by 51.80% and 45.44%; compared with a support vector machine (SVM), it improves by 16.08% and 10.18%. The model fits the finite element result curve more closely and shows a clear advantage in prediction accuracy.
Keywords: CNN-BiLSTM-Attention; gravity dam; early-warning indicator; prediction model
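Most entries in this listing report accuracy as MAE and RMSE and compare models by the relative error reduction. A small sketch of those calculations, assuming NumPy; the sample numbers are made up and are not from any cited paper.

```python
import numpy as np

def mae(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean absolute error."""
    return float(np.mean(np.abs(y_true - y_pred)))

def rmse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Root mean square error."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def improvement_pct(err_baseline: float, err_model: float) -> float:
    """Relative error reduction in percent, as reported in the abstracts."""
    return 100.0 * (err_baseline - err_model) / err_baseline

if __name__ == "__main__":
    y = np.array([1.00, 1.02, 0.98, 1.05])   # hypothetical measured safety factors
    p = np.array([1.01, 1.00, 0.99, 1.03])   # hypothetical model predictions
    print(mae(y, p), rmse(y, p), improvement_pct(0.0064, 0.00308))
```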
3. Reference Evapotranspiration Estimation with CNN-BiLSTM-Attention Optimized by the Rime Optimization Algorithm
Authors: 付桐林, 金晶. 《中国沙漠》 (北大核心), 2025, No. 3, pp. 302-312 (11 pages)
Accurately estimating evapotranspiration with deep learning under limited meteorological inputs is important for the efficient use and management of scarce water resources in arid regions. Existing evapotranspiration estimation based on the hybrid deep-learning model CNN-BiLSTM-Attention neglects parameter optimization, so its accuracy falls short of practical requirements. This paper proposes a new hybrid model, RIME-CNN-BiLSTM-Attention, in which the rime optimization algorithm (RIME) tunes the hyperparameters of CNN-BiLSTM-Attention, achieving accurate prediction of reference evapotranspiration (ET₀) in Linze County under limited meteorological inputs. Compared with CNN-BiLSTM-Attention, the mean absolute percentage error (MAPE) of RIME-CNN-BiLSTM-Attention drops from 14.56% to 14.09%, and the coefficient of determination rises from 0.8654 to 0.8930. Numerical results further show that RIME-CNN-BiLSTM-Attention outperforms the hybrid models HHO-CNN-BiLSTM-Attention, OOA-CNN-BiLSTM-Attention, and NGO-CNN-BiLSTM-Attention, in which the Harris hawks optimization (HHO), the osprey optimization algorithm (OOA), and the northern goshawk optimization (NGO) are used instead, indicating that the proposed hybrid model is more robust and more accurate and can reliably estimate ET₀ in the study area.
Keywords: reference evapotranspiration; rime optimization algorithm; convolutional neural network; bidirectional long short-term memory network; attention mechanism
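Entries 3, 5, 6, 7, 8, 9, and several others wrap a metaheuristic (RIME, IPOA, TANSSSA, IDBO, GWO, COA) around the network to tune its hyperparameters against validation error. The sketch below is a generic population-based search skeleton standing in for that pattern; the move-toward-best update rule is an assumption for illustration and is not the RIME algorithm or any other cited optimizer.

```python
import numpy as np

def population_search(objective, lower, upper, pop_size=20, iters=50, seed=0):
    """Generic population-based hyperparameter search skeleton.

    Candidates drift toward the best solution found so far with a shrinking
    random perturbation. Illustrative only; not a specific metaheuristic.
    """
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    pop = lower + rng.random((pop_size, lower.size)) * (upper - lower)
    fitness = np.array([objective(p) for p in pop])
    best = pop[fitness.argmin()].copy()
    for t in range(iters):
        step = (1.0 - t / iters) * (upper - lower) * 0.1       # shrinking step size
        cand = pop + rng.normal(size=pop.shape) * step + 0.5 * (best - pop)
        cand = np.clip(cand, lower, upper)
        cand_fit = np.array([objective(p) for p in cand])
        improved = cand_fit < fitness
        pop[improved], fitness[improved] = cand[improved], cand_fit[improved]
        best = pop[fitness.argmin()].copy()
    return best, fitness.min()

if __name__ == "__main__":
    # Toy objective standing in for the validation error of a network as a
    # function of (learning-rate exponent, hidden units); values are made up.
    obj = lambda p: (p[0] + 3.0) ** 2 + (p[1] - 64.0) ** 2 / 100.0
    print(population_search(obj, lower=[-5.0, 16.0], upper=[-1.0, 128.0]))
```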
4. Research on an Energy Consumption Prediction Model for IT Equipment in Industrial Data Centers Based on CNN-BiLSTM-Attention
Authors: 宋越, 靳晟, 林栎, 高国强, 郭付展. 《电子技术应用》, 2025, No. 10, pp. 63-68 (6 pages)
The energy consumption of IT equipment directly affects the electricity use of an industrial data center, so predicting it is important for energy management and resource planning. However, IT energy consumption data are nonlinear and non-stationary, which leads to low prediction accuracy. To address this, the strengths of a convolutional neural network (CNN), a bidirectional long short-term memory network (BiLSTM), and an attention mechanism are combined to extract, respectively, the local features of IT equipment energy consumption and the deeper key information in the data, and an energy consumption prediction model based on CNN-BiLSTM-Attention is built on a self-collected IT equipment energy dataset. The model achieves R² = 0.9053, MAE = 0.0504, and RMSE = 0.0673, improving to varying degrees on the existing LSTM, BiLSTM, and CNN-BiLSTM models, which indicates that it can be applied to accurate prediction of IT equipment energy consumption in industrial data centers.
Keywords: energy consumption prediction model; CNN-BiLSTM-Attention; industrial data center; deep learning
5. Tool Wear State Recognition Based on an IPOA-MSCNN-BiLSTM-Attention Model
Authors: 杨焕峥, 崔业梅, 薛洪惠, 徐玲. 《组合机床与自动化加工技术》 (北大核心), 2025, No. 7, pp. 158-163 (6 pages)
Tool condition monitoring directly affects machining quality. To improve the accuracy of tool wear state recognition, an IPOA-MSCNN-BiLSTM-Attention model is constructed. First, a multi-scale convolutional neural network (MSCNN) and a bidirectional long short-term memory network (BiLSTM) learn the spatio-temporal features of the data. Second, an attention mechanism is introduced to strengthen the focus on key information. Third, an improved pelican optimization algorithm (IPOA) is proposed to optimize the parameters of the multi-scale convolutional network; it combines an adaptive inertia weight factor, Cauchy mutation, and a sparrow alarm mechanism strategy, and its overall performance on the CEC2005 to CEC2022 benchmark functions is better than five algorithms including the traditional POA. Finally, the model is deployed on an industrial personal computer (IPC). The results show that the model achieves high recognition accuracy for tool wear states and can improve machining safety and production efficiency.
Keywords: tool wear; condition monitoring; improved pelican optimization algorithm; multi-scale convolutional neural network; bidirectional long short-term memory network
6. A UWB Ranging Error Mitigation Algorithm Based on an Improved SSA and CNN-BiLSTM-Attention
Authors: 张翠, 刘津铭, 郑新鹏, 张烈平. 《电子测量技术》 (北大核心), 2025, No. 10, pp. 51-61 (11 pages)
To address the ranging errors of ultra-wideband (UWB) systems in real environments, a UWB ranging error mitigation algorithm based on an improved sparrow search algorithm and a convolutional bidirectional long short-term attention model is proposed. The SSA is improved with Tent mapping, an adaptive adjustment method, the northern goshawk optimization algorithm, and a spiral flight strategy in the follower stage, which strengthens its global search capability and helps it avoid local optima; the improved algorithm is named TANSSSA. The CNN-LSTM model is improved with a BiLSTM module and an attention mechanism to build the CNN-BiLSTM-Attention model, improving its ability to handle long sequences and to assign more precise weights to the data. TANSSSA is then used to optimize the hyperparameters of CNN-BiLSTM-Attention, yielding the TANSSSA-CNN-BiLSTM-Attention model. In model validation experiments against the SSA-CNN-BiLSTM-Attention, CNN-BiLSTM-Attention, CNN-BiLSTM, CNN-LSTM-Attention, CNN-LSTM, GRU, and TCN models, the mean absolute error is reduced by 12.05% to 62.31%. In a real environment, TANSSSA-CNN-BiLSTM-Attention reduces the mean absolute error by 45.70% to 83.82% compared with the other seven models, effectively mitigating the ranging error.
Keywords: ultra-wideband; sparrow search algorithm; convolutional neural network; bidirectional long short-term memory network; attention mechanism
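This entry (and entry 13 below) seeds the metaheuristic population with a Tent chaotic map rather than uniform random numbers. A minimal sketch of Tent-map initialization, assuming NumPy; the bounds, map parameter, and population size are illustrative assumptions.

```python
import numpy as np

def tent_map_init(pop_size: int, dim: int, lower: np.ndarray, upper: np.ndarray,
                  mu: float = 2.0, seed: int = 0) -> np.ndarray:
    """Initialize a population with a Tent chaotic map instead of uniform noise.

    The Tent map x_{k+1} = mu*x_k (x_k < 0.5) else mu*(1 - x_k) spreads points
    over (0, 1); each chaotic value is then scaled to [lower, upper].
    """
    rng = np.random.default_rng(seed)
    x = rng.random(dim)                      # one chaotic seed per dimension
    pop = np.empty((pop_size, dim))
    for i in range(pop_size):
        x = np.where(x < 0.5, mu * x, mu * (1.0 - x))
        x = np.clip(x, 1e-12, 1.0 - 1e-12)   # keep the sequence inside (0, 1)
        pop[i] = lower + x * (upper - lower)
    return pop

if __name__ == "__main__":
    lb, ub = np.array([1.0, 16.0]), np.array([128.0, 256.0])  # two hypothetical hyperparameters
    print(tent_map_init(pop_size=5, dim=2, lower=lb, upper=ub))
```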
7. A Series Arc Fault Detection Method Based on CNN-BiLSTM-Attention Optimized by an Improved Dung Beetle Optimizer (Cited: 3)
Authors: 李海波. 《电器与能效管理技术》, 2024, No. 8, pp. 57-68 (12 pages)
To address insufficient arc fault feature extraction and limited detection accuracy, a multi-feature-fusion series arc fault detection method is proposed in which an improved dung beetle optimizer (IDBO) tunes a convolutional neural network (CNN) and bidirectional long short-term memory (BiLSTM) network fused with an attention mechanism. Time-domain, frequency-domain, time-frequency-domain, and signal autoregressive parametric model features of the current are extracted on an experimental platform; kernel principal component analysis (KPCA) reduces and fuses the features, and the resulting feature vectors are used as the input of CNN-BiLSTM-Attention. Cubic chaotic mapping, a spiral search strategy, dynamic weight coefficients, and a Gaussian-Cauchy mutation strategy are introduced to improve the dung beetle optimizer, which is then used to optimize the hyperparameters of CNN-BiLSTM-Attention for series arc fault diagnosis. The results show that the proposed method reaches an arc fault detection accuracy of 97.92% and can identify series arc faults efficiently.
Keywords: arc fault; improved dung beetle optimizer; multi-feature fusion; CNN-BiLSTM-Attention
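The entry above reduces and fuses the extracted current features with kernel principal component analysis (KPCA) before they reach the classifier. A short sketch of that step, assuming scikit-learn's KernelPCA; the feature matrix, kernel choice, and component count are made-up placeholders, not the paper's settings.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.preprocessing import StandardScaler

# Hypothetical feature matrix: rows are current samples, columns are the
# time/frequency/time-frequency/AR-model features mentioned in the abstract.
rng = np.random.default_rng(0)
features = rng.normal(size=(200, 24))

# Standardize, then project onto a handful of nonlinear principal components.
scaled = StandardScaler().fit_transform(features)
kpca = KernelPCA(n_components=8, kernel="rbf", gamma=0.05)
fused = kpca.fit_transform(scaled)

print(fused.shape)  # (200, 8) -> fed to the downstream classifier
```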
8. Research on a Wind Power Prediction Model with CNN-BiLSTM-Attention Optimized by GWO
Authors: 王立涛, 王帅杰, 李潇潇. 《黑龙江科学》, 2025, No. 16, pp. 86-89 (4 pages)
Wind power prediction is essential for the economic operation of power systems and for integrating large-scale wind power into the grid. To improve prediction accuracy, outliers are removed with the interquartile range method, missing values are filled by linear interpolation, and the data are normalized. A convolutional neural network (CNN) extracts local spatial features, a bidirectional long short-term memory network (BiLSTM) captures temporal dependencies, and a self-attention mechanism dynamically assigns feature weights, forming a combined CNN-BiLSTM-Attention wind power prediction model whose hyperparameters are optimized by the grey wolf optimizer (GWO). Simulation results and evaluation metrics verify the effectiveness of the model.
Keywords: wind power prediction; grey wolf optimizer; convolutional neural network; bidirectional long short-term memory network; self-attention mechanism
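The preprocessing described above (interquartile-range outlier removal, linear interpolation of missing values, then normalization) can be sketched as follows, assuming pandas; the 1.5×IQR fence and min-max scaling are common defaults and are assumptions rather than the paper's exact choices.

```python
import numpy as np
import pandas as pd

def preprocess(power: pd.Series) -> pd.Series:
    """IQR outlier removal, linear interpolation, then min-max normalization."""
    q1, q3 = power.quantile(0.25), power.quantile(0.75)
    iqr = q3 - q1
    # Values outside the 1.5*IQR fences become NaN, i.e. treated as missing.
    cleaned = power.where((power >= q1 - 1.5 * iqr) & (power <= q3 + 1.5 * iqr))
    filled = cleaned.interpolate(method="linear").bfill().ffill()
    return (filled - filled.min()) / (filled.max() - filled.min())

if __name__ == "__main__":
    s = pd.Series([1.0, 1.2, 40.0, 1.1, np.nan, 1.3])  # 40.0 acts as an outlier
    print(preprocess(s).round(3).tolist())
```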
9. Short-Term Wind Power Prediction Based on COA-CNN-BiLSTM-Attention
Authors: 李立. 《长春师范大学学报》, 2025, No. 6, pp. 61-67 (7 pages)
Wind power prediction is an important way to improve the stability of grid-connected wind power and to reduce the peak-shaving and frequency-regulation burden on the grid. A single prediction model can hardly capture the complex relationships among the many factors that influence wind power, so to improve the accuracy and stability of the prediction this study proposes a combined wind power prediction model based on COA-CNN-BiLSTM-Attention. The Pearson correlation coefficient is first used to screen the factors influencing wind power; a convolutional neural network (CNN) then extracts spatial features, and the feature sequences are fed into a bidirectional long short-term memory network (BiLSTM) for temporal modeling; finally, the coati optimization algorithm (COA) tunes the parameters of the combined model and an attention mechanism assigns the weights. Experimental results show that, compared with three other models, COA-CNN-BiLSTM-Attention reduces MAE by 55.9%, 41.5%, and 39.3%, reduces RMSE by 63.2%, 53.6%, and 51.4%, and raises R² by 9.1%, 5.2%, and 4.8%, confirming its effectiveness and superiority in short-term wind power prediction.
Keywords: wind power prediction; convolutional neural network; bidirectional long short-term memory network; attention mechanism; coati optimization algorithm; combined model
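The entry above screens input factors with the Pearson correlation coefficient before feeding them to the network. A minimal sketch of that screening step, assuming pandas; the dataset, column names, and 0.3 threshold are made up for illustration.

```python
import numpy as np
import pandas as pd

# Hypothetical dataset: meteorological drivers plus the measured wind power.
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "wind_speed":  rng.normal(8, 2, 500),
    "temperature": rng.normal(15, 5, 500),
    "pressure":    rng.normal(1013, 4, 500),
})
df["power"] = 0.9 * df["wind_speed"] + rng.normal(0, 1, 500)

# Keep only features whose |Pearson r| with power exceeds a chosen threshold.
corr = df.corr(method="pearson")["power"].drop("power")
selected = corr[corr.abs() > 0.3].index.tolist()
print(corr.round(3).to_dict(), selected)
```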
10. Method for Estimating the State of Health of Lithium-ion Batteries Based on Differential Thermal Voltammetry and Sparrow Search Algorithm-Elman Neural Network (Cited: 1)
Authors: Yu Zhang, Daoyu Zhang, Tiezhou Wu. 《Energy Engineering》 (EI), 2025, No. 1, pp. 203-220 (18 pages)
Precisely estimating the state of health (SOH) of lithium-ion batteries is essential for battery management systems (BMS), as it plays a key role in ensuring the safe and reliable operation of battery systems. However, current SOH estimation methods often overlook the valuable temperature information that can effectively characterize battery aging during capacity degradation. Additionally, the Elman neural network, which is commonly employed for SOH estimation, exhibits several drawbacks, including slow training speed, a tendency to become trapped in local minima, and the initialization of weights and thresholds using pseudo-random numbers, leading to unstable model performance. To address these issues, this study tackles the challenge of precise and effective SOH detection by proposing a method for estimating the SOH of lithium-ion batteries based on differential thermal voltammetry (DTV) and an SSA-Elman neural network. Firstly, two health features (HFs) considering temperature factors and battery voltage are extracted from the differential thermal voltammetry curves and incremental capacity curves. Next, the Sparrow Search Algorithm (SSA) is employed to optimize the initial weights and thresholds of the Elman neural network, forming the SSA-Elman neural network model. To validate the performance, various neural networks, including the proposed SSA-Elman network, are tested using the Oxford battery aging dataset. The experimental results demonstrate that the method developed in this study achieves superior accuracy and robustness, with a mean absolute error (MAE) of less than 0.9% and a root mean square error (RMSE) below 1.4%.
Keywords: lithium-ion battery; state of health; differential thermal voltammetry; Sparrow Search Algorithm
11. Robustness Optimization Algorithm with Multi-Granularity Integration for Scale-Free Networks Against Malicious Attacks (Cited: 1)
Authors: ZHANG Yiheng, LI Jinhai. 《昆明理工大学学报(自然科学版)》 (北大核心), 2025, No. 1, pp. 54-71 (18 pages)
Complex network models are frequently employed for simulating and studying diverse real-world complex systems. Among these models, scale-free networks typically exhibit greater fragility to malicious attacks. Consequently, enhancing the robustness of scale-free networks has become a pressing issue. To address this problem, this paper proposes a Multi-Granularity Integration Algorithm (MGIA), which aims to improve the robustness of scale-free networks while keeping the initial degree of each node unchanged, ensuring network connectivity and avoiding the generation of multiple edges. The algorithm generates a multi-granularity structure from the initial network to be optimized, then uses different optimization strategies to optimize the networks at various granular layers in this structure, and finally realizes the information exchange between different granular layers, thereby further enhancing the optimization effect. We propose new network refresh, crossover, and mutation operators to ensure that the optimized network satisfies the given constraints. Meanwhile, we propose new network similarity and network dissimilarity evaluation metrics to improve the effectiveness of the optimization operators in the algorithm. In the experiments, the MGIA enhances the robustness of the scale-free network by 67.6%. This improvement is approximately 17.2% higher than the optimization effects achieved by eight currently existing complex network robustness optimization algorithms.
Keywords: complex network model; multi-granularity; scale-free networks; robustness; algorithm integration
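For context, attack robustness of a scale-free network is commonly measured by how the largest connected component shrinks as the highest-degree nodes are removed one by one. The sketch below computes such a generic robustness score with networkx on a Barabási-Albert graph; it illustrates the problem setting only and is not the MGIA itself or the paper's exact metric.

```python
import networkx as nx

def attack_robustness(g: nx.Graph) -> float:
    """Average fraction of nodes in the largest connected component while
    nodes are removed in descending-degree order (a Schneider-style R score).
    """
    g = g.copy()
    n = g.number_of_nodes()
    total = 0.0
    for _ in range(n - 1):
        node = max(g.degree, key=lambda kv: kv[1])[0]  # current highest-degree node
        g.remove_node(node)
        total += max(len(c) for c in nx.connected_components(g)) / n
    return total / n

if __name__ == "__main__":
    ba = nx.barabasi_albert_graph(200, 2, seed=0)  # a scale-free test network
    print(round(attack_robustness(ba), 4))
```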
12. Short-Term Wind Power Forecast Based on STL-IAOA-iTransformer Algorithm: A Case Study in Northwest China (Cited: 2)
Authors: Zhaowei Yang, Bo Yang, Wenqi Liu, Miwei Li, Jiarong Wang, Lin Jiang, Yiyan Sang, Zhenning Pan. 《Energy Engineering》, 2025, No. 2, pp. 405-430 (26 pages)
Accurate short-term wind power forecast techniques play a crucial role in maintaining the safety and economic efficiency of smart grids. Although numerous studies have employed various methods to forecast wind power, there remains a research gap in leveraging swarm intelligence algorithms to optimize the hyperparameters of the Transformer model for wind power prediction. To improve the accuracy of short-term wind power forecasting, this paper proposes a hybrid approach named STL-IAOA-iTransformer, which is based on seasonal and trend decomposition using LOESS (STL) and an iTransformer model optimized by an improved arithmetic optimization algorithm (IAOA). First, to fully extract the power data features, STL is used to decompose the original data into components with less redundant information. The extracted components as well as the weather data are then input into iTransformer for short-term wind power forecasting. The final predicted short-term wind power curve is obtained by combining the predicted components. To improve the model accuracy, IAOA is employed to optimize the hyperparameters of iTransformer. The proposed approach is validated using real generation data from different seasons and different power stations in Northwest China, and ablation experiments have been conducted. Furthermore, to validate the superiority of the proposed approach under different wind characteristics, real power generation data from southwest China are utilized for experiments. The comparative results with six other state-of-the-art prediction models show that the proposed model fits the true value of the generation series well and achieves high prediction accuracy.
Keywords: short-term wind power forecast; improved arithmetic optimization algorithm; iTransformer algorithm; SimuNPS
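The first stage of the approach above is STL decomposition of the power series into trend, seasonal, and residual components. A minimal sketch using statsmodels' STL on a synthetic hourly series; the period, series, and robust fitting flag are assumptions for illustration.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

# Hypothetical hourly wind power series with a daily cycle plus noise.
idx = pd.date_range("2024-01-01", periods=24 * 14, freq="h")
power = (50 + 20 * np.sin(2 * np.pi * idx.hour / 24)
         + np.random.default_rng(0).normal(0, 3, len(idx)))
series = pd.Series(power, index=idx)

# Decompose into trend, seasonal, and residual components before forecasting.
result = STL(series, period=24, robust=True).fit()
components = pd.DataFrame({
    "trend": result.trend,
    "seasonal": result.seasonal,
    "resid": result.resid,
})
print(components.head())
```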
13. A LODBO algorithm for multi-UAV search and rescue path planning in disaster areas (Cited: 1)
Authors: Liman Yang, Xiangyu Zhang, Zhiping Li, Lei Li, Yan Shi. 《Chinese Journal of Aeronautics》, 2025, No. 2, pp. 200-213 (14 pages)
In disaster relief operations, multiple UAVs can be used to search for trapped people. In recent years, many researchers have proposed machine learning-based algorithms, sampling-based algorithms, and heuristic algorithms to solve the problem of multi-UAV path planning. Among these, the Dung Beetle Optimization (DBO) algorithm has been widely applied due to its diverse search patterns. However, the update strategies for the rolling and thieving dung beetles of the DBO algorithm are overly simplistic, potentially leading to an inability to fully explore the search space and a tendency to converge to local optima, thereby not guaranteeing the discovery of the optimal path. To address these issues, we propose an improved DBO algorithm guided by the Landmark Operator (LODBO). Specifically, we first use tent mapping to update the population strategy, which enables the algorithm to generate initial solutions with enhanced diversity within the search space. Second, we expand the search range of the rolling-ball dung beetle by using the landmark factor. Finally, by using an adaptive factor that changes with the number of iterations, we improve the global search ability of the stealing dung beetle, making it more likely to escape from local optima. To verify the effectiveness of the proposed method, extensive simulation experiments are conducted, and the results show that the LODBO algorithm can obtain the optimal path in the shortest time compared with the Genetic Algorithm (GA), the Gray Wolf Optimizer (GWO), the Whale Optimization Algorithm (WOA), and the original DBO algorithm on the disaster search and rescue task set.
Keywords: unmanned aerial vehicle; path planning; metaheuristic algorithm; DBO algorithm; NP-hard problems
14. Research on Fault Diagnosis and Prediction for Substation Equipment Operation and Maintenance Based on a CNN-BiLSTM-Attention Model
Authors: 刘磊, 周毅, 陈芝屹. 《电力设备管理》, 2024, No. 13, pp. 14-16 (3 pages)
This study aims to improve fault diagnosis and prediction in substation equipment operation and maintenance through CNN-BiLSTM-Attention model analysis. With the development of big data technology, the limitations of traditional methods in handling power equipment faults have become increasingly evident. This study applies sequence data mining with a CNN-BiLSTM-Attention model, improving the accuracy of fault diagnosis and the timeliness of prediction. The test results show that the method achieves relatively high accuracy in substation equipment fault diagnosis and prediction scenarios, although further improvement is still needed for the problem of imbalanced data.
Keywords: substation equipment operation and maintenance; fault diagnosis; CNN-BiLSTM-Attention model
15. Research on Euclidean Algorithm and Reflection on Its Teaching
Authors: ZHANG Shaohua. 《应用数学》 (北大核心), 2025, No. 1, pp. 308-310 (3 pages)
In this paper, we prove that Euclid's algorithm, Bezout's equation and the Division algorithm are equivalent to each other. Our result shows that Euclid had preliminarily established the theory of divisibility and the greatest common divisor. We further provide several suggestions for teaching.
Keywords: Euclid's algorithm; Division algorithm; Bezout's equation
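The equivalence discussed in the entry above is conveniently illustrated by the extended Euclidean algorithm, which applies the Division algorithm repeatedly and returns the Bezout coefficients along with the gcd. A short sketch in Python:

```python
def extended_gcd(a: int, b: int) -> tuple[int, int, int]:
    """Return (g, x, y) with g = gcd(a, b) and a*x + b*y = g (Bezout's equation)."""
    old_r, r = a, b
    old_x, x = 1, 0
    old_y, y = 0, 1
    while r != 0:
        q = old_r // r                      # one step of the Division algorithm
        old_r, r = r, old_r - q * r
        old_x, x = x, old_x - q * x
        old_y, y = y, old_y - q * y
    return old_r, old_x, old_y

if __name__ == "__main__":
    g, x, y = extended_gcd(240, 46)
    print(g, x, y, 240 * x + 46 * y)  # 2 -9 47 2
```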
16. DDoS Attack Autonomous Detection Model Based on Multi-Strategy Integrate Zebra Optimization Algorithm
Authors: Chunhui Li, Xiaoying Wang, Qingjie Zhang, Jiaye Liang, Aijing Zhang. 《Computers, Materials & Continua》 (SCIE, EI), 2025, No. 1, pp. 645-674 (30 pages)
Previous studies have shown that deep learning is very effective in detecting known attacks. However, when facing unknown attacks, models such as Deep Neural Networks (DNN) combined with Long Short-Term Memory (LSTM), Convolutional Neural Networks (CNN) combined with LSTM, and so on are built by simple stacking, which has the problems of feature loss, low efficiency, and low accuracy. Therefore, this paper proposes an autonomous detection model for Distributed Denial of Service attacks, Multi-Scale Convolutional Neural Network-Bidirectional Gated Recurrent Units-Single Headed Attention (MSCNN-BiGRU-SHA), which is based on a Multi-strategy Integrated Zebra Optimization Algorithm (MI-ZOA). The model undergoes training and testing with the CICDDoS2019 dataset, and its performance is evaluated on a new GINKS2023 dataset. The hyperparameters for Conv_filter and GRU_unit are optimized using the MI-ZOA. The experimental results show that the test accuracy of the MSCNN-BiGRU-SHA model based on the MI-ZOA proposed in this paper is as high as 0.9971 on the CICDDoS2019 dataset. The evaluation accuracy on the new dataset GINKS2023 created in this paper is 0.9386. Compared to the MSCNN-BiGRU-SHA model based on the Zebra Optimization Algorithm (ZOA), the detection accuracy on the GINKS2023 dataset has improved by 5.81%, precision has increased by 1.35%, recall has improved by 9%, and the F1 score has increased by 5.55%. Compared to the MSCNN-BiGRU-SHA models developed using Grid Search, Random Search, and Bayesian Optimization, the MSCNN-BiGRU-SHA model optimized with the MI-ZOA exhibits better performance in terms of accuracy, precision, recall, and F1 score.
Keywords: distributed denial of service attack; intrusion detection; deep learning; zebra optimization algorithm; multi-strategy integrated zebra optimization algorithm
17. Bearing capacity prediction of open caissons in two-layered clays using five tree-based machine learning algorithms (Cited: 1)
Authors: Rungroad Suppakul, Kongtawan Sangjinda, Wittaya Jitchaijaroen, Natakorn Phuksuksakul, Suraparb Keawsawasvong, Peem Nuaklong. 《Intelligent Geoengineering》, 2025, No. 2, pp. 55-65 (11 pages)
Open caissons are widely used in foundation engineering because of their load-bearing efficiency and adaptability in diverse soil conditions. However, accurately predicting their undrained bearing capacity in layered soils remains a complex challenge. This study presents a novel application of five ensemble machine learning (ML) algorithms (random forest (RF), gradient boosting machine (GBM), extreme gradient boosting (XGBoost), adaptive boosting (AdaBoost), and categorical boosting (CatBoost)) to predict the undrained bearing capacity factor (Nc) of circular open caissons embedded in two-layered clay on the basis of results from finite element limit analysis (FELA). The input dataset consists of 1188 numerical simulations using the Tresca failure criterion, varying in geometrical and soil parameters. The FELA was performed via OptumG2 software with adaptive meshing techniques and verified against existing benchmark studies. The ML models were trained on 70% of the dataset and tested on the remaining 30%. Their performance was evaluated using six statistical metrics: coefficient of determination (R²), mean absolute error (MAE), root mean squared error (RMSE), index of scatter (IOS), RMSE-to-standard deviation ratio (RSR), and variance explained factor (VAF). The results indicate that all the models achieved high accuracy, with R² values exceeding 97.6% and RMSE values below 0.02. Among them, AdaBoost and CatBoost consistently outperformed the other methods across both the training and testing datasets, demonstrating superior generalizability and robustness. The proposed ML framework offers an efficient, accurate, and data-driven alternative to traditional methods for estimating caisson capacity in stratified soils. This approach can aid in reducing computational costs while improving reliability in the early stages of foundation design.
Keywords: two-layered clay; open caisson; tree-based algorithms; FELA; machine learning
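A sketch of the tree-based regression workflow described above (70/30 train-test split plus R², MAE, and RMSE evaluation), assuming scikit-learn and a synthetic stand-in for the FELA dataset; only RF and GBM are shown, and all data values are made up rather than taken from the paper.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Hypothetical stand-in for the FELA-generated dataset: geometric/soil inputs
# and a bearing-capacity-factor target; shapes and coefficients are assumptions.
rng = np.random.default_rng(0)
X = rng.uniform(size=(1188, 5))
y = 3.0 + 2.5 * X[:, 0] - 1.2 * X[:, 1] + rng.normal(0, 0.05, 1188)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("RF", RandomForestRegressor(n_estimators=200, random_state=0)),
                    ("GBM", GradientBoostingRegressor(random_state=0))]:
    pred = model.fit(X_tr, y_tr).predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(name, round(r2_score(y_te, pred), 4),
          round(mean_absolute_error(y_te, pred), 4), round(rmse, 4))
```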
18. Path Planning for Thermal Power Plant Fan Inspection Robot Based on Improved A* Algorithm (Cited: 1)
Authors: Wei Zhang, Tingfeng Zhang. 《Journal of Electronic Research and Application》, 2025, No. 1, pp. 233-239 (7 pages)
To improve the efficiency and accuracy of path planning for fan inspection tasks in thermal power plants, this paper proposes an intelligent inspection robot path planning scheme based on an improved A* algorithm. The inspection robot utilizes multiple sensors to monitor key parameters of the fans, such as vibration, noise, and bearing temperature, and uploads the data to the monitoring center. The robot's inspection path employs the improved A* algorithm, incorporating obstacle penalty terms, path reconstruction, and smoothing optimization techniques, thereby achieving optimal path planning for the inspection robot in complex environments. Simulation results demonstrate that the improved A* algorithm significantly outperforms the traditional A* algorithm in terms of total path distance, smoothness, and detour rate, effectively improving the execution efficiency of inspection tasks.
Keywords: power plant fans; inspection robot; path planning; improved A* algorithm
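A generic grid-based A* with an added obstacle-penalty term is sketched below as an assumption of what such a variant can look like; it is not the exact cost formulation, path reconstruction, or smoothing step used in the cited paper.

```python
import heapq
import itertools

def a_star(grid, start, goal, obstacle_penalty=2.0):
    """Grid A* where steps adjacent to obstacles cost extra.

    grid: 2D list, 0 = free cell, 1 = blocked cell; start/goal: (row, col).
    Returns the path as a list of cells, or None if no path exists.
    """
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan-distance heuristic (admissible: every step costs >= 1)
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    def near_obstacle(p):
        r, c = p
        return any(0 <= r + dr < rows and 0 <= c + dc < cols and grid[r + dr][c + dc]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1))

    counter = itertools.count()                       # tie-breaker for the heap
    open_heap = [(h(start), next(counter), 0.0, start, None)]
    came_from, g_cost = {}, {start: 0.0}
    while open_heap:
        _, _, g, cur, parent = heapq.heappop(open_heap)
        if cur in came_from:
            continue
        came_from[cur] = parent
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if not (0 <= nxt[0] < rows and 0 <= nxt[1] < cols) or grid[nxt[0]][nxt[1]]:
                continue
            step = 1.0 + (obstacle_penalty if near_obstacle(nxt) else 0.0)
            ng = g + step
            if ng < g_cost.get(nxt, float("inf")):
                g_cost[nxt] = ng
                heapq.heappush(open_heap, (ng + h(nxt), next(counter), ng, nxt, cur))
    return None

if __name__ == "__main__":
    grid = [[0, 0, 0, 0],
            [0, 1, 1, 0],
            [0, 0, 0, 0]]
    print(a_star(grid, (0, 0), (2, 3)))
```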
19. An Algorithm for Cloud-based Web Service Combination Optimization Through Plant Growth Simulation
Authors: Li Qiang, Qin Huawei, Qiao Bingqin, Wu Ruifang. 《系统仿真学报》 (北大核心), 2025, No. 2, pp. 462-473 (12 pages)
In order to improve the efficiency of cloud-based web services, an improved plant growth simulation algorithm scheduling model is proposed. This model first used mathematical methods to describe the relationships between cloud-based web services and the constraints of system resources. Then, a light-induced plant growth simulation algorithm was established. The performance of the algorithm was compared across several plant types, and the best plant model was selected as the setting for the system. Experimental results show that when the number of test cloud-based web services reaches 2048, the model is 2.14 times faster than PSO, 2.8 times faster than the ant colony algorithm, 2.9 times faster than the bee colony algorithm, and a remarkable 8.38 times faster than the genetic algorithm.
Keywords: cloud-based service; scheduling algorithm; resource constraint; load optimization; cloud computing; plant growth simulation algorithm
20. Improved algorithm of multi-mainlobe interference suppression under uncorrelated and coherent conditions (Cited: 1)
Authors: CAI Miaohong, CHENG Qiang, MENG Jinli, ZHAO Dehua. 《Journal of Southeast University (English Edition)》, 2025, No. 1, pp. 84-90 (7 pages)
A new method based on the iterative adaptive algorithm (IAA) and blocking matrix preprocessing (BMP) is proposed to study the suppression of multi-mainlobe interference. The algorithm is applied to precisely estimate the spatial spectrum and the directions of arrival (DOA) of interferences to overcome the drawbacks associated with conventional adaptive beamforming (ABF) methods. The mainlobe interferences are identified by calculating the correlation coefficients between direction steering vectors (SVs) and rejected by the BMP pretreatment. Then, IAA is employed to reconstruct a sidelobe interference-plus-noise covariance matrix for preferable ABF and residual interference suppression. Simulation results demonstrate the excellence of the proposed method over normal methods based on BMP and eigen-projection matrix preprocessing (EMP) under both uncorrelated and coherent circumstances.
Keywords: mainlobe interference suppression; adaptive beamforming; spatial spectral estimation; iterative adaptive algorithm; blocking matrix preprocessing
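The entry above flags mainlobe interferences by the correlation coefficient between direction steering vectors. A minimal sketch of that test for a uniform linear array, assuming NumPy; the array size, element spacing, example DOAs, and the 0.5 decision threshold are illustrative assumptions.

```python
import numpy as np

def steering_vector(theta_deg: float, n_elements: int = 16,
                    d_over_lambda: float = 0.5) -> np.ndarray:
    """Steering vector of a uniform linear array for a plane wave from theta."""
    n = np.arange(n_elements)
    phase = 2j * np.pi * d_over_lambda * n * np.sin(np.deg2rad(theta_deg))
    return np.exp(phase) / np.sqrt(n_elements)

def correlation(a: np.ndarray, b: np.ndarray) -> float:
    """Magnitude of the normalized inner product between two steering vectors."""
    return float(abs(np.vdot(a, b)) / (np.linalg.norm(a) * np.linalg.norm(b)))

if __name__ == "__main__":
    look = steering_vector(0.0)                    # mainlobe look direction
    for doa in (3.0, 10.0, 40.0):                  # estimated interference DOAs
        rho = correlation(look, steering_vector(doa))
        kind = "mainlobe" if rho > 0.5 else "sidelobe"   # assumed threshold
        print(f"DOA {doa:5.1f} deg  correlation {rho:.2f}  -> {kind}")
```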