Journal Articles (282,678 articles found)
1. Prediction of the Wear Rate of SCR Catalysts in Cement Kilns Based on the PSO-SVR Algorithm
Authors: 印心如, 李明月, 吴越. 《价值工程》, 2026, Issue 1, pp. 10-12 (3 pages).
To accurately predict the wear rate of SCR catalysts in cement kilns, a PSO-SVR prediction method is proposed, in which particle swarm optimization is used to tune the parameters of a support vector regression model. Comparative results show that the PSO-SVR model outperforms the plain SVR model: its errors are smaller and its prediction accuracy is higher, so it can effectively predict the wear behavior of SCR denitrification catalysts exposed to cement kiln flue gas.
Keywords: cement kiln SCR; PSO-SVR algorithm; wear rate
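The PSO-SVR idea summarized in the entry above amounts to letting a particle swarm search the SVR hyperparameter space against a cross-validated error. A minimal sketch of that loop is given below, assuming scikit-learn is available; the data, swarm settings, and bounds are placeholders rather than anything from the paper.

```python
# Minimal PSO-over-SVR sketch (illustrative only; X, y are placeholder data,
# and the swarm settings are not taken from the paper).
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 5))                              # placeholder features
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=80)    # placeholder target

bounds = np.array([[1e-2, 1e2],    # C
                   [1e-3, 1e1],    # gamma
                   [1e-3, 1e0]])   # epsilon

def fitness(p):
    """Cross-validated MSE of an RBF-SVR with the given hyperparameters."""
    c, g, e = p
    model = SVR(kernel="rbf", C=c, gamma=g, epsilon=e)
    return -cross_val_score(model, X, y, cv=5,
                            scoring="neg_mean_squared_error").mean()

n_particles, n_iter, w, c1, c2 = 20, 30, 0.7, 1.5, 1.5
pos = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_particles, 3))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 3))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, bounds[:, 0], bounds[:, 1])
    vals = np.array([fitness(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best (C, gamma, epsilon):", gbest)
```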
2. Prediction of the Bearing Capacity of Steel Plate-Reinforced Concrete Composite Coupling Beams Based on the PSO-SVR Algorithm (Cited by 2)
Authors: 田建勃, 闫靖帅, 王晓磊, 赵勇, 史庆轩. 《振动与冲击》 (PKU Core), 2025, Issue 7, pp. 155-162 (8 pages).
To accurately predict the bearing capacity of steel plate-reinforced concrete composite (PRC) coupling beams, regression models were trained on PRC coupling beam test data using support vector regression (SVR), extreme gradient boosting (XGBoost), and particle swarm optimization-support vector regression (PSO-SVR). In addition, Sobol sensitivity analysis was used to quantify the influence of the data feature parameters on the bearing capacity. The results show that the mean absolute percentage errors of the SVR, XGBoost, and PSO-SVR models are 5.48%, 7.65%, and 4.80%, respectively; the PSO-SVR model therefore has the highest prediction accuracy as well as stronger robustness and generalization. The steel plate ratio (ρ_(p)), section height (h), and span-to-depth ratio (l_(n)/h) have the greatest influence on the bearing capacity, with their total-order sensitivity indices summing to more than 0.75. The steel plate ratio (ρ_(p)) is the single most influential factor, with first-order and total-order sensitivity indices of 0.3423 and 0.3620, respectively. These results are intended to provide a reference for the design and application of PRC coupling beams in practical engineering.
Keywords: steel plate-reinforced concrete composite coupling beam; machine learning; particle swarm optimization-support vector regression (PSO-SVR) algorithm; bearing capacity; sensitivity analysis
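The Sobol sensitivity step in the entry above can be outlined with the SALib package (an assumption; the paper does not name its tooling). The surrogate function and the three input ranges below are hypothetical stand-ins for the trained capacity model and the beam parameters ρ_p, h, and l_n/h.

```python
# Generic Sobol sensitivity sketch using SALib (the model and the three
# input ranges are placeholders, not the paper's PRC-beam surrogate).
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["rho_p", "h", "ln_over_h"],   # hypothetical feature names
    "bounds": [[0.01, 0.10], [400, 1000], [1.0, 4.0]],
}

def surrogate(x):
    """Stand-in for a trained capacity model: capacity = f(rho_p, h, ln/h)."""
    rho_p, h, ratio = x
    return 1000 * rho_p * h / ratio

# Saltelli sampling, model evaluation, and Sobol index estimation.
param_values = saltelli.sample(problem, 1024)
Y = np.array([surrogate(x) for x in param_values])
Si = sobol.analyze(problem, Y)

print("first-order indices:", Si["S1"])
print("total-order indices:", Si["ST"])
```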
3. Scenario Simulation of Carbon Peaking in the Yangtze River Delta Urban Agglomeration Based on a PSO-SVR Model
Authors: 吴冠岑, 章聪颖, 廖金星, 牛星. 《生态经济》 (PKU Core), 2025, Issue 7, pp. 41-48 (8 pages).
Based on data for the Yangtze River Delta urban agglomeration from 2006 to 2019, a PSO-SVR model combined with scenario analysis is used to forecast carbon emissions under eight scenarios, including a baseline scenario, an industrial optimization scenario, an intensive land use scenario, and an enhanced emission reduction scenario. The results show that: (1) the PSO-SVR model effectively improves prediction accuracy; (2) carbon emissions of the urban agglomeration over the forecast period follow a clear inverted U-shaped trajectory; (3) the region peaks in 2033 under the baseline scenario, in 2031 under the enhanced emission reduction scenario, and in 2032 under the other scenarios. The study provides a reference for achieving the "dual carbon" goals as early as possible while maintaining stable socioeconomic development in the Yangtze River Delta.
Keywords: carbon peaking; PSO-SVR model; scenario simulation; Yangtze River Delta urban agglomeration
4. Process Optimization of Cold-Formed Superalloy Bolts Based on PSO-SVR and NSGA-III
Authors: 王辉, 董万鹏, 何建丽, 张宇杰, 秦龙, 王志海, 陈飞. 《塑性工程学报》 (PKU Core), 2025, Issue 11, pp. 63-76 (14 pages).
Cold heading-extrusion and cold upsetting forming models were built in Deform and combined with a full-factorial experimental design. A composite evaluation index was constructed, a particle swarm optimization-support vector regression (PSO-SVR) model was used to predict the process responses, and the NSGA-III algorithm was applied to solve for the Pareto-optimal set, achieving global multi-objective optimization of the process parameters. The optimal parameter combination was determined as: heading-extrusion speed 8.96 mm·s^(-1), upper die fillet radius of the necking region 3.1 mm, lower die fillet radius 3.0 mm; upsetting speed 3.54 mm·s^(-1), fillet radius of the twelve-point feature in the upper die cavity 0.94 mm, and flange shoulder fillet radius 0.80 mm. The results show that, compared with cold-formed bolts of the same type, the uniformity coefficients of equivalent stress, equivalent strain, shear stress, and shear strain are reduced by 8.73%, 9.63%, 2.96%, and 10.5%, respectively; compared with hot-formed bolts of the same type, the uniformity coefficients of equivalent strain and shear strain are reduced by 14.04% and 32.68%.
Keywords: GH4169 alloy; cold forming; PSO-SVR; NSGA-III; uniformity
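As a rough illustration of the NSGA-III stage described above, the sketch below runs pymoo's NSGA-III on a toy two-variable, three-objective problem; the objective functions and bounds are placeholders for the PSO-SVR surrogates and process responses used in the paper.

```python
# NSGA-III sketch with pymoo on a toy 3-objective problem (objectives are
# placeholders for the surrogate models used in the paper).
import numpy as np
from pymoo.algorithms.moo.nsga3 import NSGA3
from pymoo.core.problem import ElementwiseProblem
from pymoo.optimize import minimize
from pymoo.util.ref_dirs import get_reference_directions

class ToyProcessProblem(ElementwiseProblem):
    def __init__(self):
        # two process variables, e.g. a speed and a fillet radius (illustrative)
        super().__init__(n_var=2, n_obj=3,
                         xl=np.array([1.0, 0.5]), xu=np.array([10.0, 5.0]))

    def _evaluate(self, x, out, *args, **kwargs):
        speed, radius = x
        f1 = speed ** 2 + radius            # stand-in for stress non-uniformity
        f2 = (speed - 5) ** 2 + radius ** 2 # stand-in for strain non-uniformity
        f3 = 1.0 / (speed * radius)         # stand-in for a productivity penalty
        out["F"] = [f1, f2, f3]

ref_dirs = get_reference_directions("das-dennis", 3, n_partitions=12)
algorithm = NSGA3(pop_size=92, ref_dirs=ref_dirs)
res = minimize(ToyProcessProblem(), algorithm, ("n_gen", 100), seed=1, verbose=False)
print("Pareto set size:", len(res.X))
```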
5. Multi-Dimensional Logistics Demand Forecasting Based on PSO-SVR: A Case Study of Dalian Port
Authors: 徐梓菲, 王宇楠, 张馨. 《珠江水运》, 2025, Issue 9, pp. 124-127 (4 pages).
As one of the most important transport hubs in Northeast China, Dalian Port strongly influences the development of port logistics in China, and forecasting the logistics demand behind its cargo throughput is a key part of its logistics development planning. In practice, however, logistics demand is affected by many factors, such as the size, structure, and trade volume of the hinterland economy, which gives the problem a multi-dimensional character. With the development of machine learning, a growing number of studies have applied neural network models to logistics demand forecasting, but a single neural network often performs poorly on multi-dimensional prediction tasks. This paper therefore proposes a PSO-SVR-based model for multi-dimensional logistics demand forecasting. Comparative experiments show that the model achieves a mean absolute error (MAE), root mean square error (RMSE), and mean absolute percentage error (MAPE) of 0.75, 0.75, and 0.839, respectively, all better than the GA-SVR model, so it can forecast the future logistics demand of Dalian Port well.
Keywords: multi-dimensional logistics demand forecasting; PSO-SVR model; Dalian Port
6. Method for Estimating the State of Health of Lithium-ion Batteries Based on Differential Thermal Voltammetry and Sparrow Search Algorithm-Elman Neural Network (Cited by 1)
Authors: Yu Zhang, Daoyu Zhang, Tiezhou Wu. 《Energy Engineering》 (EI), 2025, Issue 1, pp. 203-220 (18 pages).
Precisely estimating the state of health (SOH) of lithium-ion batteries is essential for battery management systems (BMS), as it plays a key role in ensuring the safe and reliable operation of battery systems. However, current SOH estimation methods often overlook the valuable temperature information that can effectively characterize battery aging during capacity degradation. Additionally, the Elman neural network, which is commonly employed for SOH estimation, exhibits several drawbacks, including slow training speed, a tendency to become trapped in local minima, and the initialization of weights and thresholds using pseudo-random numbers, leading to unstable model performance. To address these issues, this study proposes a method for estimating the SOH of lithium-ion batteries based on differential thermal voltammetry (DTV) and an SSA-Elman neural network. Firstly, two health features (HFs) considering temperature factors and battery voltage are extracted from the differential thermal voltammetry curves and incremental capacity curves. Next, the Sparrow Search Algorithm (SSA) is employed to optimize the initial weights and thresholds of the Elman neural network, forming the SSA-Elman neural network model. To validate the performance, various neural networks, including the proposed SSA-Elman network, are tested using the Oxford battery aging dataset. The experimental results demonstrate that the method developed in this study achieves superior accuracy and robustness, with a mean absolute error (MAE) of less than 0.9% and a root mean square error (RMSE) below 1.4%.
Keywords: lithium-ion battery; state of health; differential thermal voltammetry; Sparrow Search Algorithm
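Differential thermal voltammetry, as used in the entry above, is the derivative of cell temperature with respect to terminal voltage over a charge segment; a small numpy sketch under that reading is shown below, with synthetic voltage and temperature arrays standing in for real cycling data.

```python
# Minimal DTV-curve sketch: dT/dV over a charge segment (synthetic data;
# in the paper, health features are extracted from such curves).
import numpy as np

# Placeholder charge data: rising voltage and a synthetic temperature trace.
voltage = np.linspace(3.0, 4.2, 200)                     # V
temperature = 25 + 2.0 * np.sin((voltage - 3.0) * 4.0)   # degC, synthetic

# Differential thermal voltammetry curve: dT/dV as a function of voltage.
dtv = np.gradient(temperature, voltage)

# Simple health-feature candidates: peak magnitude and its voltage position.
peak_idx = int(np.argmax(np.abs(dtv)))
hf_peak_value = dtv[peak_idx]
hf_peak_voltage = voltage[peak_idx]
print(f"DTV peak {hf_peak_value:.3f} degC/V at {hf_peak_voltage:.3f} V")
```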
7. Robustness Optimization Algorithm with Multi-Granularity Integration for Scale-Free Networks Against Malicious Attacks (Cited by 1)
Authors: ZHANG Yiheng, LI Jinhai. 《昆明理工大学学报(自然科学版)》 (PKU Core), 2025, Issue 1, pp. 54-71 (18 pages).
Complex network models are frequently employed for simulating and studying diverse real-world complex systems. Among these models, scale-free networks typically exhibit greater fragility to malicious attacks. Consequently, enhancing the robustness of scale-free networks has become a pressing issue. To address this problem, this paper proposes a Multi-Granularity Integration Algorithm (MGIA), which aims to improve the robustness of scale-free networks while keeping the initial degree of each node unchanged, ensuring network connectivity and avoiding the generation of multiple edges. The algorithm generates a multi-granularity structure from the initial network to be optimized, then uses different optimization strategies to optimize the networks at the various granular layers in this structure, and finally realizes information exchange between the granular layers, thereby further enhancing the optimization effect. We propose new network refresh, crossover, and mutation operators to ensure that the optimized network satisfies the given constraints, together with new network similarity and dissimilarity evaluation metrics that improve the effectiveness of the optimization operators. In the experiments, the MGIA enhances the robustness of the scale-free network by 67.6%, approximately 17.2% higher than the optimization effects achieved by eight existing complex network robustness optimization algorithms.
Keywords: complex network model; multi-granularity; scale-free networks; robustness; algorithm integration
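For context on what is being optimized in the entry above, the sketch below computes a standard attack-robustness score R for a stand-in scale-free network (networkx, Barabási-Albert graph) and applies degree-preserving rewiring, the same constraint MGIA enforces; it is not the MGIA itself.

```python
# Attack-robustness R of a scale-free network plus degree-preserving
# rewiring (illustrative; this is not the paper's MGIA algorithm).
import networkx as nx

def robustness_R(G):
    """R = (1/N) * sum over sequential highest-degree removals of the
    largest-component fraction (Schneider-style attack robustness)."""
    H = G.copy()
    n = G.number_of_nodes()
    total = 0.0
    for _ in range(n - 1):
        node = max(H.degree, key=lambda kv: kv[1])[0]   # targeted attack
        H.remove_node(node)
        largest = max((len(c) for c in nx.connected_components(H)), default=0)
        total += largest / n
    return total / n

G = nx.barabasi_albert_graph(200, 2, seed=1)   # stand-in scale-free network
print("R before rewiring:", round(robustness_R(G), 4))

# double_edge_swap keeps every node's degree unchanged, which is the same
# degree-preservation constraint MGIA imposes on its operators.
G2 = G.copy()
nx.double_edge_swap(G2, nswap=100, max_tries=10000, seed=2)
print("R after random degree-preserving swaps:", round(robustness_R(G2), 4))
```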
8. Prediction of Head-End Lateral Deviation of Finishing-Mill Strip Based on PSO-SVR
Authors: 王合财, 张亮, 戴卓浩, 王晓晨, 何海楠. 《河北冶金》, 2025, Issue 8, pp. 49-54 (6 pages).
In hot continuous strip rolling, lateral deviation of the strip head at the exit of the finishing mill is a major problem affecting product quality and rolling stability. High-accuracy prediction of the deviation is a prerequisite for effective feedforward control and for reducing quality defects and production accidents, yet the accuracy of existing prediction methods often fails to meet control requirements. To address this, a PSO-SVR-based model for predicting the head-end deviation at the finishing-mill exit is proposed. The model inputs are set up along three dimensions: incoming-material factors, mill factors, and changes in rolling-process variables. A dataset of 2,000 valid samples collected on the industrial line was used to train and test the model. Experimental results show that the proposed PSO-SVR model reaches a prediction accuracy of 87%. To fully verify its effectiveness, its performance was compared with a conventional SVR model, a polynomial regression model, and an extreme gradient boosting (XGBoost) model; the comparisons consistently show that the PSO-SVR model is significantly more accurate. The model markedly improves the prediction accuracy of head-end deviation at the exit and provides reliable theoretical and data support for precise feedforward control on the production line.
Keywords: hot rolling; lateral deviation; prediction accuracy; PSO-SVR; kernel function
9. Short-Term Wind Power Forecast Based on STL-IAOA-iTransformer Algorithm: A Case Study in Northwest China (Cited by 2)
Authors: Zhaowei Yang, Bo Yang, Wenqi Liu, Miwei Li, Jiarong Wang, Lin Jiang, Yiyan Sang, Zhenning Pan. 《Energy Engineering》, 2025, Issue 2, pp. 405-430 (26 pages).
Accurate short-term wind power forecasting plays a crucial role in maintaining the safety and economic efficiency of smart grids. Although numerous studies have employed various methods to forecast wind power, there remains a research gap in leveraging swarm intelligence algorithms to optimize the hyperparameters of the Transformer model for wind power prediction. To improve the accuracy of short-term wind power forecasts, this paper proposes a hybrid approach named STL-IAOA-iTransformer, which is based on seasonal and trend decomposition using LOESS (STL) and an iTransformer model optimized by an improved arithmetic optimization algorithm (IAOA). First, to fully extract the power data features, STL is used to decompose the original data into components with less redundant information. The extracted components as well as the weather data are then input into iTransformer for short-term wind power forecasting, and the final predicted power curve is obtained by combining the predicted components. To improve the model accuracy, IAOA is employed to optimize the hyperparameters of iTransformer. The proposed approach is validated using real generation data from different seasons and different power stations in Northwest China, and ablation experiments have been conducted. Furthermore, to validate its superiority under different wind characteristics, real power generation data from Southwest China are also used. Comparative results with six other state-of-the-art prediction models show that the proposed model fits the true generation series well and achieves high prediction accuracy.
Keywords: short-term wind power forecast; improved arithmetic optimization algorithm; iTransformer algorithm; SimuNPS
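The decompose-forecast-recombine pipeline described above can be sketched with the STL implementation in statsmodels; the series below is synthetic, and a naive per-component forecaster stands in for the IAOA-tuned iTransformer.

```python
# STL decompose -> forecast each component -> recombine (synthetic series;
# a persistence/seasonal-naive step stands in for the iTransformer model).
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

rng = np.random.default_rng(0)
t = np.arange(24 * 30)                                    # 30 days, hourly
series = pd.Series(50 + 10 * np.sin(2 * np.pi * t / 24)   # daily cycle
                   + 0.05 * t                             # slow trend
                   + rng.normal(0, 2, t.size))

res = STL(series, period=24).fit()
trend, seasonal, resid = res.trend, res.seasonal, res.resid

horizon = 24
trend_fc = np.full(horizon, trend.iloc[-1])       # persistence on the trend
seasonal_fc = seasonal.iloc[-24:].to_numpy()      # seasonal-naive component
resid_fc = np.zeros(horizon)                      # residual treated as noise

forecast = trend_fc + seasonal_fc + resid_fc      # recombine the components
print(forecast[:5])
```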
10. A LODBO algorithm for multi-UAV search and rescue path planning in disaster areas (Cited by 1)
Authors: Liman Yang, Xiangyu Zhang, Zhiping Li, Lei Li, Yan Shi. 《Chinese Journal of Aeronautics》, 2025, Issue 2, pp. 200-213 (14 pages).
In disaster relief operations, multiple UAVs can be used to search for trapped people. In recent years, many researchers have proposed machine learning-based, sampling-based, and heuristic algorithms to solve the multi-UAV path planning problem. Among these, the Dung Beetle Optimization (DBO) algorithm has been widely applied owing to its diverse search patterns. However, the update strategies for the rolling and thieving dung beetles in DBO are overly simplistic, which may prevent full exploration of the search space and lead to convergence to local optima, so discovery of the optimal path is not guaranteed. To address these issues, we propose an improved DBO algorithm guided by the Landmark Operator (LODBO). Specifically, we first use tent mapping to initialize the population, which gives the algorithm initial solutions with greater diversity within the search space. Second, we expand the search range of the ball-rolling dung beetle using the landmark factor. Finally, an adaptive factor that changes with the number of iterations improves the global search ability of the thieving dung beetle, making it more likely to escape from local optima. Extensive simulation experiments show that, on the disaster search-and-rescue task set, the LODBO algorithm obtains the optimal path in the shortest time compared with the Genetic Algorithm (GA), the Grey Wolf Optimizer (GWO), the Whale Optimization Algorithm (WOA), and the original DBO algorithm.
Keywords: unmanned aerial vehicle; path planning; metaheuristic algorithm; DBO algorithm; NP-hard problems
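The tent-map population initialization mentioned above is self-contained enough to show directly; the sketch below maps a chaotic tent sequence onto the decision-variable bounds. The dimensions and bounds are chosen only for illustration, and this is just the initialization step, not the full LODBO optimizer.

```python
# Tent-map chaotic initialization of a metaheuristic population (bounds and
# sizes are illustrative; only the initialization step of LODBO-like methods).
import numpy as np

def tent_map_population(n_agents, dim, lower, upper, mu=1.99, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.random(dim)                       # one chaotic state per dimension
    pop = np.empty((n_agents, dim))
    for i in range(n_agents):
        # Tent map update: x <- mu*x if x < 0.5 else mu*(1 - x)
        x = np.where(x < 0.5, mu * x, mu * (1.0 - x))
        pop[i] = lower + x * (upper - lower)  # scale chaos into the search box
    return pop

lower = np.array([0.0, 0.0, 0.0])
upper = np.array([100.0, 100.0, 50.0])
pop = tent_map_population(n_agents=30, dim=3, lower=lower, upper=upper)
print(pop[:3])
```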
11. Prediction of Laser Bending of Structural Steel Plates Based on an IPSO-SVR Algorithm
Authors: 李文翠. 《山西冶金》, 2025, Issue 2, pp. 168-169, 172 (3 pages).
To improve the laser bending quality of structural steel plates used in high-rise buildings, an improved particle swarm optimization (PSO) algorithm is used to optimize a support vector regression (SVR) model, and the laser bending process parameters are jointly optimized to obtain the combination that best achieves the required bending angle. The results show that, for both laser scanning speed and laser power, the predictions of the trained model are very close to the true sample values, confirming the model's accuracy. The relative prediction error is 0.29%-7.41% for scanning speed and 1.26%-5.13% for power, so the model predicts laser bending behavior well. The study helps improve the forming efficiency of steel plates and has good practical value.
Keywords: structural steel plate; laser bending; IPSO-SVR algorithm; prediction error
12. Research on Euclidean Algorithm and Reflection on Its Teaching
Authors: ZHANG Shaohua. 《应用数学》 (PKU Core), 2025, Issue 1, pp. 308-310 (3 pages).
In this paper, we prove that Euclid's algorithm, Bezout's equation, and the division algorithm are equivalent to each other. Our result shows that Euclid had preliminarily established the theory of divisibility and the greatest common divisor. We further provide several suggestions for teaching.
Keywords: Euclid's algorithm; division algorithm; Bezout's equation
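The equivalence discussed above is easiest to see through the extended Euclidean algorithm, which runs Euclid's division steps and produces the Bezout coefficients along the way; a standard implementation follows.

```python
# Extended Euclidean algorithm: returns (g, s, t) with g = gcd(a, b) and
# s*a + t*b = g, i.e. the Bezout identity obtained from Euclid's algorithm.
def extended_gcd(a, b):
    old_r, r = a, b
    old_s, s = 1, 0
    old_t, t = 0, 1
    while r != 0:
        q = old_r // r                       # one division step
        old_r, r = r, old_r - q * r          # Euclid's remainder sequence
        old_s, s = s, old_s - q * s          # Bezout coefficient for a
        old_t, t = t, old_t - q * t          # Bezout coefficient for b
    return old_r, old_s, old_t

g, s, t = extended_gcd(240, 46)
assert g == 2 and s * 240 + t * 46 == g
print(g, s, t)   # 2 -9 47
```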
13. DDoS Attack Autonomous Detection Model Based on Multi-Strategy Integrated Zebra Optimization Algorithm
Authors: Chunhui Li, Xiaoying Wang, Qingjie Zhang, Jiaye Liang, Aijing Zhang. 《Computers, Materials & Continua》 (SCIE, EI), 2025, Issue 1, pp. 645-674 (30 pages).
Previous studies have shown that deep learning is very effective in detecting known attacks. However, when facing unknown attacks, models such as Deep Neural Networks (DNN) combined with Long Short-Term Memory (LSTM) or Convolutional Neural Networks (CNN) combined with LSTM are built by simple stacking, which leads to feature loss, low efficiency, and low accuracy. Therefore, this paper proposes an autonomous detection model for Distributed Denial of Service (DDoS) attacks, Multi-Scale Convolutional Neural Network-Bidirectional Gated Recurrent Units-Single Headed Attention (MSCNN-BiGRU-SHA), which is based on a Multi-strategy Integrated Zebra Optimization Algorithm (MI-ZOA). The model is trained and tested with the CICDDoS2019 dataset, and its performance is evaluated on a new GINKS2023 dataset. The hyperparameters for Conv_filter and GRU_unit are optimized using the MI-ZOA. The experimental results show that the test accuracy of the proposed MSCNN-BiGRU-SHA model based on the MI-ZOA reaches 0.9971 on the CICDDoS2019 dataset and 0.9386 on the new GINKS2023 dataset created in this paper. Compared to the MSCNN-BiGRU-SHA model based on the original Zebra Optimization Algorithm (ZOA), the detection accuracy on GINKS2023 improves by 5.81%, precision by 1.35%, recall by 9%, and the F1 score by 5.55%. Compared to MSCNN-BiGRU-SHA models tuned with Grid Search, Random Search, and Bayesian Optimization, the model optimized with the MI-ZOA performs better in terms of accuracy, precision, recall, and F1 score.
Keywords: distributed denial of service attack; intrusion detection; deep learning; zebra optimization algorithm; multi-strategy integrated zebra optimization algorithm
14. Bearing capacity prediction of open caissons in two-layered clays using five tree-based machine learning algorithms (Cited by 1)
Authors: Rungroad Suppakul, Kongtawan Sangjinda, Wittaya Jitchaijaroen, Natakorn Phuksuksakul, Suraparb Keawsawasvong, Peem Nuaklong. 《Intelligent Geoengineering》, 2025, Issue 2, pp. 55-65 (11 pages).
Open caissons are widely used in foundation engineering because of their load-bearing efficiency and adaptability to diverse soil conditions. However, accurately predicting their undrained bearing capacity in layered soils remains a complex challenge. This study presents a novel application of five ensemble machine learning (ML) algorithms, namely random forest (RF), gradient boosting machine (GBM), extreme gradient boosting (XGBoost), adaptive boosting (AdaBoost), and categorical boosting (CatBoost), to predict the undrained bearing capacity factor (Nc) of circular open caissons embedded in two-layered clay on the basis of results from finite element limit analysis (FELA). The input dataset consists of 1,188 numerical simulations using the Tresca failure criterion, with varying geometrical and soil parameters. The FELA was performed with OptumG2 software using adaptive meshing techniques and verified against existing benchmark studies. The ML models were trained on 70% of the dataset and tested on the remaining 30%. Their performance was evaluated using six statistical metrics: coefficient of determination (R²), mean absolute error (MAE), root mean squared error (RMSE), index of scatter (IOS), RMSE-to-standard deviation ratio (RSR), and variance explained factor (VAF). The results indicate that all the models achieve high accuracy, with R² values exceeding 97.6% and RMSE values below 0.02. Among them, AdaBoost and CatBoost consistently outperform the other methods on both the training and testing datasets, demonstrating superior generalizability and robustness. The proposed ML framework offers an efficient, accurate, and data-driven alternative to traditional methods for estimating caisson capacity in stratified soils, reducing computational cost while improving reliability in the early stages of foundation design.
Keywords: two-layered clay; open caisson; tree-based algorithms; FELA; machine learning
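The model-comparison workflow in the entry above (train several tree ensembles on a 70/30 split and score them with common error metrics) can be outlined with scikit-learn alone, as below; XGBoost and CatBoost follow the same fit/predict pattern and are omitted only to keep the sketch dependency-light. The data are synthetic stand-ins for the FELA-derived Nc table.

```python
# Compare tree-based ensemble regressors on a 70/30 split (synthetic data
# standing in for the FELA-generated bearing-capacity dataset).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import (AdaBoostRegressor, GradientBoostingRegressor,
                              RandomForestRegressor)
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=1188, n_features=6, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "RF": RandomForestRegressor(n_estimators=300, random_state=0),
    "GBM": GradientBoostingRegressor(random_state=0),
    "AdaBoost": AdaBoostRegressor(random_state=0),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name:9s} R2={r2_score(y_te, pred):.4f} "
          f"MAE={mean_absolute_error(y_te, pred):.3f} RMSE={rmse:.3f}")
```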
15. Path Planning for Thermal Power Plant Fan Inspection Robot Based on Improved A* Algorithm (Cited by 1)
Authors: Wei Zhang, Tingfeng Zhang. 《Journal of Electronic Research and Application》, 2025, Issue 1, pp. 233-239 (7 pages).
To improve the efficiency and accuracy of path planning for fan inspection tasks in thermal power plants, this paper proposes a path planning scheme for an intelligent inspection robot based on an improved A* algorithm. The inspection robot uses multiple sensors to monitor key parameters of the fans, such as vibration, noise, and bearing temperature, and uploads the data to the monitoring center. The robot's inspection path is planned with the improved A* algorithm, which incorporates obstacle penalty terms, path reconstruction, and smoothing optimization, thereby achieving optimal path planning in complex environments. Simulation results demonstrate that the improved A* algorithm significantly outperforms the traditional A* algorithm in total path length, smoothness, and detour rate, effectively improving the execution efficiency of inspection tasks.
Keywords: power plant fans; inspection robot; path planning; improved A* algorithm
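The obstacle-penalty idea in the entry above can be illustrated with a small grid A* in which cells adjacent to obstacles carry an extra traversal cost, pushing paths away from walls; the grid, costs, and heuristic below are illustrative choices, not the paper's formulation.

```python
# Grid A* with an obstacle-proximity penalty (illustrative grid and weights).
import heapq

GRID = ["..........",
        "..####....",
        "..........",
        "....####..",
        ".........."]
ROWS, COLS = len(GRID), len(GRID[0])

def penalty(r, c):
    """Extra cost for cells next to an obstacle, to keep paths off the walls."""
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            rr, cc = r + dr, c + dc
            if 0 <= rr < ROWS and 0 <= cc < COLS and GRID[rr][cc] == "#":
                return 2.0
    return 0.0

def astar(start, goal):
    open_set = [(0.0, start)]
    g = {start: 0.0}
    came = {}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]
            while cur in came:
                cur = came[cur]
                path.append(cur)
            return path[::-1]
        r, c = cur
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if not (0 <= nr < ROWS and 0 <= nc < COLS) or GRID[nr][nc] == "#":
                continue
            ng = g[cur] + 1.0 + penalty(nr, nc)            # step cost + penalty
            if ng < g.get((nr, nc), float("inf")):
                g[(nr, nc)] = ng
                came[(nr, nc)] = cur
                h = abs(nr - goal[0]) + abs(nc - goal[1])  # Manhattan heuristic
                heapq.heappush(open_set, (ng + h, (nr, nc)))
    return None

print(astar((0, 0), (4, 9)))
```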
16. Rapid pathologic grading-based diagnosis of esophageal squamous cell carcinoma via Raman spectroscopy and a deep learning algorithm (Cited by 1)
Authors: Xin-Ying Yu, Jian Chen, Lian-Yu Li, Feng-En Chen, Qiang He. 《World Journal of Gastroenterology》, 2025, Issue 14, pp. 32-46 (15 pages).
BACKGROUND: Esophageal squamous cell carcinoma is a major histological subtype of esophageal cancer, and many molecular genetic changes are associated with its occurrence. Raman spectroscopy has become a new method for the early diagnosis of tumors because it can reflect the structures of substances and their changes at the molecular level. AIM: To detect alterations in Raman spectral information across different stages of esophageal neoplasia. METHODS: Esophageal lesions of different grades were collected, yielding a total of 360 groups of Raman spectral data. A 1D-transformer network model was proposed to classify the spectral data of esophageal squamous cell carcinoma. In addition, a deep learning model was applied to visualize the Raman spectral data and interpret their molecular characteristics. RESULTS: Comparison of the Raman spectra across pathological grades and visual analysis revealed that the Raman peaks with significant differences were concentrated mainly at 1095 cm^(-1) (DNA, symmetric PO stretching vibration), 1132 cm^(-1) (cytochrome c), 1171 cm^(-1) (acetoacetate), 1216 cm^(-1) (amide III), and 1315 cm^(-1) (glycerol). Among the trained models, the 1D-transformer network performed best, achieving 93.30% accuracy, 96.65% specificity, 93.30% sensitivity, and an F1 score of 93.17%. CONCLUSION: Raman spectroscopy revealed significantly different waveforms at different stages of esophageal neoplasia, and combining Raman spectroscopy with deep learning methods can significantly improve classification accuracy.
Keywords: Raman spectroscopy; esophageal neoplasia; early diagnosis; deep learning algorithm; rapid pathologic grading
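As a rough picture of what a 1D-transformer classifier over spectra can look like, the PyTorch sketch below embeds each spectral point, passes the sequence through a standard Transformer encoder, and mean-pools for classification; the layer sizes and the three-class setup are assumptions, not the authors' architecture.

```python
# Minimal 1D Transformer classifier over spectra (PyTorch; sizes and the
# 3-class setup are illustrative, not the paper's exact architecture).
import torch
import torch.nn as nn

class Spectrum1DTransformer(nn.Module):
    def __init__(self, seq_len=500, d_model=64, n_heads=4, n_layers=2, n_classes=3):
        super().__init__()
        self.embed = nn.Linear(1, d_model)                          # per-point embedding
        self.pos = nn.Parameter(torch.zeros(1, seq_len, d_model))   # learned positions
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                  # x: (batch, seq_len) raw intensities
        h = self.embed(x.unsqueeze(-1)) + self.pos
        h = self.encoder(h)
        return self.head(h.mean(dim=1))    # mean-pool over the spectral axis

model = Spectrum1DTransformer()
spectra = torch.randn(8, 500)              # 8 fake Raman spectra, 500 points each
logits = model(spectra)
print(logits.shape)                        # torch.Size([8, 3])
```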
17. An Algorithm for Cloud-based Web Service Combination Optimization Through Plant Growth Simulation
Authors: Li Qiang, Qin Huawei, Qiao Bingqin, Wu Ruifang. 《系统仿真学报》 (PKU Core), 2025, Issue 2, pp. 462-473 (12 pages).
In order to improve the efficiency of cloud-based web services, an improved plant growth simulation algorithm scheduling model is proposed. The model first uses mathematical methods to describe the relationships between cloud-based web services and the constraints of system resources. Then, a light-induced plant growth simulation algorithm is established. The performance of the algorithm is compared across several plant types, and the best plant model is selected as the setting for the system. Experimental results show that when the number of test cloud-based web services reaches 2,048, the model is 2.14 times faster than PSO, 2.8 times faster than the ant colony algorithm, 2.9 times faster than the bee colony algorithm, and a remarkable 8.38 times faster than the genetic algorithm.
Keywords: cloud-based service; scheduling algorithm; resource constraint; load optimization; cloud computing; plant growth simulation algorithm
18. Improved algorithm of multi-mainlobe interference suppression under uncorrelated and coherent conditions (Cited by 1)
Authors: CAI Miaohong, CHENG Qiang, MENG Jinli, ZHAO Dehua. 《Journal of Southeast University (English Edition)》, 2025, Issue 1, pp. 84-90 (7 pages).
A new method based on the iterative adaptive algorithm (IAA) and blocking matrix preprocessing (BMP) is proposed to suppress multi-mainlobe interference. The algorithm precisely estimates the spatial spectrum and the directions of arrival (DOA) of the interferences to overcome the drawbacks of conventional adaptive beamforming (ABF) methods. Mainlobe interferences are identified by calculating the correlation coefficients between direction steering vectors (SVs) and are rejected by the BMP pretreatment. IAA is then employed to reconstruct a sidelobe interference-plus-noise covariance matrix for improved ABF and residual interference suppression. Simulation results demonstrate that the proposed method outperforms standard methods based on BMP and eigen-projection matrix preprocessing (EMP) under both uncorrelated and coherent conditions.
Keywords: mainlobe interference suppression; adaptive beamforming; spatial spectral estimation; iterative adaptive algorithm; blocking matrix preprocessing
19. Intelligent sequential multi-impulse collision avoidance method for non-cooperative spacecraft based on an improved search tree algorithm (Cited by 1)
Authors: Xuyang CAO, Xin NING, Zheng WANG, Suyi LIU, Fei CHENG, Wenlong LI, Xiaobin LIAN. 《Chinese Journal of Aeronautics》, 2025, Issue 4, pp. 378-393 (16 pages).
The problem of collision avoidance with non-cooperative targets has received significant attention from researchers in recent years. Non-cooperative targets exhibit uncertain states and unpredictable behaviors, making collision avoidance significantly more challenging than avoiding space debris. Much existing research focuses on the continuous-thrust model, whereas the impulsive maneuver model is more appropriate for long-duration, long-distance avoidance missions. It is also important to minimize the impact on the original mission while avoiding non-cooperative targets. Furthermore, existing avoidance algorithms are computationally complex and time-consuming, especially given the limited computing capability of on-board computers, which poses challenges for practical engineering applications. To overcome these difficulties, this paper makes the following key contributions: (A) a turn-based (sequential decision-making) limited-area impulsive collision avoidance model that accounts for the time delay of precision orbit determination is established for the first time; (B) a novel Selection Probability Learning Adaptive Search-depth Search Tree (SPL-ASST) algorithm is proposed for non-cooperative target avoidance, which improves decision-making efficiency by introducing an adaptive search-depth mechanism and a neural network into the traditional Monte Carlo Tree Search (MCTS). Numerical simulations confirm the effectiveness and efficiency of the proposed method.
Keywords: non-cooperative target; collision avoidance; limited motion area; impulsive maneuver model; search tree algorithm; neural networks
20. A Class of Parallel Algorithm for Solving Low-rank Tensor Completion
Authors: LIU Tingyan, WEN Ruiping. 《应用数学》 (PKU Core), 2025, Issue 4, pp. 1134-1144 (11 pages).
In this paper, we establish a class of parallel algorithms for solving the low-rank tensor completion problem. The main idea is that N singular value decompositions are carried out in N different processors, one for each slice matrix obtained by the unfold operator, and the fold operator is then used to form the next iteration tensor, so that the computing time is reduced. We analyze the global convergence of the algorithm, and numerical experiments are carried out on simulated data and on real image inpainting. The experimental results show that the parallel algorithm outperforms its original sequential version in CPU time at the same precision.
Keywords: tensor completion; low-rank; convergence; parallel algorithm
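The unfold/SVD/fold iteration described above can be sketched in numpy: the version below applies singular value thresholding to each mode unfolding of a partially observed 3-way tensor, refolds and averages the results, and re-imposes the observed entries. This is a SiLRTC-flavoured illustration under stated assumptions, not the authors' exact scheme; the per-mode SVD calls are where the paper's parallelism would go, omitted here for brevity.

```python
# Per-mode SVT on unfoldings of a partially observed 3-way tensor
# (illustrative; each SVT call could run in its own processor).
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a 3-way tensor into a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of unfold for the given full tensor shape."""
    full = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape(full), 0, mode)

def svt(M, tau):
    """Singular value thresholding on one unfolded slice matrix."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return (U * s) @ Vt

def complete(T_obs, mask, tau=1.0, n_iter=100):
    X = T_obs.copy()
    for _ in range(n_iter):
        # One SVT per mode unfolding; the parallel variant would dispatch
        # these three calls to different processors.
        folded = [fold(svt(unfold(X, m), tau), m, X.shape) for m in range(3)]
        X = sum(folded) / 3.0
        X[mask] = T_obs[mask]            # keep observed entries fixed
    return X

rng = np.random.default_rng(0)
low_rank = rng.normal(size=(20, 3)) @ rng.normal(size=(3, 20))
truth = np.stack([low_rank * (1 + 0.1 * k) for k in range(10)], axis=2)
mask = rng.random(truth.shape) < 0.5     # 50% observed entries
T_obs = np.where(mask, truth, 0.0)

X_hat = complete(T_obs, mask)
err = np.linalg.norm((X_hat - truth)[~mask]) / np.linalg.norm(truth[~mask])
print(f"relative error on missing entries: {err:.3f}")
```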