
Application and Optimization of Improved Transformer-LSTM Model in Traffic Flow Prediction
Cited by: 1
Abstract: To address the limitations of existing methods in capturing the complex spatiotemporal features of traffic flow, this study proposes an improved Transformer long short-term memory (LSTM) fusion model aimed at enhancing the prediction accuracy of highway traffic flow. By combining the long-term dependency modeling capability of the LSTM with the global self-attention mechanism of the Transformer, the model uses a gated residual network to filter key features, dynamic positional encoding to enhance temporal perception, and a masking mechanism to optimize multi-head attention computation, thereby effectively reducing model complexity. Based on monitoring data from the Changshen Expressway, comparative experiments were conducted against baseline models including HA, ARIMA, LSTM, GRN, the traditional Transformer, and GRN-Transformer. The results show that, across the 6-step, 12-step, and 24-step prediction tasks, the proposed model achieves significantly lower mean absolute error (MAE) and root mean square error (RMSE) than all other models. Specifically, for the 24-step prediction, the MAE is reduced by 7.7% compared with the Transformer model, and the RMSE is decreased by 7.2% relative to the GRN-Transformer model, validating the model's superiority in capturing long-term spatiotemporal dependencies and dynamic features. The proposed approach provides a high-precision solution for traffic flow prediction in intelligent transportation systems and can support real-time traffic management and decision-making optimization.
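The abstract describes the architecture (gated residual feature filtering, dynamic positional encoding, masked multi-head attention, LSTM fusion) but does not include code. Below is a minimal PyTorch sketch of how such a pipeline could be wired together. All class names, layer sizes, and the learnable-embedding form of the "dynamic" positional encoding are assumptions made for illustration; this is not the authors' implementation.

```python
import torch
import torch.nn as nn


class GatedResidualNetwork(nn.Module):
    """Gated residual unit: a sigmoid gate filters the transformed features,
    and a residual connection preserves the raw signal (assumed form)."""
    def __init__(self, d_model: int, dropout: float = 0.1):
        super().__init__()
        self.fc1 = nn.Linear(d_model, d_model)
        self.fc2 = nn.Linear(d_model, d_model)
        self.gate = nn.Linear(d_model, d_model)
        self.dropout = nn.Dropout(dropout)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = torch.relu(self.fc1(x))
        h = self.dropout(self.fc2(h))
        g = torch.sigmoid(self.gate(x))   # gate decides how much of h passes through
        return self.norm(x + g * h)       # residual path keeps the original features


class TransformerLSTMPredictor(nn.Module):
    """Hypothetical Transformer-LSTM fusion: GRN feature filtering ->
    learnable positional encoding -> masked multi-head self-attention ->
    LSTM -> multi-step regression head."""
    def __init__(self, n_features: int, d_model: int = 64, n_heads: int = 4,
                 horizon: int = 24, max_len: int = 288):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)
        self.grn = GatedResidualNetwork(d_model)
        # "Dynamic" positional encoding approximated as a learnable per-step embedding.
        self.pos_emb = nn.Embedding(max_len, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.lstm = nn.LSTM(d_model, d_model, batch_first=True)
        self.head = nn.Linear(d_model, horizon)  # predicts the next `horizon` flow values

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features) of historical flow observations
        b, t, _ = x.shape
        h = self.grn(self.input_proj(x))
        h = h + self.pos_emb(torch.arange(t, device=x.device))
        # Causal mask: each step attends only to itself and earlier steps.
        mask = torch.triu(torch.ones(t, t, dtype=torch.bool, device=x.device), diagonal=1)
        h, _ = self.attn(h, h, h, attn_mask=mask)
        h, _ = self.lstm(h)
        return self.head(h[:, -1])        # (batch, horizon)


if __name__ == "__main__":
    model = TransformerLSTMPredictor(n_features=3, horizon=24)
    demo = torch.randn(8, 96, 3)          # 8 sequences, 96 past steps, 3 sensor features
    print(model(demo).shape)               # torch.Size([8, 24])
```

A training loop for this sketch would fit sliding windows of the expressway flow series with an L1 or MSE loss, which corresponds directly to the MAE and RMSE metrics reported in the abstract.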
Authors: 刘海悦 (LIU Haiyue); 伍添龙 (WU Tianlong); 毛自森 (MAO Zisen) (Army Engineering University of PLA, Nanjing 210007, China)
Affiliation: Army Engineering University of PLA (陆军工程大学)
Source: Journal of Army Engineering University of PLA (《陆军工程大学学报》), 2025, No. 4, pp. 80-87 (8 pages)
Keywords: traffic flow prediction; long short-term memory (LSTM) network; Transformer; gated residual network; positional encoding
Related literature: References: 5; Secondary references: 22; Co-cited references: 79; Co-citing documents: 12; Citing documents: 1
