Abstract
This paper proposes a comprehensive, fully data-driven prediction framework based on machine learning to investigate the lag synchronization phenomenon in coupled chaotic systems, particularly in cases where accurate mathematical models are difficult to establish or the system equations are unknown. A Long Short-Term Memory (LSTM) neural network is trained on time series acquired from the desynchronized system states and is then used to predict the lag synchronization transition. In the experiments, we focus on Lorenz systems with time-varying delayed coupling and study the effects of the coupling coefficient and the time delay on lag synchronization, respectively. The results indicate that, with appropriate training, the machine learning model can effectively predict the occurrence of lag synchronization and its transition. This study not only enhances our understanding of synchronization behaviors in complex networks but also underscores the potential and practical value of machine learning for exploring nonlinear dynamical systems.
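For illustration only, the following minimal Python/NumPy sketch simulates a drive-response pair of Lorenz oscillators with delayed diffusive coupling and measures the lag-synchronization error ||y(t) - x(t - tau)||. The coupling form, parameter names (eps, tau), and their values are assumptions made for this sketch, not taken from the paper; the paper's time-varying delay and LSTM predictor are not reproduced here.

import numpy as np

# Standard Lorenz parameters.
SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0

def lorenz(state):
    x, y, z = state
    return np.array([SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z])

def simulate(eps=5.0, tau=0.5, dt=0.001, t_end=50.0):
    """Explicit Euler integration of a drive-response Lorenz pair with a
    constant delay tau, kept in a history buffer (illustrative setup)."""
    n = int(t_end / dt)
    d = int(tau / dt)                      # delay expressed in time steps
    drive = np.zeros((n, 3))
    resp = np.zeros((n, 3))
    drive[0] = np.array([1.0, 1.0, 1.0])
    resp[0] = np.array([-5.0, 3.0, 20.0])
    for k in range(n - 1):
        xd = drive[max(k - d, 0)]          # delayed drive state x(t - tau); constant history before t = tau
        coupling = eps * (xd - resp[k])    # diffusive coupling on all three variables (assumed form)
        drive[k + 1] = drive[k] + dt * lorenz(drive[k])
        resp[k + 1] = resp[k] + dt * (lorenz(resp[k]) + coupling)
    return drive, resp, d

def lag_sync_error(drive, resp, d):
    # Mean of ||y(t) - x(t - tau)|| over the second half of the run;
    # a value near zero indicates lag synchronization with lag tau.
    m = len(drive) // 2
    diff = resp[m:] - drive[m - d:len(drive) - d]
    return np.mean(np.linalg.norm(diff, axis=1))

drive, resp, d = simulate()
print("lag-sync error:", lag_sync_error(drive, resp, d))

Sweeping eps or tau in such a sketch and recording the resulting error is one simple way to visualize the kind of synchronization transition that the proposed LSTM framework is trained to predict without access to the equations.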
Source
《数学理论与应用》
2025, No. 1, pp. 115-126 (12 pages in total)
Mathematical Theory and Applications
Funding
Supported by the National Natural Science Foundation of China (No. 52174184).