Tunnel surrounding rock (TSR) deformation exhibits time- and space-dependent behavior, making it challenging for a single prediction model to capture these characteristics over extended periods. Utilizing 8 years of TSR deformation data from the Beishan exploration tunnel (BET) test platform, the metaheuristic crested porcupine optimizer (CPO) was applied for the first time to optimize the time series of TSR deformation, and an integrated model incorporating a convolutional neural network (CNN), a long short-term memory network (LSTM), and an attention mechanism (ATT) was proposed. This model combines the strong feature-extraction capability of the CNN, the superior sequence-prediction performance of the LSTM, and the selective weighting of the attention mechanism. The results show that during blasting excavation, the internal displacement of the TSR exhibits a stepwise change pattern. After excavation, the internal displacement enters a phase of gradual increase, ultimately reaching a stable convergence stage. The CPO-CNN-LSTM-ATT (CPO-CLA) integrated model demonstrated excellent predictive accuracy and stability across various evaluation metrics, achieving a coefficient of determination (R^2) of 0.985. Compared to the CNN-LSTM-ATT (CLA) model, the CPO-CLA model showed a 14.1% increase in R^2, a 61.5% decrease in root mean square error (RMSE), and a 72.9% decrease in mean absolute error (MAE). Compared with current mainstream metaheuristic integrated models, the CPO-CLA model is better suited to predicting long-term TSR deformation: it offers high computational efficiency, accurate predictions, and good scalability to large datasets.
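For context, the evaluation metrics quoted above have standard definitions; a minimal, library-free sketch (not the authors' implementation) of R^2, RMSE, and MAE is:

```python
import math

def regression_metrics(y_true, y_pred):
    """Compute R^2, RMSE, and MAE for a set of predictions."""
    n = len(y_true)
    mean_y = sum(y_true) / n
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    r2 = 1.0 - ss_res / ss_tot          # coefficient of determination
    rmse = math.sqrt(ss_res / n)        # root mean square error
    mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n
    return r2, rmse, mae
```

A perfect prediction yields R^2 = 1 with RMSE and MAE both zero, which is the reference point for the percentage improvements reported above.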
As a category of recurrent neural networks, echo state networks (ESNs) have been the subject of in-depth investigation and extensive application in a diverse array of fields, with notable successes. Nevertheless, the traditional ESN and the majority of its variants are designed around the second-order statistical information of data (e.g., variance and covariance), while other information is neglected. In the context of information theoretic learning, correntropy can capture more information from data. Therefore, under the maximum correntropy criterion, this paper proposes a correntropy-based echo state network (CESN) in which first-order and higher-order information of the data is captured, promoting robustness to noise. Furthermore, an incremental learning algorithm for the CESN is presented, which updates the CESN when new data arrive, eliminating the need to retrain the network from scratch. Finally, experiments on benchmark problems and comparisons with existing works verify the effectiveness and superiority of the proposed CESN.
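Correntropy has a simple sample estimator; assuming the commonly used Gaussian kernel (the abstract does not fix the kernel), a sketch is:

```python
import math

def correntropy(x, y, sigma=1.0):
    """Sample estimate of correntropy with a Gaussian kernel:
    V(X, Y) ~= (1/N) * sum_i exp(-(x_i - y_i)^2 / (2 * sigma^2)).
    Unlike variance/covariance, this weights errors through the kernel,
    which is what gives robustness to large outliers."""
    return sum(math.exp(-(a - b) ** 2 / (2.0 * sigma ** 2))
               for a, b in zip(x, y)) / len(x)
```

Identical signals attain the maximum value 1; gross outliers saturate the kernel rather than dominating the criterion, in contrast to a squared-error objective.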
Time series prediction has been successfully used in several application areas, such as meteorological forecasting, market prediction, and network traffic forecasting, and a number of techniques have been developed for modeling and predicting time series. In the traditional exponential smoothing method, a fixed weight is assigned to the data history, and trend changes in the time series are ignored. In this paper, an uncertainty reasoning method based on the cloud model is employed in time series prediction; it uses a cloud logic controller to dynamically adjust the smoothing coefficient of simple exponential smoothing to fit the current trend of the time series. The validity of this solution was demonstrated by experiments on various data sets.
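The quantity the cloud logic controller tunes is the coefficient in the recursion s_t = α_t x_t + (1 − α_t) s_{t−1}; a minimal sketch with a hypothetical per-step coefficient sequence standing in for the controller's output:

```python
def ses_forecast(series, alphas):
    """Simple exponential smoothing with a per-step smoothing coefficient.
    alphas[t] plays the role that the cloud logic controller tunes in the
    paper; a fixed alpha recovers the traditional method."""
    s = series[0]
    for x, alpha in zip(series[1:], alphas):
        s = alpha * x + (1.0 - alpha) * s  # s_t = a*x_t + (1-a)*s_{t-1}
    return s
```

With alpha near 1 the forecast tracks the latest observations (fast trends); with alpha near 0 it averages the history (stable series), which is the trade-off a dynamic controller exploits.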
Historically, landslides have been the primary type of geological disaster worldwide. Generally, the stability of reservoir banks is primarily affected by rainfall and reservoir water level fluctuations. Moreover, the stability of reservoir banks changes with the long-term dynamics of external disaster-causing factors. Thus, assessing the time-varying reliability of reservoir landslides remains a challenge. In this paper, a machine learning (ML) based approach is proposed to analyze the long-term reliability of reservoir bank landslides in spatially variable soils through time series prediction. This study systematically investigated the prediction performances of three ML algorithms, i.e. multilayer perceptron (MLP), convolutional neural network (CNN), and long short-term memory (LSTM). Additionally, the effects of data quantity and data ratio on the predictive power of deep learning models are considered. The results show that all three ML models can accurately depict the changes in the time-varying failure probability of reservoir landslides. The CNN model outperforms both the MLP and LSTM models in predicting the failure probability. Furthermore, selecting the right data ratio can improve the prediction accuracy of the failure probability obtained by ML models.
Support vector machines (SVM) have been widely used in chaotic time series prediction in recent years. In order to enhance the prediction efficiency of this method and implement it in hardware, the sigmoid kernel in the SVM is approximated in a more natural way by the fuzzy logic method proposed in this paper. This method allows easy hardware implementation and straightforward interpretability. Experiments on two typical chaotic time series predictions have been carried out, and the results show that the average CPU time can be reduced significantly at the cost of a small decrease in prediction accuracy, which is favourable for hardware implementation of chaotic time series prediction.
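The sigmoid kernel the fuzzy method approximates has the standard form k(x, y) = tanh(γ⟨x, y⟩ + c); for reference (the parameter values below are illustrative, not from the paper):

```python
import math

def sigmoid_kernel(x, y, gamma=0.5, c=-1.0):
    """Sigmoid (hyperbolic tangent) kernel used in SVMs:
    k(x, y) = tanh(gamma * <x, y> + c).
    The tanh is the costly part in hardware, which motivates
    fuzzy-logic approximations of it."""
    dot = sum(a * b for a, b in zip(x, y))
    return math.tanh(gamma * dot + c)
```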
Nonlinear time series prediction is studied using an improved least squares support vector machine (LS-SVM) regression with a chaotic mutation evolutionary programming (CMEP) approach for parameter optimization. We analyze how the prediction error varies with different parameters (σ, γ) in the LS-SVM. In order to select appropriate parameters for the prediction model, we employ the CMEP algorithm. Finally, Nasdaq stock data are predicted using this CMEP-based LS-SVM regression, and satisfactory results are obtained.
Recently, fault and health condition prediction of complex systems has become an interesting research topic. However, it is difficult to establish a precise physical model for a complex system, and time series properties often need to be incorporated for prediction in practice. Currently, LS-SVR is widely adopted for the prediction of systems with time series data. In this paper, in order to improve the prediction accuracy, an accumulated generating operation (AGO) is carried out to improve the quality and regularity of the raw time series data, based on grey system theory; then, the inverse accumulated generating operation (IAGO) is performed to obtain the prediction results. In addition, because an appropriate kernel function plays an important role in the accuracy of LS-SVR prediction, a modified Gaussian radial basis function (RBF) is proposed. It satisfies the requirements of distance-function-based kernels, ensuring fast damping adjacent to the test point and moderate damping at infinity. The presented model is applied to benchmark analyses. As the results indicate, the proposed method is an effective prediction method with good precision.
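AGO and IAGO are the cumulative-sum and first-difference operators of grey system theory; a minimal sketch showing that they are mutual inverses, so predictions made in the smoothed AGO domain can always be mapped back:

```python
from itertools import accumulate

def ago(series):
    """Accumulated generating operation: running sums, which smooth
    the raw series and expose its regularity."""
    return list(accumulate(series))

def iago(series):
    """Inverse AGO: first differences recover the original series."""
    return [series[0]] + [b - a for a, b in zip(series, series[1:])]
```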
Financial time series prediction, whether for classification or regression, has been a heated research topic over the last decade. While traditional machine learning algorithms have experienced mediocre results, deep learning has largely contributed to the elevation of prediction performance. Currently, an up-to-date review of advanced machine learning techniques for financial time series prediction is still lacking, making it challenging for finance domain experts and relevant practitioners to determine which model potentially performs better, what techniques and components are involved, and how the model can be designed and implemented. This review article provides an overview of techniques, components and frameworks for financial time series prediction, with an emphasis on state-of-the-art deep learning models in the literature from 2015 to 2023, including stand-alone models like convolutional neural networks (CNN) that are capable of extracting spatial dependencies within data, and long short-term memory (LSTM) that is designed for handling temporal dependencies, as well as hybrid models integrating CNN, LSTM, the attention mechanism (AM) and other techniques. For illustration and comparison purposes, models proposed in recent studies are mapped to relevant elements of a generalized framework comprised of input, output, feature extraction, prediction, and related processes. Among the state-of-the-art models, hybrid models like CNN-LSTM and CNN-LSTM-AM have in general been reported superior in performance to stand-alone models like the CNN-only model. Some remaining challenges are discussed, including non-friendliness for finance domain experts, delayed prediction, domain knowledge negligence, lack of standards, and the inability to make real-time and high-frequency predictions. The principal contributions of this paper are to provide a one-stop guide for both academia and industry to review, compare and summarize technologies and recent advances in this area, to facilitate smooth and informed implementation, and to highlight future research directions.
The theory of nu-support vector regression (Nu-SVR) is employed in modeling time series variation for prediction. In order to avoid prediction performance degradation caused by improper parameters, the method of parallel multidimensional step search (PMSS) is proposed for users to select the best parameters in training the support vector machine to get a prediction model. A series of tests are performed to evaluate the modeling mechanism, and the prediction results indicate that Nu-SVR models can reflect the variation tendency of time series with low prediction error on both familiar and unfamiliar data. Statistical analysis is also employed to verify the optimization performance of the PMSS algorithm, and comparative results indicate that the training error attains its minimum over the interval around the planar data point corresponding to the selected parameters. Moreover, the introduction of parallelization can remarkably speed up the optimizing procedure.
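PMSS itself is not specified in detail in the abstract; as a rough sequential stand-in, a multidimensional step search over a loss surface can look like the following (the parallel version would evaluate the candidate steps concurrently; the shrink schedule and step sizes are assumptions):

```python
def step_search(loss, start, step=1.0, shrink=0.5, iters=40):
    """Coordinate step search: try +/- step along each dimension, keep any
    improvement, and shrink the step when no move helps. A sequential
    sketch of the multidimensional search that PMSS runs in parallel."""
    point = list(start)
    best = loss(point)
    for _ in range(iters):
        improved = False
        for d in range(len(point)):
            for delta in (step, -step):
                cand = list(point)
                cand[d] += delta
                val = loss(cand)
                if val < best:
                    point, best, improved = cand, val, True
        if not improved:
            step *= shrink  # refine around the current best point
    return point, best
```

In the Nu-SVR setting, `loss` would be a cross-validation error evaluated at candidate (nu, C, kernel-parameter) points.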
In order to study the dynamic laws of surface movements over coal mines due to mining activities, a dynamic prediction model of surface movements was established, based on the theory of support vector machines (SVM) and time-series analysis. An engineering application was used to verify the correctness of the model. Measurements from observation stations were analyzed and processed to obtain equal-time-interval surface movement data and subjected to tests of stationarity, zero mean, and normality. Then the data were used to train the SVM model. A time series model was established to predict mining subsidence by rational choices of embedding dimensions and SVM parameters. MAPE and WIA were used as indicators to evaluate the accuracy and generalization performance of the model. In the end, the model was used to predict future surface movements. Data from observation stations in the Huaibei coal mining area were used as an example. The results show that the maximum absolute error of subsidence is 9 mm, the maximum relative error 1.5%, the maximum absolute error of displacement 7 mm, and the maximum relative error 1.8%. The accuracy and reliability of the model meet the requirements of on-site engineering. The results of the study provide a new approach to investigate the dynamics of surface movements.
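MAPE and WIA (Willmott's index of agreement) have standard definitions; a minimal sketch, not the authors' code:

```python
def mape(obs, pred):
    """Mean absolute percentage error, in percent (observations must be
    nonzero)."""
    return 100.0 * sum(abs((o - p) / o) for o, p in zip(obs, pred)) / len(obs)

def wia(obs, pred):
    """Willmott's index of agreement: 1 means perfect agreement,
    values near 0 mean no agreement."""
    mean_o = sum(obs) / len(obs)
    num = sum((p - o) ** 2 for o, p in zip(obs, pred))
    den = sum((abs(p - mean_o) + abs(o - mean_o)) ** 2
              for o, p in zip(obs, pred))
    return 1.0 - num / den
```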
Deficiencies in applying the traditional least squares support vector machine (LS-SVM) to time series online prediction were identified. Based on a property of the kernel function matrix and the recursive calculation of a block matrix, a new time series online prediction algorithm based on an improved LS-SVM was proposed. Historical training results are fully utilized, and the computing speed of the LS-SVM is enhanced. The improved algorithm was then applied to time series online prediction. Using operational data provided by the Northwest Power Grid of China, the method was applied to transient stability prediction of the electric power system. The results show that, compared with the calculation time of the traditional LS-SVM (751 600 ms), that of the proposed method in different time windows is 40-60 ms, and the prediction accuracy (normalized root mean squared error) of the proposed method is above 0.8. So the improved method is better than the traditional LS-SVM and more suitable for time series online prediction.
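The block-matrix recursion behind such online LS-SVM updates is the bordering/Schur-complement identity: when one training point is appended, the inverse of the enlarged kernel matrix follows from the previous inverse without a full re-inversion. A small pure-Python sketch (illustrative, not the paper's algorithm):

```python
def bordered_inverse(a_inv, b, d):
    """Given the inverse of A (n x n), return the inverse of the bordered
    symmetric matrix M = [[A, b], [b^T, d]] via the Schur complement
    s = d - b^T A^{-1} b. Appending one point costs O(n^2) instead of the
    O(n^3) of re-inverting from scratch."""
    n = len(a_inv)
    u = [sum(a_inv[i][j] * b[j] for j in range(n)) for i in range(n)]  # A^{-1} b
    s = d - sum(b[i] * u[i] for i in range(n))                         # Schur complement
    inv = [[a_inv[i][j] + u[i] * u[j] / s for j in range(n)] + [-u[i] / s]
           for i in range(n)]
    inv.append([-u[j] / s for j in range(n)] + [1.0 / s])
    return inv
```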
This study presents a time series prediction model with output self-feedback, implemented based on an online sequential extreme learning machine. The output variables derived from the multilayer perceptron feed back to the network input layer, creating a temporal relation between the current node inputs and the lagged node outputs and overcoming the limitation of memory, which is vital for any time series prediction application. The model can overcome the static prediction problem of most time series prediction models and can effectively cope with the dynamic properties of time series data. A linear and a nonlinear forecasting algorithm based on the online extreme learning machine are proposed to implement the output feedback forecasting model. Both are recursive estimators and have two distinct phases: Predict and Update. The proposed model was tested against different kinds of time series data, and the results indicate that it outperforms the original static model without feedback.
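The Predict/Update structure of such recursive estimators can be illustrated with scalar recursive least squares (a deliberate simplification; it is not the paper's OS-ELM, but it has the same two-phase shape):

```python
def rls_online(data, lam=0.99):
    """Scalar recursive least squares fitting y ~ w * x online.
    Each step has the two phases the paper describes: Predict, then Update.
    lam is a forgetting factor; p is the (scalar) inverse covariance."""
    w, p = 0.0, 1000.0   # large p = uninformed prior on w
    preds = []
    for x, y in data:
        preds.append(w * x)               # --- Predict phase ---
        k = p * x / (lam + x * p * x)     # gain
        w += k * (y - w * x)              # --- Update phase ---
        p = (p - k * x * p) / lam
    return w, preds
```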
For the unforced nonlinear dynamical state-space model, a new and efficient square root extended kernel recursive least squares (SREKRLS) estimation algorithm is developed in this article. The proposed algorithm lends itself to parallel implementation, as in FPGA systems. With the help of an orthonormal triangularization method, which relies on numerically stable Givens rotations, the computational burden caused by matrix inversion is reduced. Numerical properties of the matrices involved, such as symmetry, skew symmetry, and triangularity, are preserved by this algorithm. The proposed method is validated on the prediction of stationary and non-stationary Mackey-Glass time series; the x-component of the Lorenz time series is also predicted to illustrate its usefulness. Learning curves of the mean square error (MSE) demonstrate the prediction performance of the proposed algorithm, from which it is concluded that it performs better than EKRLS. This SREKRLS-based design offers a path toward nonlinear systolic arrays, which are efficient for developing very-large-scale integration (VLSI) applications with nonlinear input data. Multiple experiments at different noise levels are carried out to validate the reliability, effectiveness, and applicability of the proposed algorithm in comparison with the extended kernel recursive least squares (EKRLS) algorithm.
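A Givens rotation zeroes one matrix element at a time while preserving norms, which is what makes the triangularization numerically stable; for reference:

```python
import math

def givens(a, b):
    """Return (c, s) such that the rotation [[c, s], [-s, c]] maps the
    pair (a, b) to (r, 0) -- the element-zeroing step used in
    orthonormal triangularization."""
    if b == 0.0:
        return 1.0, 0.0
    r = math.hypot(a, b)   # numerically stable sqrt(a^2 + b^2)
    return a / r, b / r

c, s = givens(3.0, 4.0)
rotated = (c * 3.0 + s * 4.0, -s * 3.0 + c * 4.0)  # (5.0, 0.0)
```

Because each rotation touches only two rows, a full triangularization decomposes into independent rotations, which is why the algorithm maps naturally onto systolic arrays.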
The prediction methods and applications of nonlinear dynamic systems determined from low-dimensional chaotic time series are the main topic discussed. Building on the work of other researchers, the chaotic time series was reconstructed in phase space using one kind of nonlinear chaotic model. First, the model parameters were estimated using the improved least squares method; once the precision was satisfactory, an optimization method was used to refine these parameters. Finally, using the obtained chaotic model, future data of the chaotic time series in phase space were predicted. Some representative experimental examples were analyzed to test the models and algorithms developed in this paper. The results show that if the algorithms developed here are adopted, the parameters of the corresponding chaotic model can be calculated accurately. Prediction of chaotic series in phase space changes the traditional methods from outer iteration to interpolation, and if the optimal model rank is chosen, the prediction precision increases notably. Claims of superior long-term predictability for nonlinear chaotic models are shown to be unreasonable.
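Phase-space reconstruction from a scalar series is typically done by delay embedding; a minimal sketch (the embedding dimension and delay below are illustrative, not from the paper):

```python
def delay_embed(series, dim, tau):
    """Reconstruct phase-space vectors
    [x_t, x_{t-tau}, ..., x_{t-(dim-1)*tau}]
    from a scalar time series (Takens-style delay embedding)."""
    start = (dim - 1) * tau
    return [[series[t - k * tau] for k in range(dim)]
            for t in range(start, len(series))]
```

Prediction then operates on these vectors: the future of the current state is interpolated from the evolution of its nearest reconstructed neighbours.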
Smart city, driven by Big Data and the Internet of Things (IoT), has become one of the most promising trends of the future. As an important function of the smart city, event alert based on time series prediction faces the challenge of how to extract and represent discriminative features of sensing knowledge from the massive sequential data generated by IoT devices. In this paper, a framework based on the sparse representation model (SRM) for time series prediction is proposed as an efficient approach to tackle this challenge. After dividing the over-complete dictionary into upper and lower parts, the main idea of SRM is to obtain the sparse representation of the time series based on the upper part first, and then realize the prediction of future values based on the lower part. The choice of dictionary has a significant impact on the performance of SRM. This paper focuses on the study of dictionary construction strategies and summarizes eight variants of SRM. Experimental results demonstrate that SRM can deal with different types of time series prediction flexibly and effectively.
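The upper/lower dictionary split can be illustrated with a deliberately simplified one-atom coding step (the actual SRM solves a sparse coding problem over the whole dictionary; this sketch keeps only the best-matching atom):

```python
def sparse_predict(dictionary, split, query):
    """One-atom sketch of the SRM idea: each dictionary atom is split into
    an upper part (observed history) and a lower part (future values).
    Code the query against the upper parts, then predict with the
    matching atom's lower part."""
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))
    best, coef = None, 0.0
    for atom in dictionary:
        upper = atom[:split]
        c = dot(query, upper) / dot(upper, upper)   # least-squares scale
        resid = sum((q - c * u) ** 2 for q, u in zip(query, upper))
        if best is None or resid < best[0]:
            best, coef = (resid, atom), c
    return [coef * v for v in best[1][split:]]
```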
Prediction for multivariate time series (MTS) explores the interrelationships among variables at historical moments, extracts their relevant characteristics, and is widely used in finance, weather, complex industries and other fields; it is also important for constructing digital twin systems. However, existing methods do not take full advantage of the potential properties of variables, which results in poor prediction accuracy. In this paper, we propose the Adaptive Fused Spatial-Temporal Graph Convolutional Network (AFSTGCN). First, to address the problem of the unknown spatial-temporal structure, we construct the Adaptive Fused Spatial-Temporal Graph (AFSTG) layer. Specifically, we fuse the spatial-temporal graph based on the interrelationship of spatial graphs. Simultaneously, we construct the adaptive adjacency matrix of the spatial-temporal graph using node embedding methods. Subsequently, to overcome the insufficient extraction of disordered correlation features, we construct the Adaptive Fused Spatial-Temporal Graph Convolutional (AFSTGC) module. The module forces the reordering of disordered temporal, spatial and spatial-temporal dependencies into rule-like data. AFSTGCN dynamically and synchronously acquires potential temporal, spatial and spatial-temporal correlations, thereby fully extracting rich hierarchical feature information to enhance prediction accuracy. Experiments on different types of MTS datasets demonstrate that the model achieves state-of-the-art single-step and multi-step performance compared with eight other deep learning models.
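One common way to build an adaptive adjacency matrix from learnable node embeddings (the paper's exact construction may differ) is A = row-softmax(ReLU(E1 E2^T)); a minimal sketch:

```python
import math

def adaptive_adjacency(e1, e2):
    """Adjacency learned from two node-embedding tables:
    A = row_softmax(relu(E1 @ E2^T)). Each row is a probability
    distribution over possible neighbours, so graph structure is
    discovered rather than supplied."""
    raw = [[max(0.0, sum(a * b for a, b in zip(u, v))) for v in e2]
           for u in e1]
    adj = []
    for row in raw:
        exps = [math.exp(x) for x in row]
        z = sum(exps)
        adj.append([x / z for x in exps])
    return adj
```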
The prediction methods for nonlinear dynamic systems determined from chaotic time series are the main topic of study, as well as the structures of nonlinear self-related chaotic models and their dimensions. By combining neural networks and wavelet theory, the structures of wavelet transform neural networks were studied, and a learning method for wavelet neural networks was given. Based on wavelet networks, a new method for parameter identification was suggested, which can selectively extract different scales of frequency and time in a time series in order to predict either the tendencies or the details of the original time series. Through pre-treatment and comparison of results before and after the treatment, several useful conclusions are reached: highly accurate identification can be guaranteed by applying wavelet networks to identify the parameters of self-related chaotic models, and valid prediction of chaotic time series including noise can be achieved accordingly.
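The scale separation that wavelet networks exploit can be illustrated with one level of the Haar transform (the simplest wavelet; the paper's wavelets may differ): the approximation carries the tendency, the details carry the fine structure:

```python
import math

def haar_step(series):
    """One level of the Haar wavelet transform: split an even-length
    series into coarse approximation and detail coefficients."""
    r = math.sqrt(2.0)
    pairs = list(zip(series[::2], series[1::2]))
    approx = [(a + b) / r for a, b in pairs]   # low-frequency tendency
    detail = [(a - b) / r for a, b in pairs]   # high-frequency detail
    return approx, detail

def haar_inverse(approx, detail):
    """Perfectly reconstruct the series from one Haar level."""
    r = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / r, (a - d) / r]
    return out
```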
To forecast the per capita disposable income of Hubei Province residents over the next two years, a dataset spanning 2005 to 2022 was compiled. Three time series forecasting methods were employed: exponential smoothing (Holt-Winters), the autoregressive moving average (ARMA), and the autoregressive integrated moving average (ARIMA). Prediction errors were assessed with metrics such as the mean absolute error (MAE) and root mean square error (RMSE), and the ARIMA model was selected for its superior performance. On this basis, per capita disposable income for 2023 and 2024 was estimated. The forecasts project a sustained and notable increase in per capita disposable income for urban residents of Hubei Province over the coming two years. Finally, the findings are translated into policy suggestions relevant to the analysis of Hubei Province's economic development.
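Of the three methods, Holt's linear exponential smoothing (the trend-only core of Holt-Winters) has the most compact recursion; a minimal sketch (the smoothing constants are illustrative, not fitted to the paper's data):

```python
def holt_forecast(series, alpha=0.5, beta=0.5, horizon=2):
    """Holt's linear exponential smoothing: maintain a level l and a
    trend b, then forecast l + h*b for h = 1..horizon steps ahead."""
    level, trend = series[0], series[1] - series[0]
    for x in series[1:]:
        prev = level
        level = alpha * x + (1.0 - alpha) * (level + trend)
        trend = beta * (level - prev) + (1.0 - beta) * trend
    return [level + h * trend for h in range(1, horizon + 1)]
```

On an exactly linear income series the recursion reproduces the line, so the two-year-ahead forecasts simply extend the trend; real data deviate, which is where the error metrics above discriminate between models.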
A new method of predicting chaotic time series is presented based on a local Lyapunov exponent, which quantitatively measures the exponential rate of separation or attraction of two infinitely close trajectories in state space. After reconstructing the state space from a one-dimensional chaotic time series, neighboring multiple-state vectors of the predicting point are selected to deduce the prediction formula using the definition of the local Lyapunov exponent. Numerical simulations are carried out to test its effectiveness and verify its higher precision over two older methods. The effects of the number of referential state vectors and of added noise on forecasting accuracy are also studied numerically.
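The quantity at the core of the method is the exponential separation rate of two initially close trajectories, λ = (1/Δt) ln(d₁/d₀); a minimal numeric sketch:

```python
import math

def local_lyapunov(d0, d1, dt=1.0):
    """Local Lyapunov exponent from the separation of two initially
    close trajectories: lambda = (1/dt) * ln(d1 / d0)."""
    return math.log(d1 / d0) / dt

# For the doubling map x -> 2x the separation doubles every step,
# so the exponent is ln 2.
lam = local_lyapunov(1e-6, 2e-6)
```

A positive exponent means nearby states diverge, which bounds how far ahead any such prediction formula can remain accurate.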
The data stream processing framework processes stream data based on event time to ensure that requests can be responded to in real time. In reality, streaming data usually arrives out of order due to factors such as network delay. Data stream processing frameworks commonly adopt the watermark mechanism to handle this disorder. A watermark is a special kind of record inserted into the data stream with a timestamp, which helps the framework decide whether received data is late and should thus be discarded. Traditional watermark generation strategies are periodic; they cannot dynamically adjust the watermark distribution to balance responsiveness and accuracy. This paper proposes an adaptive watermark generation mechanism based on a time series prediction model to address this limitation. The mechanism dynamically adjusts the frequency and timing of watermark distribution using the disordered data ratio and other lateness properties of the data stream, improving system responsiveness while ensuring acceptable result accuracy. We implement the proposed mechanism on top of Flink and evaluate it with real-world datasets. The experimental results show that our mechanism is superior to existing watermark distribution strategies in terms of both system responsiveness and result accuracy.
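The baseline the adaptive mechanism improves on can be stated in a few lines: a watermark is the largest event time seen minus an allowed lateness, and the out-of-order ratio is one signal an adaptive strategy could use to tune that lateness (a sketch, not the paper's prediction model):

```python
def watermark(event_times, allowed_lateness):
    """Watermark after a batch of events: the largest event timestamp
    seen minus the allowed lateness. Events with timestamps older than
    the watermark are treated as late."""
    return max(event_times) - allowed_lateness

def late_ratio(event_times):
    """Fraction of events arriving with a timestamp older than the
    running maximum -- a simple measure of stream disorder that an
    adaptive strategy can feed back into allowed_lateness."""
    seen_max, late = float("-inf"), 0
    for t in event_times:
        if t < seen_max:
            late += 1
        seen_max = max(seen_max, t)
    return late / len(event_times)
```

A small allowed lateness makes the system responsive but drops more late data; a large one is accurate but slow, which is exactly the trade-off the adaptive mechanism balances.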
Funding (tunnel surrounding rock deformation study): supported by the China Atomic Energy Authority (CAEA) for China's URL Development Program and the Geological Disposal Program (Grant No. FZ2105), and by the National Natural Science Foundation of China (Grant No. 52278420).
Funding (correntropy-based echo state network study): supported in part by the National Natural Science Foundation of China (62176109, 62476115), the Fundamental Research Funds for the Central Universities (lzujbky-2023-ey07, lzujbky-2023-it14), the Natural Science Foundation of Gansu Province (24JRRA488), and the Supercomputing Center of Lanzhou University.
Funding (reservoir landslide reliability study): supported by the National Natural Science Foundation of China (Grant No. 52308340), the Innovative Projects of Universities in Guangdong (Grant No. 2022KTSCX208), and the Sichuan Transportation Science and Technology Project (Grant No. 2018-ZL-01).
文摘Historically,landslides have been the primary type of geological disaster worldwide.Generally,the stability of reservoir banks is primarily affected by rainfall and reservoir water level fluctuations.Moreover,the stability of reservoir banks changes with the long-term dynamics of external disastercausing factors.Thus,assessing the time-varying reliability of reservoir landslides remains a challenge.In this paper,a machine learning(ML)based approach is proposed to analyze the long-term reliability of reservoir bank landslides in spatially variable soils through time series prediction.This study systematically investigated the prediction performances of three ML algorithms,i.e.multilayer perceptron(MLP),convolutional neural network(CNN),and long short-term memory(LSTM).Additionally,the effects of the data quantity and data ratio on the predictive power of deep learning models are considered.The results show that all three ML models can accurately depict the changes in the time-varying failure probability of reservoir landslides.The CNN model outperforms both the MLP and LSTM models in predicting the failure probability.Furthermore,selecting the right data ratio can improve the prediction accuracy of the failure probability obtained by ML models.
Abstract: Support vector machines (SVM) have been widely used in chaotic time series prediction in recent years. In order to enhance the prediction efficiency of this method and implement it in hardware, the sigmoid kernel in SVM is drawn in a more natural way by using the fuzzy logic method proposed in this paper. This method provides easy hardware implementation and straightforward interpretability. Experiments on two typical chaotic time series predictions have been carried out, and the obtained results show that the average CPU time can be reduced significantly at the cost of a small decrease in prediction accuracy, which is favourable for the hardware implementation of chaotic time series prediction.
Funding: The project was supported by the National Natural Science Foundation of China under Grant No. 90203008 and the Doctoral Foundation of the Ministry of Education of China.
Abstract: Nonlinear time series prediction is studied by using an improved least squares support vector machine (LS-SVM) regression based on a chaotic mutation evolutionary programming (CMEP) approach for parameter optimization. We analyze how the prediction error varies with different parameters (σ, γ) in LS-SVM. In order to select appropriate parameters for the prediction model, we employ the CMEP algorithm. Finally, Nasdaq stock data are predicted by using this LS-SVM regression based on CMEP, and satisfactory results are obtained.
Funding: Supported by the National Natural Science Foundation (NNSF) of China under Grant No. 61371024, the Aviation Science Fund of China under Grant No. 2013ZD53051, the Aerospace Technology Support Fund of China, and the Industry-Academy-Research Project of AVIC (cxy2013XGD14).
Abstract: Recently, fault or health condition prediction of complex systems has become an interesting research topic. However, it is difficult to establish a precise physical model for complex systems, and time series properties often need to be incorporated for prediction in practice. Currently, the LS-SVR is widely adopted for the prediction of systems with time series data. In this paper, in order to improve the prediction accuracy, the accumulated generating operation (AGO) is carried out to improve the data quality and regularity of the raw time series data based on grey system theory; then, the inverse accumulated generating operation (IAGO) is performed to obtain the prediction results. In addition, because an appropriate kernel function plays an important role in improving the accuracy of prediction through LS-SVR, a modified Gaussian radial basis function (RBF) is proposed. The requirements of distance-function-based kernel functions are satisfied, which ensures fast damping adjacent to the test point and moderate damping at infinity. The presented model is applied to the analysis of benchmarks. As indicated by the results, the proposed method is an effective prediction method with good precision.
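The AGO/IAGO pair from grey system theory is simple enough to sketch directly: AGO is a running sum that smooths the raw series, and IAGO is its first-difference inverse. The function names are illustrative:

```python
def ago(xs):
    """Accumulated generating operation (grey theory): running sums
    smooth a noisy series and expose its underlying regularity."""
    out, s = [], 0.0
    for x in xs:
        s += x
        out.append(s)
    return out

def iago(ys):
    """Inverse AGO: first differences recover the original series."""
    return [ys[0]] + [ys[i] - ys[i-1] for i in range(1, len(ys))]
```

In the paper's pipeline, the model is trained on `ago(raw)` and its forecasts are passed through `iago` to land back in the original scale.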
Funding: Funded by the Natural Science Foundation of Fujian Province, China (Grant No. 2022J05291) and the Xiamen Scientific Research Funding for Overseas Chinese Scholars.
Abstract: Financial time series prediction, whether for classification or regression, has been a heated research topic over the last decade. While traditional machine learning algorithms have produced mediocre results, deep learning has largely contributed to the elevation of prediction performance. Currently, an up-to-date review of advanced machine learning techniques for financial time series prediction is still lacking, making it challenging for finance domain experts and relevant practitioners to determine which model potentially performs better, what techniques and components are involved, and how the model can be designed and implemented. This review article provides an overview of techniques, components and frameworks for financial time series prediction, with an emphasis on state-of-the-art deep learning models in the literature from 2015 to 2023, including stand-alone models like convolutional neural networks (CNN), which are capable of extracting spatial dependencies within data, and long short-term memory (LSTM), which is designed for handling temporal dependencies, as well as hybrid models integrating CNN, LSTM, the attention mechanism (AM) and other techniques. For illustration and comparison purposes, models proposed in recent studies are mapped to relevant elements of a generalized framework comprised of input, output, feature extraction, prediction, and related processes. Among the state-of-the-art models, hybrid models like CNN-LSTM and CNN-LSTM-AM have in general been reported superior in performance to stand-alone models like the CNN-only model. Some remaining challenges are discussed, including non-friendliness for finance domain experts, delayed prediction, domain knowledge negligence, lack of standards, and the inability to make real-time and high-frequency predictions. The principal contributions of this paper are to provide a one-stop guide for both academia and industry to review, compare and summarize technologies and recent advances in this area, to facilitate smooth and informed implementation, and to highlight future research directions.
Funding: Supported by the National Natural Science Foundation of China (Nos. 60873235 and 60473099), the Science-Technology Development Key Project of Jilin Province of China (No. 20080318), and the Program of New Century Excellent Talents in University of China (No. NCET-06-0300).
Abstract: The theory of nu-support vector regression (Nu-SVR) is employed in modeling time series variation for prediction. In order to avoid prediction performance degradation caused by improper parameters, the method of parallel multidimensional step search (PMSS) is proposed for users to select the best parameters in training a support vector machine to get a prediction model. A series of tests are performed to evaluate the modeling mechanism, and the prediction results indicate that Nu-SVR models can reflect the variation tendency of time series with low prediction error on both familiar and unfamiliar data. Statistical analysis is also employed to verify the optimization performance of the PMSS algorithm, and comparative results indicate that the training error takes its minimum over the interval around the planar data point corresponding to the selected parameters. Moreover, the introduction of parallelization remarkably speeds up the optimizing procedure.
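PMSS evaluates parameter combinations in parallel; its serial core is an ordinary grid search over the parameter space. A minimal sketch of that core (the function name and grids are illustrative, and the parallel dispatch the paper adds is omitted):

```python
def grid_search(loss, c_grid, g_grid):
    """Serial core of a PMSS-style parameter search: evaluate a loss
    over a (C, gamma) grid and keep the best pair. The paper farms the
    independent evaluations out in parallel; here they run sequentially."""
    best = None
    for c in c_grid:
        for g in g_grid:
            l = loss(c, g)
            if best is None or l < best[0]:
                best = (l, c, g)
    return best  # (best loss, best C, best gamma)
```

Because each `loss(c, g)` evaluation is independent, the double loop parallelizes trivially, which is exactly what makes the reported speed-up possible.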
Funding: Supported by the Research and Innovation Program for College and University Graduate Students in Jiangsu Province (No. CX10B-141Z) and the National Natural Science Foundation of China (No. 41071273).
Abstract: In order to study the dynamic laws of surface movements over coal mines due to mining activities, a dynamic prediction model of surface movements was established, based on the theory of support vector machines (SVM) and time series analysis. An engineering application was used to verify the correctness of the model. Measurements from observation stations were analyzed and processed to obtain equal-time-interval surface movement data and subjected to tests of stationarity, zero mean and normality. Then the data were used to train the SVM model. A time series model was established to predict mining subsidence by rational choices of embedding dimensions and SVM parameters. MAPE and WIA were used as indicators to evaluate the accuracy of the model and its generalization performance. In the end, the model was used to predict future surface movements. Data from observation stations in the Huaibei coal mining area were used as an example. The results show that the maximum absolute error of subsidence is 9 mm, the maximum relative error 1.5%, the maximum absolute error of displacement 7 mm, and the maximum relative error 1.8%. The accuracy and reliability of the model meet the requirements of on-site engineering. The results of the study provide a new approach to investigate the dynamics of surface movements.
Funding: Project (SGKJ[200301-16]) supported by the State Grid Corporation of China.
Abstract: Deficiencies of applying the traditional least squares support vector machine (LS-SVM) to time series online prediction were specified. According to the kernel function matrix's property and using the recursive calculation of block matrices, a new time series online prediction algorithm based on an improved LS-SVM was proposed. The historical training results were fully utilized and the computing speed of LS-SVM was enhanced. Then, the improved algorithm was applied to time series online prediction. Based on the operational data provided by the Northwest Power Grid of China, the method was used in the transient stability prediction of the electric power system. The results show that, compared with the calculation time of the traditional LS-SVM (75-1600 ms), that of the proposed method in different time windows is 40-60 ms, and the prediction accuracy (normalized root mean squared error) of the proposed method is above 0.8. So the improved method is better than the traditional LS-SVM and more suitable for time series online prediction.
Funding: Foundation item: the National Natural Science Foundation of China (No. 61203337).
Abstract: This study presents a time series prediction model with output self-feedback, implemented based on an online sequential extreme learning machine. The output variables derived from the multilayer perceptron feed back to the network input layer to create a temporal relation between the current node inputs and the lagged node outputs, while overcoming the limitation of memory, which is a vital part of any time-series prediction application. The model can overcome the static prediction problem of most time series prediction models and can effectively cope with the dynamic properties of time series data. A linear and a nonlinear forecasting algorithm based on the online extreme learning machine are proposed to implement the output feedback forecasting model. Both are recursive estimators and have two distinct phases: predict and update. The proposed model was tested against different kinds of time series data, and the results indicate that the model outperforms the original static model without feedback.
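The predict-then-update loop with output self-feedback can be illustrated with a simple online learner. An LMS weight update stands in for the online sequential ELM here; the function name, lag count, and learning rate are assumptions for illustration only:

```python
def lms_feedback_forecaster(series, n_lags=2, lr=0.01):
    """Online one-step-ahead forecaster with output self-feedback:
    the previous prediction is fed back as an extra input feature, and
    the weights are updated recursively (predict phase, then update
    phase) -- an LMS stand-in for the paper's online sequential ELM."""
    w = [0.0] * (n_lags + 1)              # lag weights + feedback weight
    prev_pred = 0.0
    preds = []
    for t in range(n_lags, len(series)):
        x = list(series[t-n_lags:t]) + [prev_pred]  # lags + fed-back output
        y_hat = sum(wi * xi for wi, xi in zip(w, x))  # predict phase
        preds.append(y_hat)
        err = series[t] - y_hat                       # update phase
        w = [wi + lr * err * xi for wi, xi in zip(w, x)]
        prev_pred = y_hat
    return preds
```

The key structural point matches the abstract: the lagged output `prev_pred` enters the input vector, giving the otherwise static model a temporal memory.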
Funding: Funded by Prince Sultan University, Riyadh, Saudi Arabia.
Abstract: For the unforced dynamical non-linear state-space model, a new and efficient square root extended kernel recursive least squares (SREKRLS) estimation algorithm is developed in this article. The proposed algorithm lends itself to parallel implementation, as in FPGA systems. With the help of an orthonormal triangularization method, which relies on the numerically stable Givens rotation, the computational burden caused by matrix inversion is reduced. The algorithm preserves excellent numerical properties of the matrix computation, such as singularity, symmetry, skew symmetry, and triangularity. The proposed method is validated for the prediction of stationary and non-stationary Mackey-Glass time series; in addition, the x-direction component of the Lorenz time series is predicted to illustrate its usefulness. Learning curves of the mean square error (MSE) demonstrate the prediction performance of the proposed algorithm, from which it is concluded that it performs better than the extended kernel recursive least-squares (EKRLS) algorithm. This new SREKRLS-based design offers an innovative path toward non-linear systolic arrays, which are efficient for developing very-large-scale integration (VLSI) applications with non-linear input data. Multiple experiments with different noise levels are carried out to validate the reliability, effectiveness, and applicability of the proposed algorithm compared with the EKRLS algorithm.
Abstract: The prediction methods for nonlinear dynamic systems determined from low-dimensional chaotic time series, and their applications, are mainly discussed. Based on the work of foreign researchers, the chaotic time series in phase space were reconstructed by adopting one kind of nonlinear chaotic model. At first, the model parameters were estimated by using the improved least squares method. Then, once the precision was satisfied, an optimization method was used to estimate these parameters. Finally, the obtained chaotic model was used to predict the future data of the chaotic time series in phase space. Some representative experimental examples were analyzed to test the models and the algorithms developed in this paper. The results show that if the algorithms developed here are adopted, the parameters of the corresponding chaotic model are easily calculated well and truly. Prediction of chaotic series in phase space changes the traditional methods from outer iteration to interpolation. And if the optimal model rank is chosen, the prediction precision increases notably. Long-term superior predictability of nonlinear chaotic models is proved to be irrational and unreasonable.
Funding: This work was supported by the National Natural Science Foundation of China (Grant Nos. 61772136 and 61672159), the Technology Innovation Platform Project of Fujian Province (2014H2005), the Research Project for Young and Middle-aged Teachers of Fujian Province (JT 180045), the Fujian Collaborative Innovation Center for Big Data Application in Governments, and the Fujian Engineering Research Center of Big Data Analysis and Processing.
Abstract: The smart city, driven by Big Data and the Internet of Things (IoT), has become a most promising trend of the future. As one important function of the smart city, event alerting based on time series prediction is faced with the challenge of how to extract and represent discriminative features of sensing knowledge from the massive sequential data generated by IoT devices. In this paper, a framework based on the sparse representation model (SRM) for time series prediction is proposed as an efficient approach to tackle this challenge. After dividing the over-complete dictionary into upper and lower parts, the main idea of SRM is to first obtain the sparse representation of the time series based on the upper part, and then realize the prediction of future values based on the lower part. The choice of different dictionaries has a significant impact on the performance of SRM. This paper focuses on the study of dictionary construction strategies and summarizes eight variants of SRM. Experimental results demonstrate that SRM can deal with different types of time series prediction flexibly and effectively.
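A drastically simplified, 1-sparse variant of the SRM idea can illustrate the upper/lower dictionary split: past windows act as the upper dictionary and their known continuations as the lower one. The nearest-atom matching below is a stand-in for true sparse coding, and all names are illustrative:

```python
def srm_predict(history_windows, future_values, query):
    """1-sparse sketch of the SRM idea: the 'upper dictionary' holds past
    windows, the 'lower dictionary' their known continuations. The query
    window is matched to its closest atom (coefficient vector with a single
    nonzero entry) and that atom's continuation is returned as the forecast."""
    best_i, best_d = 0, float("inf")
    for i, atom in enumerate(history_windows):
        d = sum((a - q) ** 2 for a, q in zip(atom, query))  # squared distance
        if d < best_d:
            best_i, best_d = i, d
    return future_values[best_i]
```

Real SRM solves for a sparse coefficient vector over the whole upper dictionary and applies the same coefficients to the lower dictionary; the 1-sparse case reduces that to nearest-neighbor lookup.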
Funding: Supported by the China Scholarship Council and the CERNET Innovation Project under Grant No. 20170111.
Abstract: Prediction for multivariate time series (MTS) explores the interrelationships among variables at historical moments, extracts their relevant characteristics, and is widely used in finance, weather, complex industries and other fields. Furthermore, it is important for constructing a digital twin system. However, existing methods do not take full advantage of the potential properties of variables, which results in poor prediction accuracy. In this paper, we propose the Adaptive Fused Spatial-Temporal Graph Convolutional Network (AFSTGCN). First, to address the problem of the unknown spatial-temporal structure, we construct the Adaptive Fused Spatial-Temporal Graph (AFSTG) layer. Specifically, we fuse the spatial-temporal graph based on the interrelationship of spatial graphs. Simultaneously, we construct the adaptive adjacency matrix of the spatial-temporal graph using node embedding methods. Subsequently, to overcome the insufficient extraction of disordered correlation features, we construct the Adaptive Fused Spatial-Temporal Graph Convolutional (AFSTGC) module. The module forces the reordering of disordered temporal, spatial and spatial-temporal dependencies into rule-like data. AFSTGCN dynamically and synchronously acquires potential temporal, spatial and spatial-temporal correlations, thereby fully extracting rich hierarchical feature information to enhance prediction accuracy. Experiments on different types of MTS datasets demonstrate that the model achieves state-of-the-art single-step and multi-step performance compared with eight other deep learning models.
Abstract: The prediction methods for nonlinear dynamic systems determined by chaotic time series are mainly studied, as well as the structures and dimensions of nonlinear self-related chaotic models. By combining neural networks and wavelet theories, the structures of wavelet transform neural networks were studied, and a wavelet neural network learning method was given. Based on wavelet networks, a new method for parameter identification was suggested, which can be used selectively to extract different scales of frequency and time from time series in order to predict either tendencies or details of the original time series. Through pre-treatment and comparison of results before and after the treatment, several useful conclusions are reached: high-accuracy identification can be guaranteed by applying wavelet networks to identify the parameters of self-related chaotic models, and more valid prediction of chaotic time series including noise can be achieved accordingly.
Abstract: To forecast the per capita disposable income of Hubei Province residents over the next two years, a dataset spanning 2005 to 2022 was compiled. Three time series forecasting methods were applied: exponential smoothing (Holt-Winters), the autoregressive moving average (ARMA), and the autoregressive integrated moving average (ARIMA). Model performance was assessed by computing error metrics such as the mean absolute error (MAE) and root mean square error (RMSE), and the ARIMA model was selected for its superior performance. It was then used to estimate per capita disposable income for 2023 and 2024. The forecasts project a sustained and notable rise in per capita disposable income for urban residents of Hubei Province over the next two years. Finally, the findings were translated into actionable policy suggestions, making them highly pertinent to the analysis of Hubei Province's economic development.
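The two error metrics used above for model selection are standard and can be computed directly (function names are illustrative):

```python
import math

def mae(y_true, y_pred):
    """Mean absolute error: average magnitude of the forecast errors."""
    return sum(abs(a - b) for a, b in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean square error: like MAE, but penalizes large errors more."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / len(y_true))
```

Because RMSE squares each error before averaging, a model with a few large misses scores worse on RMSE than on MAE, which is why both are usually reported together.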
Funding: Project supported by the National Natural Science Foundation of China (Grant No. 61201452).
Abstract: A new method of predicting chaotic time series is presented based on a local Lyapunov exponent, by quantitatively measuring the exponential rate of separation or attraction of two infinitely close trajectories in state space. After reconstructing state space from a one-dimensional chaotic time series, neighboring multiple-state vectors of the predicting point are selected to deduce the prediction formula by using the definition of the local Lyapunov exponent. Numerical simulations are carried out to test its effectiveness and verify its higher precision over two older methods. The effects of the number of referential state vectors and of added noise on forecasting accuracy are also studied numerically.
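State-space reconstruction by delay embedding, the first step described above, can be sketched directly; the function name and argument order are illustrative:

```python
def delay_embed(series, dim, tau):
    """Reconstruct state space from a scalar series via delay embedding:
    each state vector is (x[t], x[t-tau], ..., x[t-(dim-1)*tau]),
    with embedding dimension `dim` and delay `tau`."""
    start = (dim - 1) * tau
    return [[series[t - k * tau] for k in range(dim)]
            for t in range(start, len(series))]
```

The prediction step then works on these vectors: neighbors of the current state vector are located and their divergence rate, the local Lyapunov exponent, drives the forecast.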
Funding: This work was supported by the National Key Research and Development Program of China (2020YFB1506703) and the National Natural Science Foundation of China (Grant No. 62072018).
Abstract: The data stream processing framework processes stream data based on event time to ensure that requests can be responded to in real time. In reality, streaming data usually arrive out of order due to factors such as network delay. Data stream processing frameworks commonly adopt the watermark mechanism to address data disorderedness. A watermark is a special kind of data inserted into the data stream with a timestamp, which helps the framework decide whether received data are late and should thus be discarded. Traditional watermark generation strategies are periodic; they cannot dynamically adjust the watermark distribution to balance responsiveness and accuracy. This paper proposes an adaptive watermark generation mechanism based on a time series prediction model to address the above limitation. This mechanism dynamically adjusts the frequency and timing of watermark distribution using the disordered-data ratio and other lateness properties of the data stream to improve system responsiveness while ensuring acceptable result accuracy. We implement the proposed mechanism on top of Flink and evaluate it with real-world datasets. The experimental results show that our mechanism is superior to existing watermark distribution strategies in terms of both system responsiveness and result accuracy.
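The adaptive idea, widening the allowed lateness as the observed disorder grows, can be sketched with a simple heuristic. This is not the paper's prediction-model mechanism; the disorder measure and the linear scaling below are illustrative assumptions:

```python
def next_watermark(event_times, base_delay=5.0):
    """Heuristic adaptive watermark: the allowed lateness grows with the
    observed out-of-order ratio, trading responsiveness for accuracy.
    Events with timestamps below the returned watermark count as late."""
    if not event_times:
        return None
    # Fraction of adjacent pairs that arrived out of event-time order.
    late = sum(1 for a, b in zip(event_times, event_times[1:]) if b < a)
    disorder_ratio = late / max(1, len(event_times) - 1)
    delay = base_delay * (1.0 + disorder_ratio)  # widen bound when disordered
    return max(event_times) - delay
```

An in-order stream keeps the watermark close behind the newest event (fast results); a disordered stream pushes it further back, giving straggling events more time to arrive (fewer discarded records).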