Journal Articles
64 articles found
Time series prediction of tunnel surrounding rock deformation using CPO-CLA integrated model
1
Authors: Dengke Zhang, Yang Han, Chuanle Wang, Lei Gao, Hui Lu, Liang Chen, Erbing Li. Journal of Rock Mechanics and Geotechnical Engineering, 2025, Issue 12, pp. 7915-7930 (16 pages).
Tunnel surrounding rock (TSR) deformation exhibits time- and space-dependent behavior, making it challenging for a single prediction model to capture these characteristics over extended periods. Utilizing 8 years of TSR deformation data from the Beishan exploration tunnel (BET) test platform, the metaheuristic crested porcupine optimizer (CPO) algorithm was applied for the first time to optimize the time series of TSR deformation, and an integrated model incorporating a convolutional neural network (CNN), a long short-term memory network (LSTM), and an attention mechanism (ATT) was proposed. This model integrates the strong feature extraction capabilities of the CNN, the superior sequence prediction performance of the LSTM, and the effective attention mechanism of ATT. The results show that during blasting excavation, the internal displacement of the TSR exhibits a stepwise change pattern. After excavation, the internal displacement enters a phase of gradual increase, ultimately reaching a stable convergence stage. The CPO-CNN-LSTM-ATT (CPO-CLA) integrated model demonstrated excellent predictive accuracy and stability across various evaluation metrics, achieving a coefficient of determination (R²) of 0.985. Compared to the CNN-LSTM-ATT (CLA) model, the CPO-CLA model showed a 14.1% increase in R², a 61.5% decrease in root mean square error (RMSE), and a 72.9% decrease in mean absolute error (MAE). In comparison with current mainstream metaheuristic integrated models, the CPO-CLA model is better suited for predicting long-term TSR deformation, offering high computational efficiency, accurate predictions, and strong performance when optimizing over large datasets.
Keywords: Blasting excavation; Time series prediction; Neural network; Metaheuristic optimization algorithm; Surrounding rock deformation
A Correntropy-Based Echo State Network With Application to Time Series Prediction
2
Authors: Xiufang Chen, Zhenming Su, Long Jin, Shuai Li. IEEE/CAA Journal of Automatica Sinica, 2025, Issue 2, pp. 425-435 (11 pages).
As a category of recurrent neural networks, echo state networks (ESNs) have been the topic of in-depth investigation and extensive application in a diverse array of fields, with notable successes achieved. Nevertheless, the traditional ESN and the majority of its variants are devised in light of the second-order statistical information of data (e.g., variance and covariance), while further information is neglected. In the context of information-theoretic learning, correntropy demonstrates the capacity to capture more information from data. Therefore, under the guidance of the maximum correntropy criterion, this paper proposes a correntropy-based echo state network (CESN) in which the first-order and higher-order information of data is captured, promoting robustness to noise. Furthermore, an incremental learning algorithm for the CESN is presented, which can update the CESN when new data arrive, eliminating the need to retrain the network from scratch. Finally, experiments on benchmark problems and comparisons with existing works are provided to verify the effectiveness and superiority of the proposed CESN.
Keywords: Correntropy; Echo state network (ESN); Noise; Time series prediction
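The robustness-to-noise claim above rests on correntropy's bounded kernel: unlike squared error, each sample's contribution is capped at 1, so a single outlier cannot dominate. A minimal sketch of sample correntropy (not the paper's CESN itself, just the statistic it builds on):

```python
import math

def gaussian_kernel(e, sigma):
    """Gaussian kernel on an error value; bounded in (0, 1]."""
    return math.exp(-e * e / (2.0 * sigma * sigma))

def correntropy(x, y, sigma=1.0):
    """Sample correntropy: mean kernel similarity of paired samples."""
    return sum(gaussian_kernel(a - b, sigma) for a, b in zip(x, y)) / len(x)

def mse(x, y):
    """Mean squared error, for contrast: unbounded per-sample contribution."""
    return sum((a - b) ** 2 for a, b in zip(x, y)) / len(x)

target = [0.0, 0.0, 0.0, 0.0]
clean  = [0.0, 0.1, -0.1, 0.05]
spiked = clean[:-1] + [50.0]   # one gross outlier

# MSE explodes under the outlier; correntropy degrades only mildly,
# because the kernel bounds the outlier's contribution.
```

Under the maximum correntropy criterion, the readout weights are chosen to maximize this statistic rather than minimize MSE, which is what grants the CESN its noise robustness.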
Application of uncertainty reasoning based on cloud model in time series prediction (Cited 11 times)
3
Authors: Zhang Jinchun (张锦春), Hu Guyu (胡谷雨). Journal of Zhejiang University Science (EI, CSCD), 2003, Issue 5, pp. 578-583 (6 pages).
Time series prediction has been successfully used in several application areas, such as meteorological forecasting, market prediction, network traffic forecasting, etc., and a number of techniques have been developed for modeling and predicting time series. In the traditional exponential smoothing method, a fixed weight is assigned to the data history, and the trend changes of the time series are ignored. In this paper, an uncertainty reasoning method based on the cloud model is employed in time series prediction, which uses a cloud logic controller to adjust the smoothing coefficient of the simple exponential smoothing method dynamically to fit the current trend of the time series. The validity of this solution was proved by experiments on various data sets.
Keywords: Time series prediction; Cloud model; Simple exponential smoothing method
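The cloud logic controller itself is not described in this listing, but the core idea of a smoothing coefficient that adapts to the current trend can be sketched with a simple sign-of-error heuristic standing in for the controller (the adjustment rule below is illustrative, not the authors' rule):

```python
def adaptive_ses(series, alpha0=0.3, step=0.1):
    """Simple exponential smoothing whose coefficient alpha is nudged up
    when consecutive forecast errors share a sign (the series is trending,
    so track it faster) and nudged down otherwise (likely noise)."""
    alpha = alpha0
    level = series[0]
    prev_err = 0.0
    forecasts = []
    for x in series[1:]:
        forecasts.append(level)      # one-step-ahead forecast for this point
        err = x - level
        if err * prev_err > 0:       # same-sign errors: strengthen tracking
            alpha = min(0.9, alpha + step)
        else:
            alpha = max(0.1, alpha - step)
        level += alpha * err
        prev_err = err
    return forecasts
```

On a steadily rising series the forecasts lag less and less as alpha grows, which is the behavior the cloud-controlled coefficient is meant to achieve.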
Time series prediction of reservoir bank landslide failure probability considering the spatial variability of soil properties (Cited 2 times)
4
Authors: Luqi Wang, Lin Wang, Wengang Zhang, Xuanyu Meng, Songlin Liu, Chun Zhu. Journal of Rock Mechanics and Geotechnical Engineering (SCIE, CSCD), 2024, Issue 10, pp. 3951-3960 (10 pages).
Historically, landslides have been the primary type of geological disaster worldwide. Generally, the stability of reservoir banks is primarily affected by rainfall and reservoir water level fluctuations. Moreover, the stability of reservoir banks changes with the long-term dynamics of external disaster-causing factors. Thus, assessing the time-varying reliability of reservoir landslides remains a challenge. In this paper, a machine learning (ML) based approach is proposed to analyze the long-term reliability of reservoir bank landslides in spatially variable soils through time series prediction. This study systematically investigated the prediction performance of three ML algorithms, i.e., multilayer perceptron (MLP), convolutional neural network (CNN), and long short-term memory (LSTM). Additionally, the effects of the data quantity and data ratio on the predictive power of deep learning models are considered. The results show that all three ML models can accurately depict the changes in the time-varying failure probability of reservoir landslides. The CNN model outperforms both the MLP and LSTM models in predicting the failure probability. Furthermore, selecting the right data ratio can improve the prediction accuracy of the failure probability obtained by ML models.
Keywords: Machine learning (ML); Reservoir bank landslide; Spatial variability; Time series prediction; Failure probability
Chaotic time series prediction using fuzzy sigmoid kernel-based support vector machines (Cited 2 times)
5
Authors: Liu Han (刘涵), Liu Ding (刘丁), Deng Lingfeng (邓凌峰). Chinese Physics B (SCIE, EI, CAS, CSCD), 2006, Issue 6, pp. 1196-1200 (5 pages).
Support vector machines (SVM) have been widely used in chaotic time series prediction in recent years. In order to enhance the prediction efficiency of this method and implement it in hardware, the sigmoid kernel in the SVM is derived in a more natural way by using the fuzzy logic method proposed in this paper. This method provides easy hardware implementation and straightforward interpretability. Experiments on two typical chaotic time series prediction tasks have been carried out, and the obtained results show that the average CPU time can be reduced significantly at the cost of a small decrease in prediction accuracy, which is favourable for the hardware implementation of chaotic time series prediction.
Keywords: Support vector machines; Chaotic time series prediction; Fuzzy sigmoid kernel
Nonlinear Time Series Prediction Using LS-SVM with Chaotic Mutation Evolutionary Programming for Parameter Optimization (Cited 1 time)
6
Authors: Xu Rui-Rui, Chen Tian-Lun, Gao Cheng-Feng. Communications in Theoretical Physics (SCIE, CAS, CSCD), 2006, Issue 4, pp. 641-646 (6 pages).
Nonlinear time series prediction is studied by using an improved least squares support vector machine (LS-SVM) regression based on a chaotic mutation evolutionary programming (CMEP) approach for parameter optimization. We analyze how the prediction error varies with different parameters (σ, γ) in the LS-SVM. In order to select appropriate parameters for the prediction model, we employ the CMEP algorithm. Finally, Nasdaq stock data are predicted by using this LS-SVM regression based on CMEP, and satisfactory results are obtained.
Keywords: Nonlinear time series prediction; Least squares support vector machine; Chaotic mutation evolutionary programming
LS-SVR and AGO Based Time Series Prediction Method (Cited 2 times)
7
Authors: Zhang Shou-peng, Liu Shan, Chai Wang-xu, Zhang Jia-qi, Guo Yang-ming. International Journal of Plant Engineering and Management, 2016, Issue 1, pp. 1-13 (13 pages).
Recently, fault or health condition prediction of complex systems has become an interesting research topic. However, it is difficult to establish a precise physical model for complex systems, and time series properties often need to be incorporated for prediction in practice. Currently, LS-SVR is widely adopted for prediction of systems with time series data. In this paper, in order to improve the prediction accuracy, an accumulated generating operation (AGO) is carried out to improve the data quality and regularity of the raw time series data based on grey system theory; then, the inverse accumulated generating operation (IAGO) is performed to obtain the prediction results. In addition, because an appropriate kernel function plays an important role in improving the accuracy of prediction through LS-SVR, a modified Gaussian radial basis function (RBF) is proposed. The requirements of distance-function-based kernel functions are satisfied, which ensures fast damping adjacent to the test point and moderate damping at infinity. The presented model is applied to the analysis of benchmarks. As indicated by the results, the proposed method is an effective prediction method with good precision.
Keywords: Time series prediction; Least squares support vector regression (LS-SVR); Gaussian radial basis function (RBF); Accumulated generating operation (AGO)
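The AGO/IAGO pair from grey system theory used above is simply cumulative summation and first differencing; the LS-SVR regression (omitted here) sits between the two steps, fitted on the smoother accumulated series. A minimal round-trip sketch:

```python
def ago(xs):
    """First-order accumulated generating operation (1-AGO): running
    cumulative sums, which smooth a noisy series and expose its trend."""
    out, total = [], 0.0
    for x in xs:
        total += x
        out.append(total)
    return out

def iago(ys):
    """Inverse AGO: first differences recover the original series exactly."""
    return [ys[0]] + [ys[i] - ys[i - 1] for i in range(1, len(ys))]

raw = [3.0, 1.0, 4.0, 1.0, 5.0]
acc = ago(raw)   # predictions would be made on this accumulated series,
                 # then mapped back through iago()
```

In the paper's pipeline, forecasts produced on the AGO-transformed series are passed through IAGO to yield forecasts in the original units.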
Deep Learning for Financial Time Series Prediction: A State-of-the-Art Review of Standalone and Hybrid Models
8
Authors: Weisi Chen, Walayat Hussain, Francesco Cauteruccio, Xu Zhang. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, Issue 4, pp. 187-224 (38 pages).
Financial time series prediction, whether for classification or regression, has been a heated research topic over the last decade. While traditional machine learning algorithms have experienced mediocre results, deep learning has largely contributed to the elevation of prediction performance. Currently, an up-to-date review of advanced machine learning techniques for financial time series prediction is still lacking, making it challenging for finance domain experts and relevant practitioners to determine which model potentially performs better, what techniques and components are involved, and how the model can be designed and implemented. This review article provides an overview of techniques, components and frameworks for financial time series prediction, with an emphasis on state-of-the-art deep learning models in the literature from 2015 to 2023, including standalone models like convolutional neural networks (CNN) that are capable of extracting spatial dependencies within data, and long short-term memory (LSTM) that is designed for handling temporal dependencies; and hybrid models integrating CNN, LSTM, attention mechanism (AM) and other techniques. For illustration and comparison purposes, models proposed in recent studies are mapped to relevant elements of a generalized framework comprised of input, output, feature extraction, prediction, and related processes. Among the state-of-the-art models, hybrid models like CNN-LSTM and CNN-LSTM-AM have in general been reported superior in performance to standalone models like the CNN-only model. Some remaining challenges have been discussed, including non-friendliness for finance domain experts, delayed prediction, domain knowledge negligence, lack of standards, and inability to make real-time and high-frequency predictions. The principal contributions of this paper are to provide a one-stop guide for both academia and industry to review, compare and summarize technologies and recent advances in this area, to facilitate smooth and informed implementation, and to highlight future research directions.
Keywords: Financial time series prediction; Convolutional neural network; Long short-term memory; Deep learning; Attention mechanism; Finance
Parameter selection in time series prediction based on nu-support vector regression
9
Authors: Hu Liang (胡亮), Che Xilong. High Technology Letters (EI, CAS), 2009, Issue 4, pp. 337-342 (6 pages).
The theory of nu-support vector regression (Nu-SVR) is employed in modeling time series variation for prediction. In order to avoid prediction performance degradation caused by improper parameters, the method of parallel multidimensional step search (PMSS) is proposed for users to select the best parameters when training a support vector machine to get a prediction model. A series of tests are performed to evaluate the modeling mechanism, and prediction results indicate that Nu-SVR models can reflect the variation tendency of time series with low prediction error on both familiar and unfamiliar data. Statistical analysis is also employed to verify the optimization performance of the PMSS algorithm, and comparative results indicate that the training error takes its minimum over the interval around the planar data point corresponding to the selected parameters. Moreover, the introduction of parallelization can remarkably speed up the optimizing procedure.
Keywords: Parameter selection; Time series prediction; Nu-support vector regression (Nu-SVR); Parallel multidimensional step search (PMSS)
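PMSS is only named in this listing, so the sketch below shows the serial skeleton of a multidimensional step search: probe each hyper-parameter dimension in both directions, accept improving moves, and shrink the step when nothing helps (the parallel variant would evaluate the probes concurrently). The quadratic loss is a toy stand-in for the Nu-SVR validation error, not the paper's objective:

```python
def step_search(loss, start, step=1.0, shrink=0.5, tol=1e-3):
    """Coordinate-wise step search: try +/-step along each dimension,
    keep moves that lower the loss, halve the step when stuck."""
    point = list(start)
    best = loss(point)
    while step > tol:
        improved = False
        for d in range(len(point)):
            for delta in (step, -step):
                trial = point[:]
                trial[d] += delta
                val = loss(trial)
                if val < best:
                    point, best = trial, val
                    improved = True
        if not improved:
            step *= shrink
    return point, best

# toy stand-in for validation error over two SVR hyper-parameters
toy_loss = lambda p: (p[0] - 3.0) ** 2 + (p[1] - 0.5) ** 2
argmin, val = step_search(toy_loss, [0.0, 0.0])
```

Each full pass costs 2 x (number of dimensions) loss evaluations, which is what makes the parallel evaluation in PMSS pay off.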
Time series prediction of mining subsidence based on a SVM (Cited 10 times)
10
Authors: Li Peixian, Tan Zhixiang, Yan Lili, Deng Kazhong. Mining Science and Technology (EI, CAS), 2011, Issue 4, pp. 557-562 (6 pages).
In order to study the dynamic laws of surface movements over coal mines due to mining activities, a dynamic prediction model of surface movements was established, based on the theory of support vector machines (SVM) and time-series analysis. An engineering application was used to verify the correctness of the model. Measurements from observation stations were analyzed and processed to obtain equal-time-interval surface movement data, which were subjected to tests for stationarity, zero mean and normality. Then the data were used to train the SVM model. A time series model was established to predict mining subsidence by rational choices of embedding dimensions and SVM parameters. MAPE and WIA were used as indicators to evaluate the accuracy and generalization performance of the model. In the end, the model was used to predict future surface movements. Data from observation stations in the Huaibei coal mining area were used as an example. The results show that the maximum absolute error of subsidence is 9 mm, the maximum relative error 1.5%, the maximum absolute error of displacement 7 mm, and the maximum relative error 1.8%. The accuracy and reliability of the model meet the requirements of on-site engineering. The results of the study provide a new approach to investigating the dynamics of surface movements.
Keywords: Support vector machine; Mining subsidence; Time series; Dynamic prediction
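Training an SVM on a time series, as in the entry above, requires recasting the series as supervised (window, target) pairs; the embedding dimension mentioned in the abstract is the window length. A minimal sketch of that preprocessing step:

```python
def make_windows(series, dim, horizon=1):
    """Turn a scalar series into supervised pairs: each input is `dim`
    consecutive past values, each target is the value `horizon` steps
    after the window. This is the standard setup for SVM/LS-SVM
    time-series regression."""
    X, y = [], []
    for t in range(len(series) - dim - horizon + 1):
        X.append(series[t:t + dim])
        y.append(series[t + dim + horizon - 1])
    return X, y

X, y = make_windows([1, 2, 3, 4, 5, 6], dim=3)
```

Any regressor can then be fitted on (X, y); multi-step forecasts are produced by feeding each prediction back into the next window.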
Time series online prediction algorithm based on least squares support vector machine (Cited 8 times)
11
Authors: Wu Qiong (吴琼), Liu Wenying (刘文颖), Yang Yihan (杨以涵). Journal of Central South University of Technology (EI), 2007, Issue 3, pp. 442-446 (5 pages).
Deficiencies of applying the traditional least squares support vector machine (LS-SVM) to time series online prediction were specified. According to the properties of the kernel function matrix and using the recursive calculation of block matrices, a new time series online prediction algorithm based on an improved LS-SVM was proposed. The historical training results are fully utilized and the computing speed of the LS-SVM is enhanced. Then, the improved algorithm was applied to time series online prediction. Based on the operational data provided by the Northwest Power Grid of China, the method was used in the transient stability prediction of the electric power system. The results show that, compared with the calculation time of the traditional LS-SVM (75 1 600 ms), that of the proposed method in different time windows is 40-60 ms, and the prediction accuracy (normalized root mean squared error) of the proposed method is above 0.8. So the improved method is better than the traditional LS-SVM and more suitable for time series online prediction.
Keywords: Time series prediction; Machine learning; Support vector machine; Statistical learning theory
Multilayer Perception with Output Self Feedback Based on Online Sequential Extreme Learning Machine for Time Series Prediction (Cited 5 times)
12
Authors: Pan Feng, Zhao Hai-bo. Journal of Shanghai Jiaotong University (Science) (EI), 2013, Issue 3, pp. 366-375 (10 pages).
This study presents a time series prediction model with output self-feedback which is implemented based on an online sequential extreme learning machine. The output variables derived from the multilayer perception can feed back to the network input layer to create a temporal relation between the current node inputs and the lagged node outputs, while overcoming the limitation of memory, which is a vital part of any time-series prediction application. The model can overcome the static prediction problem of most time series prediction models and can effectively cope with the dynamic properties of time series data. A linear and a nonlinear forecasting algorithm based on the online extreme learning machine are proposed to implement the output feedback forecasting model. Both are recursive estimators and have two distinct phases: predict and update. The proposed model was tested against different kinds of time series data, and the results indicate that the model outperforms the original static model without feedback.
Keywords: Time series prediction; Extreme learning machine (ELM); Autoregression (AR); Online sequential ELM (OS-ELM); Recurrent neural network (RNN)
Prediction of Time Series Empowered with a Novel SREKRLS Algorithm (Cited 3 times)
13
Authors: Bilal Shoaib, Yasir Javed, Muhammad Adnan Khan, Fahad Ahmad, Rizwan Majeed, Muhammad Saqib Nawaz, Muhammad Adeel Ashraf, Abid Iqbal, Muhammad Idrees. Computers, Materials & Continua (SCIE, EI), 2021, Issue 5, pp. 1413-1427 (15 pages).
For the unforced dynamical non-linear state-space model, a new and efficient square root extended kernel recursive least squares estimation algorithm is developed in this article. The proposed algorithm lends itself to parallel implementation, as in FPGA systems. With the help of an orthonormal triangularization method, which relies on numerically stable Givens rotations, the computational burden caused by matrix inversion is reduced. Matrix computations possessing excellent numerical properties such as singularity, symmetry, skew symmetry, and triangularity are achieved by using this algorithm. The proposed method is validated for the prediction of the stationary and non-stationary Mackey-Glass time series, and the x-component of the Lorenz time series is also predicted to illustrate its usefulness. Learning curves of the mean square error (MSE) demonstrate the prediction performance of the proposed algorithm, from which it is concluded that it performs better than EKRLS. This new SREKRLS-based design offers a path towards non-linear systolic arrays, which are efficient for developing very-large-scale integration (VLSI) applications with non-linear input data. Multiple experiments at different noise levels are carried out to validate the reliability, effectiveness, and applicability of the proposed algorithm compared to the extended kernel recursive least squares (EKRLS) algorithm.
Keywords: Kernel methods; Square root adaptive filtering; Givens rotation; Mackey-Glass time series prediction; Recursive least squares; Kernel recursive least squares; Extended kernel recursive least squares; Square root extended kernel recursive least squares algorithm
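The Mackey-Glass series used as the benchmark above comes from a delay differential equation; a coarse generator using forward Euler integration (benchmark implementations more commonly use Runge-Kutta, so treat this as an illustrative sketch with the standard parameter values):

```python
def mackey_glass(n, beta=0.2, gamma=0.1, tau=17, dt=1.0, x0=1.2):
    """Generate n samples of the Mackey-Glass series by Euler integration:
    dx/dt = beta*x(t-tau)/(1 + x(t-tau)**10) - gamma*x(t).
    tau=17 is the delay used in most chaotic-benchmark settings."""
    history = [x0] * (tau + 1)   # constant initial history
    xs = []
    for _ in range(n):
        x, x_tau = history[-1], history[0]
        x_new = x + dt * (beta * x_tau / (1.0 + x_tau ** 10) - gamma * x)
        history.append(x_new)
        history.pop(0)           # slide the delay buffer
        xs.append(x_new)
    return xs

series = mackey_glass(500)
```

The generated series stays bounded and positive, which the update rule guarantees: each step is a convex-like mix of the decayed current value and a bounded delayed term.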
STUDY ON THE PREDICTION METHOD OF LOW-DIMENSION TIME SERIES THAT ARISE FROM THE INTRINSIC NONLINEAR DYNAMICS (Cited 2 times)
14
Authors: Ma Junhai (马军海), Chen Yushu (陈予恕). Applied Mathematics and Mechanics (English Edition) (SCIE, EI), 2001, Issue 5, pp. 501-509 (9 pages).
The prediction methods for nonlinear dynamic systems determined from low-dimensional chaotic time series, and their applications, are mainly discussed. Based on the work of foreign researchers, the chaotic time series was reconstructed in phase space by adopting one kind of nonlinear chaotic model. At first, the model parameters were estimated by using an improved least squares method. Then, once the precision was satisfied, an optimization method was used to estimate these parameters. In the end, by using the obtained chaotic model, future data of the chaotic time series in the phase space were predicted. Some representative experimental examples were analyzed to test the models and the algorithms developed in this paper. The results show that if the algorithms developed here are adopted, the parameters of the corresponding chaotic model can be calculated easily and accurately. Prediction of chaotic series in phase space turns the traditional methods from outer iteration into interpolation, and if the optimal model rank is chosen, the prediction precision increases notably. Claims of superior long-term predictability for nonlinear chaotic models are shown to be unfounded.
Keywords: Nonlinear; Chaotic model; Parameter identification; Time series prediction
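The phase-space reconstruction step that this entry and the other chaotic-series entries rely on is plain delay embedding: each state vector stacks several time-lagged samples of the scalar series. A minimal sketch:

```python
def delay_embed(series, dim, tau):
    """Takens-style reconstruction: state vector at time t is
    [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
    n = len(series) - (dim - 1) * tau
    return [[series[t + k * tau] for k in range(dim)] for t in range(n)]

vecs = delay_embed([0, 1, 2, 3, 4, 5, 6], dim=3, tau=2)
```

Model fitting and prediction then operate on these reconstructed state vectors instead of the raw scalar series.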
A framework based on sparse representation model for time series prediction in smart city (Cited 1 time)
15
Authors: Zhiyong Yu, Xiangping Zheng, Fangwan Huang, Wenzhong Guo, Lin Sun, Zhiwen Yu. Frontiers of Computer Science (SCIE, EI, CSCD), 2021, Issue 1, pp. 99-111 (13 pages).
Smart city, driven by Big Data and the Internet of Things (IoT), has become one of the most promising trends of the future. As one important function of the smart city, event alert based on time series prediction faces the challenge of how to extract and represent discriminative features of sensing knowledge from the massive sequential data generated by IoT devices. In this paper, a framework based on a sparse representation model (SRM) for time series prediction is proposed as an efficient approach to tackle this challenge. After dividing the over-complete dictionary into upper and lower parts, the main idea of the SRM is to obtain the sparse representation of the time series based on the upper part first, and then realize the prediction of future values based on the lower part. The choice of dictionary has a significant impact on the performance of the SRM. This paper focuses on the study of dictionary construction strategies and summarizes eight variants of the SRM. Experimental results demonstrate that the SRM can deal with different types of time series prediction flexibly and effectively.
Keywords: Sparse representation; Smart city; Time series prediction; Dictionary construction
AFSTGCN: Prediction for multivariate time series using an adaptive fused spatial-temporal graph convolutional network
16
Authors: Yuteng Xiao, Kaijian Xia, Hongsheng Yin, Yu-Dong Zhang, Zhenjiang Qian, Zhaoyang Liu, Yuehan Liang, Xiaodan Li. Digital Communications and Networks (SCIE, CSCD), 2024, Issue 2, pp. 292-303 (12 pages).
Prediction for multivariate time series (MTS) explores the interrelationships among variables at historical moments, extracts their relevant characteristics, and is widely used in finance, weather, complex industries and other fields; it is also important for constructing digital twin systems. However, existing methods do not take full advantage of the potential properties of variables, which results in poor prediction accuracy. In this paper, we propose the Adaptive Fused Spatial-Temporal Graph Convolutional Network (AFSTGCN). First, to address the problem of the unknown spatial-temporal structure, we construct the Adaptive Fused Spatial-Temporal Graph (AFSTG) layer. Specifically, we fuse the spatial-temporal graph based on the interrelationship of spatial graphs. Simultaneously, we construct the adaptive adjacency matrix of the spatial-temporal graph using node embedding methods. Subsequently, to overcome the insufficient extraction of disordered correlation features, we construct the Adaptive Fused Spatial-Temporal Graph Convolutional (AFSTGC) module. The module forces the reordering of disordered temporal, spatial and spatial-temporal dependencies into rule-like data. AFSTGCN dynamically and synchronously acquires potential temporal, spatial and spatial-temporal correlations, thereby fully extracting rich hierarchical feature information to enhance prediction accuracy. Experiments on different types of MTS datasets demonstrate that the model achieves state-of-the-art single-step and multi-step performance compared with eight other deep learning models.
Keywords: Adaptive adjacency matrix; Digital twin; Graph convolutional network; Multivariate time series prediction; Spatial-temporal graph
STUDY ON PREDICTION METHODS FOR DYNAMIC SYSTEMS OF NONLINEAR CHAOTIC TIME SERIES*
17
Authors: Ma Junhai (马军海), Chen Yushu (陈予恕), Xin Baogui (辛宝贵). Applied Mathematics and Mechanics (English Edition) (SCIE, EI), 2004, Issue 6, pp. 605-611 (7 pages).
The prediction methods for nonlinear dynamic systems which are determined by chaotic time series are mainly studied, as well as the structures of nonlinear self-related chaotic models and their dimensions. By combining neural networks and wavelet theories, the structures of wavelet transform neural networks were studied and a learning method for wavelet neural networks was given. Based on wavelet networks, a new method for parameter identification was suggested, which can be used selectively to extract different scales of frequency and time in a time series in order to realize prediction of tendencies or details of the original time series. Through pre-treatment and comparison of results before and after the treatment, several useful conclusions are reached: highly accurate identification can be guaranteed by applying wavelet networks to identify the parameters of self-related chaotic models, and more valid prediction of chaotic time series including noise can be achieved accordingly.
Keywords: Nonlinear self-related chaotic model; Wavelet neural network; Parameter identification; Time series prediction
Analysis and Prognostication of Residents' Per Capita Disposable Income in Hubei Province using Time Series Prediction Methods
18
Authors: Lirui Teng. Advances in Social Behavior Research, 2023, Issue 1, pp. 46-53 (8 pages).
To anticipate the fluctuations in per capita disposable income among Hubei Province residents for the next two years, a dataset spanning 2005 to 2022 was collected. Three distinct time series prediction methodologies were employed in this study: Exponential Smoothing (Holt-Winters), Autoregressive Moving Average (ARMA), and Autoregressive Integrated Moving Average (ARIMA). These techniques were applied to forecast the forthcoming trajectory of per capita disposable income for the province's residents. By computing metrics that assess predictive discrepancies, such as the Mean Absolute Error (MAE) and Root Mean Square Error (RMSE), the effectiveness of the models was gauged, culminating in the selection of the ARIMA model due to its superior performance. Building on this, estimates of per capita disposable income for 2023 and 2024 were extrapolated. The resulting forecasts project a sustained and noteworthy rise in per capita disposable income for urban residents of Hubei Province over the coming two years. Finally, the findings were translated into actionable policy suggestions, rendering them highly pertinent to the analysis of Hubei Province's economic development.
Keywords: Per capita disposable income; Time series prediction; Exponential smoothing; ARMA; ARIMA
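The two error metrics used for model selection above are one-liners; RMSE penalizes large misses more heavily than MAE, which is why the two can rank models differently:

```python
import math

def mae(actual, predicted):
    """Mean absolute error."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root mean square error: squares amplify large deviations."""
    return math.sqrt(
        sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)
    )

actual    = [100.0, 110.0, 120.0]
predicted = [ 98.0, 113.0, 114.0]
# mae  = (2 + 3 + 6) / 3
# rmse = sqrt((4 + 9 + 36) / 3), larger because of the 6-unit miss
```

Comparing candidate models (Holt-Winters, ARMA, ARIMA) on held-out years with both metrics, as the study does, guards against choosing a model that is good on average but occasionally far off.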
New prediction of chaotic time series based on local Lyapunov exponent (Cited 9 times)
19
Authors: Zhang Yong (张勇). Chinese Physics B (SCIE, EI, CAS, CSCD), 2013, Issue 5, pp. 191-197 (7 pages).
A new method of predicting chaotic time series is presented based on a local Lyapunov exponent, by quantitatively measuring the exponential rate of separation or attraction of two infinitely close trajectories in state space. After reconstructing the state space from the one-dimensional chaotic time series, multiple neighboring state vectors of the predicting point are selected to deduce the prediction formula by using the definition of the local Lyapunov exponent. Numerical simulations are carried out to test its effectiveness and verify its higher precision over two older methods. The effects of the number of referential state vectors and of added noise on forecasting accuracy are also studied numerically.
Keywords: Chaotic time series; Prediction of chaotic time series; Local Lyapunov exponent; Least squares method
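The divergence-rate idea behind the local Lyapunov exponent can be illustrated with a bare-bones estimate: embed the series, find the nearest past neighbor of a reference state, and measure how fast the two trajectories separate. This is only the measurement step; the paper derives a full prediction formula from several neighbors, not the single-neighbor estimate shown here:

```python
import math

def local_lyapunov(series, dim, tau, steps):
    """Single-neighbor local Lyapunov estimate at the latest state that
    still has `steps` samples of known future."""
    # delay-embed the scalar series into state vectors
    n = len(series) - (dim - 1) * tau
    vecs = [[series[t + k * tau] for k in range(dim)] for t in range(n)]

    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    ref = len(vecs) - 1 - steps
    # nearest past neighbour, excluding states overlapping the reference
    nbr = min(range(ref - (dim - 1) * tau),
              key=lambda i: dist(vecs[i], vecs[ref]))
    d0 = dist(vecs[nbr], vecs[ref])
    dk = dist(vecs[nbr + steps], vecs[ref + steps])
    # average exponential separation rate over `steps` samples
    return math.log(dk / d0) / steps if d0 > 0 and dk > 0 else float("nan")

# logistic map in its chaotic regime as a test signal
xs = [0.3]
for _ in range(299):
    xs.append(4.0 * xs[-1] * (1.0 - xs[-1]))
lam = local_lyapunov(xs, dim=2, tau=1, steps=3)
```

A positive estimate signals local divergence (and hence a shrinking prediction horizon); the value varies from point to point, which is exactly the "local" aspect the paper exploits.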
Adaptive watermark generation mechanism based on time series prediction for stream processing (Cited 1 time)
20
Authors: Yang Song, Yunchun Li, Hailong Yang, Jun Xu, Zerong Luan, Wei Li. Frontiers of Computer Science (SCIE, EI, CSCD), 2021, Issue 6, pp. 59-73 (15 pages).
The data stream processing framework processes stream data based on event time to ensure that requests can be responded to in real time. In reality, streaming data usually arrive out of order due to factors such as network delay. Data stream processing frameworks commonly adopt the watermark mechanism to address this disorderedness. A watermark is a special kind of data inserted into the data stream with a timestamp, which helps the framework decide whether received data are late and should thus be discarded. Traditional watermark generation strategies are periodic; they cannot dynamically adjust the watermark distribution to balance responsiveness and accuracy. This paper proposes an adaptive watermark generation mechanism based on a time series prediction model to address this limitation. The mechanism dynamically adjusts the frequency and timing of watermark distribution using the disordered-data ratio and other lateness properties of the data stream, improving system responsiveness while ensuring acceptable result accuracy. We implement the proposed mechanism on top of Flink and evaluate it with real-world datasets. The experimental results show that our mechanism is superior to existing watermark distribution strategies in terms of both system responsiveness and result accuracy.
Keywords: Data stream processing; Watermark; Time-series-based prediction; Dynamic adjustment
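The baseline that the adaptive mechanism improves on is a watermark with a fixed allowed delay: the watermark trails the largest event timestamp seen so far by a constant, and anything at or behind it is treated as late. A minimal sketch for contrast (the paper's contribution is to make the delay and emission timing adaptive, which is not shown here):

```python
def make_watermark(events, delay):
    """Classify (timestamp, payload) events against a fixed-delay
    watermark: watermark = max timestamp seen so far minus `delay`.
    Events at or behind the watermark are late and would be dropped."""
    max_ts = float("-inf")
    kept, late = [], []
    for ts, payload in events:
        watermark = max_ts - delay
        (late if ts <= watermark else kept).append((ts, payload))
        max_ts = max(max_ts, ts)
    return kept, late

events = [(1, "a"), (2, "b"), (5, "c"), (2, "d"), (4, "e")]
kept, late = make_watermark(events, delay=2)
# after seeing ts=5 the watermark is 3, so the straggler (2, "d") is late
```

A larger `delay` keeps more stragglers at the cost of slower results; the adaptive mechanism tunes this trade-off from predicted lateness instead of fixing it in advance.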