Iced transmission line galloping poses a significant threat to the safety and reliability of power systems, leading directly to line tripping, disconnections, and power outages. Existing early warning methods for iced transmission line galloping suffer from issues such as reliance on a single data source, neglect of irregular time series, and lack of attention-based closed-loop feedback, resulting in high rates of missed and false alarms. To address these challenges, we propose an Internet of Things (IoT) empowered early warning method for transmission line galloping that integrates time series data from optical fiber sensing and weather forecasts. Initially, the method applies a primary adaptive weighted fusion to the IoT-empowered optical fiber real-time sensing data and weather forecast data, followed by a secondary fusion based on a Back Propagation (BP) neural network, and uses the K-medoids algorithm to cluster the fused data. Furthermore, an adaptive irregular time series perception adjustment module is introduced into the traditional Gated Recurrent Unit (GRU) network, and closed-loop feedback based on an attention mechanism is employed to update network parameters through gradient feedback of the loss function, enabling closed-loop training and time series prediction with the GRU network model. Subsequently, considering various types of prediction data and the duration of icing, an iced transmission line galloping risk coefficient is established, and warnings are categorized based on this coefficient. Finally, using an IoT-driven realistic dataset of iced transmission line galloping, the effectiveness of the proposed method is validated through multi-dimensional simulation scenarios.
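The primary adaptive weighted fusion step described above can be sketched as an inverse-noise-variance combination of the two streams. This is a minimal illustration only, not the paper's exact scheme: the residual-based noise estimate, the window length, and all variable names are assumptions.

```python
import numpy as np

def noise_var(x, w=11):
    """Crude noise estimate: variance of the residual after a centered moving average."""
    trend = np.convolve(x, np.ones(w) / w, mode="same")
    return np.var((x - trend)[w:-w])  # drop edge-affected samples

def adaptive_weighted_fusion(x_fiber, x_forecast):
    """Weight each stream by the reciprocal of its estimated noise variance,
    so the noisier source contributes less to the fused series."""
    w1, w2 = 1.0 / noise_var(x_fiber), 1.0 / noise_var(x_forecast)
    return (w1 * x_fiber + w2 * x_forecast) / (w1 + w2)

rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0, 6, 400))
fiber = truth + rng.normal(0, 0.05, 400)     # low-noise optical fiber stream (toy)
forecast = truth + rng.normal(0, 0.30, 400)  # noisier weather-forecast stream (toy)
fused = adaptive_weighted_fusion(fiber, forecast)
```

In this toy setup the fused series tracks the truth nearly as well as the cleaner stream while still incorporating the forecast.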
Nonlinear variations in the coordinate time series of global navigation satellite system (GNSS) reference stations are strongly correlated with surface displacements caused by environmental loading effects, including atmospheric, hydrological, and nontidal ocean loading. Continuous improvements in the accuracy of surface mass loading products, the performance of Earth models, and precise data-processing technologies have significantly advanced research on the effects of environmental loading on nonlinear variations in GNSS coordinate time series. However, owing to theoretical limitations, the lack of high spatiotemporal resolution surface mass observations, and the coupling of GNSS technology-related systematic errors, environmental loading and nonlinear GNSS reference station displacements remain inconsistent. The applicability and capability of these loading products across different regions also require further evaluation. This paper outlines methods for modeling environmental loading, surface mass loading products, and service organizations. In addition, it summarizes recent advances in applying environmental loading to address nonlinear variations in global and regional GNSS coordinate time series. Moreover, the scientific questions raised by existing studies are summarized, and insights into future research directions are provided. The complex nonlinear motion of reference stations is a major factor limiting the accuracy of the current terrestrial reference frame.
Further refining the environmental load modeling method, establishing a surface mass distribution model with high spatiotemporal resolution and reliability, exploring other environmental load factors such as ice sheet and artificial mass-change effects, and developing an optimal data-processing model and strategy for reprocessing global reference station data consistently could contribute to the development of a millimeter-level nonlinear motion model for GNSS reference stations with actual physical significance and provide theoretical support for establishing a terrestrial reference frame with 1 mm accuracy by 2050.
Active landslides pose a significant threat globally, endangering lives and property. Effective monitoring and forecasting of displacements are essential for timely warnings and mitigation of these events. Interferometric synthetic aperture radar (InSAR) stands out as an efficient and prevalent tool for monitoring landslide deformation and offers new prospects for displacement prediction. However, challenges such as the inherent limitations of satellite viewing geometry, long revisit cycles, and limited data volume hinder its application in displacement forecasting, notably for landslides with near-north-south deformation that is less detectable by InSAR. To address these issues, we propose a novel strategy for predicting three-dimensional (3D) landslide displacement, integrating InSAR and global navigation satellite system (GNSS) measurements with machine learning (ML). This framework first synergizes InSAR line-of-sight (LOS) results with GNSS horizontal data to reconstruct 3D displacement time series. It then employs ML models to capture complex nonlinear relationships between external triggers, landslide evolutionary states, and 3D displacements, thus enabling accurate future deformation predictions. Utilizing four advanced ML algorithms, i.e., random forest (RF), support vector machine (SVM), long short-term memory (LSTM), and gated recurrent unit (GRU), with Bayesian optimization (BO) for hyperparameter tuning, we applied this innovative approach to the north-facing, slow-moving Xinpu landslide in the Three Gorges Reservoir Area (TGRA) of China. Leveraging over 6.5 years of Sentinel-1 satellite data and GNSS measurements, our framework demonstrates satisfactory and robust prediction performance, with an average root mean square deviation (RMSD) of 9.62 mm and a correlation coefficient (CC) of 0.996. This study presents a promising strategy for 3D displacement prediction, illustrating the efficacy of integrating InSAR monitoring with ML forecasting in enhancing landslide early warning capabilities.
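The first stage of the framework, combining InSAR LOS observations with GNSS horizontal components to recover the third dimension, reduces to solving the LOS projection equation for the vertical term. The sketch below uses an illustrative right-looking SAR geometry; the incidence angle, heading, and sign conventions are assumptions, not values from the study.

```python
import numpy as np

# Illustrative east/north/up -> LOS projection coefficients
# (angles and signs are assumed; conventions vary by processor).
inc = np.deg2rad(39.0)          # incidence angle
az = np.deg2rad(-10.0)          # satellite track heading
s = np.array([
    -np.sin(inc) * np.cos(az),  # east coefficient
    np.sin(inc) * np.sin(az),   # north coefficient
    np.cos(inc),                # up coefficient
])

def vertical_from_los(d_los, east, north, s):
    """Solve d_los = s_e*e + s_n*n + s_u*u for the vertical component u,
    given GNSS-observed horizontal displacements e and n."""
    return (d_los - s[0] * east - s[1] * north) / s[2]

# Synthetic check: project a known 3-D series to LOS, then recover "up".
t = np.linspace(0, 1, 50)
east, north, up = 8 * t, -3 * t, 12 * t ** 2   # toy displacement series, mm
d_los = s[0] * east + s[1] * north + s[2] * up
up_rec = vertical_from_los(d_los, east, north, s)
```

By construction the recovered vertical series matches the synthetic input exactly; with real data, LOS noise and geometry errors propagate into the vertical estimate scaled by 1/s_u.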
Multivariate time series forecasting is widely used in traffic planning, weather forecasting, and energy consumption. Series decomposition algorithms can help models better understand the underlying patterns of the original series to improve the forecasting accuracy of multivariate time series. However, the decomposition kernel of previous decomposition-based models is fixed, and these models have not considered the differences in frequency fluctuations between components. These problems make it difficult to analyze the intricate temporal variations of real-world time series. In this paper, we propose a series decomposition-based Mamba model, DecMamba, to capture the intricate temporal dependencies and the dependencies among different variables of multivariate time series. A variable-level adaptive kernel combination search module is designed to exchange information on different trends and periods between variables. Two backbone structures are proposed to emphasize the differences in frequency fluctuations of the seasonal and trend components. Mamba, with its superior performance, is used instead of a Transformer in the backbone structures to capture the dependencies among different variables. A new embedding block is designed to better capture the temporal features, especially for the high-frequency seasonal component, whose semantic information is difficult to acquire. A gating mechanism is introduced to the decoder in the seasonal backbone to improve prediction accuracy. A comparison with ten state-of-the-art models on seven real-world datasets demonstrates that DecMamba better models the temporal dependencies and the dependencies among different variables, guaranteeing better prediction performance for multivariate time series.
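The fixed-kernel moving-average decomposition that DecMamba's adaptive kernel search generalizes can be sketched as follows; the padding and period handling are illustrative assumptions, not the model's implementation.

```python
import numpy as np

def decompose(x, period):
    """Classic trend/seasonal/residual split with one fixed moving-average kernel.

    A single fixed kernel cannot adapt to per-variable trend and period
    differences, which is exactly the rigidity the adaptive approach targets.
    """
    pad = period // 2
    xp = np.pad(x, pad, mode="edge")
    trend = np.convolve(xp, np.ones(period) / period, mode="same")[pad:pad + len(x)]
    detrended = x - trend
    one_cycle = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal = np.tile(one_cycle, len(x) // period + 1)[:len(x)]
    return trend, seasonal, x - trend - seasonal

t = np.arange(240)
x = 0.01 * t + np.sin(2 * np.pi * t / 24)   # linear trend + 24-step seasonality
trend, seasonal, resid = decompose(x, 24)
```

The three components sum back to the original series exactly, and the seasonal part repeats with the chosen period.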
Time series anomaly detection is crucial in finance, healthcare, and industrial monitoring. However, traditional methods often face challenges when handling time series data, such as limited feature extraction capability, poor temporal dependency handling, and suboptimal real-time performance, sometimes even neglecting the temporal relationships between data points. To address these issues and improve anomaly detection performance by better capturing temporal dependencies, we propose an unsupervised time series anomaly detection method, VLT-Anomaly. First, we enhance the Variational Autoencoder (VAE) module by redesigning its network structure to better suit anomaly detection through data reconstruction. We introduce hyperparameters to control the weight of the Kullback-Leibler (KL) divergence term in the Evidence Lower Bound (ELBO), thereby improving the encoder module's decoupling and expressive power in the latent space, which yields more effective latent representations of the data. Next, we incorporate Transformer and Long Short-Term Memory (LSTM) modules to estimate the long-term dependencies of the latent representations, capturing both forward and backward temporal relationships and performing time series forecasting. Finally, we compute the reconstruction error by averaging the predicted results and the decoder reconstruction, and detect anomalies through a grid search for optimal threshold values. Experimental results demonstrate that the proposed method achieves superior anomaly detection performance on multiple public time series datasets, effectively extracting complex time-related features and enabling efficient computation and real-time anomaly detection. It improves detection accuracy and robustness while reducing false positives and false negatives.
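The KL-weighting idea, a β-VAE-style hyperparameter on the ELBO, can be written down directly. The Gaussian-likelihood reconstruction term and the name `beta` are assumptions for illustration; the paper's exact weighting is not reproduced.

```python
import numpy as np

def neg_elbo(x, x_hat, mu, logvar, beta=0.5):
    """Negative ELBO with a tunable KL weight.

    beta < 1 favors reconstruction fidelity; beta > 1 pushes the latent
    posterior toward the N(0, I) prior (stronger disentangling pressure).
    """
    recon = np.mean((x - x_hat) ** 2)                           # Gaussian log-likelihood up to constants
    kl = -0.5 * np.mean(1 + logvar - mu ** 2 - np.exp(logvar))  # KL(q(z|x) || N(0, I))
    return recon + beta * kl

x = np.ones((4, 8))
mu, logvar = np.full((4, 3), 0.7), np.zeros((4, 3))  # a posterior shifted off the prior
```

When the reconstruction is perfect and the posterior matches the prior, the loss is zero; raising `beta` penalizes any posterior shift more strongly.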
Time series forecasting is important in the fields of finance, energy, and meteorology, but traditional methods often fail to cope with the complex nonlinear and nonstationary processes of real data. In this paper, we propose the FractalNet-LSTM model, which combines fractal convolutional units with recurrent long short-term memory (LSTM) layers to model time series efficiently. To test the effectiveness of the model, data with complex structures and patterns, in particular with seasonal and cyclical effects, were used. To better demonstrate the obtained results and the conclusions drawn, the model's performance was shown on datasets of electricity consumption, sunspot activity, and Spotify stock price. The results showed that the proposed model outperforms traditional approaches at medium forecasting horizons and demonstrates high accuracy for data with long-term and cyclical dependencies. However, for financial data with high volatility, the model's efficiency decreases at long forecasting horizons, indicating the need for further adaptation. The findings suggest that integrating fractal properties into neural network architectures improves the accuracy of time series forecasting and can be useful for developing more accurate and reliable forecasting systems in various industries.
Site index (SI) is determined from top height development and is a proxy for forest productivity, defined as the expected top height for a given species at a certain index age. In Norway, an index age of 40 years is used. By using bi-temporal airborne laser scanning (ALS) data, SI can be determined using models estimated from SI observed on field plots (the direct approach) or from predicted top heights at two points in time (the height differential approach). Time series of ALS data may enhance SI determination compared to conventional methods used in operational forest inventory by providing more detailed information about top height development. We used longitudinal data comprising spatially consistent field and ALS data collected from training plots in 1999, 2010, and 2022 to determine SI using the direct and height differential approaches with all combinations of years, and performed an external validation. We also evaluated the use of data assimilation. Values of root mean square error obtained from external validation were in the ranges of 16.3%–21.4% and 12.8%–20.6% of the mean field-registered SI for the direct approach and the height differential approach, respectively. There were no statistically significant effects of time series length or the number of points in time on the obtained accuracies. Data assimilation did not result in any substantial improvement in the obtained accuracies. Although a time series of ALS data did not yield greater accuracies compared to using only two points in time, a larger proportion of the study area could be used in ALS-based determination of SI when a time series was available. This was because areas that were unsuitable for SI determination between two points in time could be subject to SI determination based on data from another part of the time series.
To address the global issue of climate change and create focused mitigation plans, accurate CO₂ emissions forecasting is essential. Using CO₂ emissions data from 1990 to 2023, this study assesses the predictive performance of five sophisticated models: Random Forest (RF), XGBoost, Support Vector Regression (SVR), Long Short-Term Memory networks (LSTM), and ARIMA. To give a thorough evaluation of the models' performance, measures including Mean Absolute Error (MAE), Root Mean Square Error (RMSE), and Mean Absolute Percentage Error (MAPE) are used. To guarantee dependable model implementation, preprocessing procedures such as feature engineering and stationarity tests are carried out. The machine learning models outperform ARIMA in identifying complex patterns and long-term associations, but ARIMA does better with data that exhibits strong linear trends. These results provide important information about how well each model fits various forecasting scenarios, which helps develop data-driven carbon reduction programs. Predictive modeling should be incorporated into sustainable climate policy to encourage the adoption of low-carbon technologies and proactive decision-making. Achieving long-term environmental sustainability requires strengthening carbon trading systems, encouraging clean energy investments, and enacting stronger emission laws. In line with international climate goals, suggestions for lowering CO₂ emissions include switching to renewable energy, increasing energy efficiency, and putting afforestation initiatives into action.
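The three reported error measures have standard closed forms, shown here for reference; the percentage convention for MAPE is the usual one, and the study's exact scaling is assumed.

```python
import numpy as np

def mae(y, y_hat):
    """Mean Absolute Error."""
    return float(np.mean(np.abs(y - y_hat)))

def rmse(y, y_hat):
    """Root Mean Square Error."""
    return float(np.sqrt(np.mean((y - y_hat) ** 2)))

def mape(y, y_hat):
    """Mean Absolute Percentage Error; undefined when y contains zeros,
    but emissions totals are safely positive."""
    return float(np.mean(np.abs((y - y_hat) / y)) * 100.0)

y = np.array([100.0, 200.0, 400.0])      # toy "observed emissions"
y_hat = np.array([110.0, 190.0, 440.0])  # toy forecasts
```

For these toy values, MAE is 20, RMSE is √600 ≈ 24.49 (RMSE weights the one large error more heavily than MAE), and MAPE is about 8.33%.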
Timely identification and forecasting of the maize tasseling date (TD) are very important for agronomic management, yield prediction, and crop phenotype estimation. Remote sensing-based phenology monitoring has mostly relied on time series spectral index data from the complete growth season. A recent development in maize phenology detection research is to use canopy height (CH) data instead of spectral indices, but its robustness across multiple treatments and stages has not been confirmed. Meanwhile, because data from a complete growth season are needed, the need for timely in-season TD identification remains unmet. This study proposed an approach to timely identify and forecast the maize TD. We obtained RGB and light detection and ranging (LiDAR) data using an unmanned aerial vehicle platform over plots of different maize varieties under multiple treatments. After CH estimation, the feature points (inflection points) from the logistic curve of the CH time series were extracted as TD. We examined the impact of various independent variables (day of year vs. accumulated growing degree days (AGDD)), sensors (RGB and LiDAR), time series denoising methods, different feature points, and temporal resolution on TD identification. Lastly, we used early CH time series data to predict height growth and further forecast TD. The results showed that using the 99th percentile of the plot-scale digital surface model and the minimum digital terrain model from LiDAR to estimate maize CH was the most stable across treatments and stages (R²: 0.928 to 0.943). For TD identification, the best performance was achieved by using LiDAR data with AGDD as the independent variable, combined with the knee point method, resulting in an RMSE of 2.95 d. The high accuracy was maintained at temporal resolutions as coarse as 14 d. The TD forecast became more accurate as the CH time series extended. The optimal timing for forecasting TD was when the CH exceeded half of its maximum. Using only LiDAR CH data below 1.6 m and empirical growth rate estimates, the forecasted TD showed an RMSE of 3.90 d. In conclusion, this study exploited the growth characteristics of maize height to provide a practical approach for the timely identification and forecasting of the maize TD.
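Extracting TD as a feature point of a logistic CH curve can be approximated numerically as the point of steepest growth, which for a symmetric logistic coincides with the inflection. The AGDD scale and curve parameters below are illustrative assumptions, not values from the study.

```python
import numpy as np

def tasseling_point(agdd, ch):
    """Return the AGDD at which canopy height grows fastest,
    i.e., the inflection of a logistic-shaped CH time series."""
    growth = np.gradient(ch, agdd)   # numerical dCH/dAGDD
    return agdd[np.argmax(growth)]

agdd = np.linspace(0.0, 1600.0, 401)                # accumulated growing degree days (toy)
ch = 2.6 / (1.0 + np.exp(-0.01 * (agdd - 800.0)))   # logistic CH, 2.6 m plateau, inflection at 800
td = tasseling_point(agdd, ch)
```

With a noise-free logistic curve the recovered point lands on the true inflection; with real CH data the series would first be denoised, as the abstract describes.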
Load time series analysis is critical for resource management and optimization decisions, especially automated analysis techniques. Existing research has insufficiently interpreted the overall characteristics of samples, leading to significant differences in load level detection conclusions for samples with different characteristics (trend, seasonality, cyclicality). Achieving automated, feature-adaptive, and quantifiable analysis methods remains a challenge. This paper proposes a Threshold Recognition-based Load Level Detection Algorithm (TRLLD), which effectively identifies different load level regions in samples of arbitrary size and distribution type based on sample characteristics. By utilizing distribution density uniformity, the algorithm classifies data points and ultimately obtains normalized load values. In the feature recognition step, the algorithm employs the Density Uniformity Index Based on Differences (DUID), High Load Level Concentration (HLLC), and Low Load Level Concentration (LLLC) to assess sample characteristics. These indices are independent of specific load values, providing a standardized perspective on features and ensuring high efficiency and strong interpretability. Compared to traditional methods, the proposed approach demonstrates better adaptive and real-time analysis capabilities. Experimental results indicate that it can effectively identify high-load and low-load regions in 16 groups of time series samples with different load characteristics, yielding highly interpretable results. The correlation between the DUID and sample density distribution uniformity reaches 98.08%. When introducing 10% MAD-intensity noise, the maximum relative error is 4.72%, showcasing high robustness. Notably, it exhibits significant advantages in general and low-sample scenarios.
Objectives: This study aimed to explore the characteristics of visit fluctuation at an outpatient blood collection center and nursing workforce allocation based on a time series model, and the application effect was evaluated. Methods: To enhance the efficiency of phlebotomy at the hospital outpatient window and improve patient satisfaction, the First Affiliated Hospital of Nanjing Medical University implemented a time series analysis model in 2024 to optimize nursing staff allocation. The management team was led by a head nurse of the outpatient blood collection department with extensive experience. It included one director of the nursing department, six senior clinical nurses, one informatics expert, and one nursing master's degree holder. Retrospective time series data from the hospital's smart blood collection system (including hourly blood collection volumes and waiting times) were extracted between January 2020 and December 2023. Time series analysis was used to identify annual, seasonal, monthly, and hourly variation patterns in blood collection volumes. Seasonal decomposition and the Autoregressive Integrated Moving Average (ARIMA) model were employed to forecast blood collection fluctuations for 2024 and facilitate dynamic scheduling. A comparison was conducted to evaluate differences in blood collection efficiency and patient satisfaction before (January–June 2023) and after (January–June 2024) implementing the dynamic scheduling model based on the time series analysis and forecasting. Results: Visit volumes showed periodicity and slow growth, peaking in the second and third quarters of each year and daily at 8:00–9:00 a.m. and 2:00–3:00 p.m. The ARIMA model demonstrated a good fit (R² = 0.692, mean absolute percentage error = 8.28%). After adjusting the nursing staff allocation based on the fluctuation characteristics of hourly phlebotomy volumes in the time series analysis model, at least three nurses, one mobile nurse, and two volunteers were added during the peak period at the blood collection window. The number of phlebotomies per hour increased from 289.74 ± 54.55 to 327.53 ± 37.84 person-times (t = -10.041, P < 0.01), waiting time decreased from 5.79 ± 2.68 to 4.01 ± 0.46 min (t = 11.531, P < 0.01), and satisfaction rose from 92.7% to 97.3% (χ² = 6.877, P < 0.05). Conclusions: The time series analysis method helps nursing managers accurately allocate human resources and optimize the efficiency of outpatient service resources by mining the characteristic variation patterns of the outpatient blood collection window and predicting future fluctuation trends.
Predictive maintenance often involves imbalanced multivariate time series datasets with scarce failure events, posing challenges for model training due to the high dimensionality of the data and the need for domain-specific preprocessing, which frequently leads to the development of large and complex models. Inspired by the success of Large Language Models (LLMs), transformer-based foundation models have been developed for time series (TSFM). These models have been proven to reconstruct time series in a zero-shot manner, being able to capture different patterns that effectively characterize time series. This paper proposes the use of TSFM to generate embeddings of the input data space, making them more interpretable for machine learning models. To evaluate the effectiveness of our approach, we trained three classical machine learning algorithms and one neural network using the embeddings generated by the TSFM called Moment to predict the remaining useful life of aircraft engines. We tested the models trained with both the full training dataset and only 10% of the training samples. Our results show that training simple models, such as support vector regressors or neural networks, with embeddings generated by Moment not only accelerates the training process but also enhances performance in few-shot learning scenarios, where data is scarce. This suggests a promising alternative to complex deep learning architectures, particularly in industrial contexts with limited labeled data.
Accurate forecasting of oil production is essential for optimizing resource management and minimizing operational risks in the energy sector. Traditional time-series forecasting techniques, despite their widespread application, often encounter difficulties in handling the complexities of oil production data, which is characterized by non-linear patterns, skewed distributions, and the presence of outliers. To overcome these limitations, deep learning methods have emerged as more robust alternatives. However, while deep neural networks offer improved accuracy, they demand substantial amounts of data for effective training. Conversely, shallow networks with fewer layers lack the capacity to model complex data distributions adequately. To address these challenges, this study introduces a novel hybrid model called Transfer LSTM to GRU (TLTG), which combines the strengths of deep and shallow networks using transfer learning. The TLTG model integrates Long Short-Term Memory (LSTM) networks and Gated Recurrent Units (GRU) to enhance predictive accuracy while maintaining computational efficiency. A Gaussian transformation is applied to the input data to reduce outliers and skewness, creating a more normal-like distribution. The proposed approach is validated on datasets from various wells in the Tahe oil field, China. Experimental results highlight the superior performance of the TLTG model, achieving 100% accuracy and faster prediction times (200 s) compared with eight other approaches, demonstrating its effectiveness and efficiency.
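The Gaussian transformation step, reducing skew and outlier influence before training, can be illustrated with a simple log-type transform. This is a toy stand-in under stated assumptions: the study's exact transform is not reproduced, and the shifted `log1p` is purely illustrative.

```python
import numpy as np

def skewness(x):
    """Sample skewness: third standardized moment."""
    z = (x - x.mean()) / x.std()
    return float(np.mean(z ** 3))

def gaussianize(x):
    """Toy stand-in for a Gaussian transformation: compress the long
    right tail of production-like data with a shifted log."""
    return np.log1p(x - x.min())

rng = np.random.default_rng(0)
raw = rng.lognormal(mean=0.0, sigma=1.0, size=5000)  # heavy right tail, like skewed production data
transformed = gaussianize(raw)
```

On this lognormal sample the transform sharply reduces skewness, giving downstream recurrent layers a more normal-like input distribution.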
Deep learning (DL) has revolutionized time series forecasting (TSF), surpassing traditional statistical methods (e.g., ARIMA) and machine learning techniques in modeling the complex nonlinear dynamics and long-term dependencies prevalent in real-world temporal data. This comprehensive survey reviews state-of-the-art DL architectures for TSF, focusing on four core paradigms: (1) Convolutional Neural Networks (CNNs), adept at extracting localized temporal features; (2) Recurrent Neural Networks (RNNs) and their advanced variants (LSTM, GRU), designed for sequential dependency modeling; (3) Graph Neural Networks (GNNs), specialized for forecasting structured relational data with spatial-temporal dependencies; and (4) Transformer-based models, leveraging self-attention mechanisms to capture global temporal patterns efficiently. We provide a rigorous analysis of the theoretical underpinnings, recent algorithmic advancements (e.g., TCNs, attention mechanisms, hybrid architectures), and practical applications of each framework, supported by extensive benchmark datasets (e.g., ETT, traffic flow, financial indicators) and standardized evaluation metrics (MAE, MSE, RMSE). Critical challenges, including handling irregular sampling intervals, integrating domain knowledge for robustness, and managing computational complexity, are thoroughly discussed. Emerging research directions highlighted include diffusion models for uncertainty quantification, hybrid pipelines combining classical statistical and DL techniques for enhanced interpretability, quantile regression with Transformers for risk-aware forecasting, and optimizations for real-time deployment. This work serves as an essential reference, consolidating methodological innovations, empirical resources, and future trends to bridge the gap between theoretical research and the practical implementation needs of researchers and practitioners in the field.
As a category of recurrent neural networks, echo state networks (ESNs) have been the topic of in-depth investigation and extensive application in a diverse array of fields, with spectacular triumphs achieved. Nevertheless, the traditional ESN and the majority of its variants are devised in light of the second-order statistical information of data (e.g., variance and covariance), while further information is neglected. In the context of information-theoretic learning, correntropy demonstrates the capacity to extract more information from data. Therefore, under the guidelines of the maximum correntropy criterion, this paper proposes a correntropy-based echo state network (CESN) in which the first-order and higher-order information of data is captured, promoting robustness to noise. Furthermore, an incremental learning algorithm for the CESN is presented, which can update the CESN when new data arrive, eliminating the need to retrain the network from scratch. Finally, experiments on benchmark problems and comparisons with existing works are provided to verify the effectiveness and superiority of the proposed CESN.
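A minimal ESN, with a fixed random reservoir and a ridge readout, shows the base architecture the CESN modifies; the correntropy-based criterion itself is not implemented here, and all sizes, scalings, and the spectral-radius value are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))       # fixed input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))         # fixed recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # rescale for the echo state property

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x, states = np.zeros(n_res), []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# One-step-ahead prediction of a sine wave; only the readout is trained.
t = np.linspace(0, 20 * np.pi, 2000)
u, y = np.sin(t[:-1]), np.sin(t[1:])
X = run_reservoir(u)[200:]                         # drop the washout transient
Y = y[200:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)
pred = X @ W_out
```

The least-squares readout is exactly the second-order-statistics training the abstract criticizes: replacing this objective with the maximum correntropy criterion is what the CESN contributes.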
GNSS time series analysis provides an effective method for research on the Earth's surface deformation, and it can be divided into two parts: deterministic models and stochastic models. The former can be represented by several parameters, such as polynomial terms, periodic terms, offsets, and post-seismic models. The latter contains stochastic noise, whose estimation can be affected by the detection of the former parameters. If not enough parameters are assumed, modeling errors will occur and adversely affect the analysis results. In this study, we propose a processing strategy in which the commonly used first-order polynomial term can be replaced with different orders to better fit GNSS time series of the Crustal Movement Observation Network of China (CMONOC) stations. Initially, we use the Bayesian Information Criterion (BIC) to identify the best order within the range of 1–4 during the fitting process using the white noise plus power-law noise (WN+PL) model. Then, we compare the effects of the first-order and the optimal-order polynomials on the deterministic models in GNSS time series, including the velocity and its uncertainty, and the amplitudes and initial phases of the annual signals. The results indicate that the first-order polynomial is not always the best choice for GNSS time series. The root mean square (RMS) reduction rates of almost all station components are positive, which means the new fitting with the optimal-order polynomial helps to reduce the RMS of the residual series. Most stations maintain the velocity difference (VD) within ±1 mm/yr, with percentages of 85.6%, 81.9%, and 63.4% in the North, East, and Up components, respectively. As for the annual signals, the numbers of stations with an amplitude difference (AD) within ±0.2 mm are 242, 239, and 200 in the three components, accounting for 99.6%, 98.4%, and 82.3%, respectively. This finding reminds us that detection of the optimal-order polynomial is necessary when we aim to acquire an accurate understanding of crustal movement features.
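The BIC-based order selection can be sketched with an ordinary least-squares polynomial fit under a Gaussian-residual assumption, BIC = n·ln(RSS/n) + k·ln(n) with k = order + 1 coefficients. This is a simplified stand-in: the study's selection runs jointly with a WN+PL noise model, which is not reproduced here.

```python
import numpy as np

def poly_bic(t, y, orders=(1, 2, 3, 4)):
    """Fit each candidate polynomial order and score it by BIC;
    return the best order and the full score table."""
    n = len(y)
    scores = {}
    for p in orders:
        resid = y - np.polyval(np.polyfit(t, y, p), t)
        rss = float(np.sum(resid ** 2))
        scores[p] = n * np.log(rss / n) + (p + 1) * np.log(n)
    return min(scores, key=scores.get), scores

rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 300)                             # epochs, years (toy)
y = 1.2 * t + 0.05 * t ** 2 + rng.normal(0, 0.05, t.size)   # mm, motion with a quadratic term
best, scores = poly_bic(t, y)
```

For this synthetic station the quadratic fit decisively beats the first-order model, while the ln(n) penalty discourages over-fitting with orders 3 and 4.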
In high-risk industrial environments like nuclear power plants, precise defect identification and localization are essential for maintaining production stability and safety. However, the complexity of such a harsh environment leads to significant variations in the shape and size of defects. To address this challenge, we propose the multivariate time series segmentation network (MSSN), which adopts a multiscale convolutional network with multi-stage and depth-separable convolutions for efficient feature extraction through variable-length templates. To tackle the classification difficulty caused by structural signal variance, MSSN employs logarithmic normalization to adjust instance distributions. Furthermore, it integrates classification with smoothing loss functions to accurately identify defect segments amid similar structural and defect signal subsequences. Evaluated on both the Mackey-Glass dataset and an industrial dataset, our algorithm achieves over 95% localization accuracy and demonstrates its capture capability on the synthetic dataset. On a nuclear plant's heat transfer tube dataset, it captures 90% of defect instances with a 75% middle localization F1 score.
Extracting typical operational scenarios is essential for making flexible decisions in the dispatch of a new power system. A novel deep time series aggregation scheme (DTSAs) is proposed to generate typical operational scenarios, considering the large amount of historical operational snapshot data. Specifically, DTSAs analyse the intrinsic mechanisms of switching between different scheduling operational scenarios to mathematically represent typical operational scenarios. A Gramian angular summation field-based operational scenario image encoder was designed to convert operational scenario sequences into high-dimensional spaces. This enables DTSAs to fully capture the spatiotemporal characteristics of new power systems using deep feature iterative aggregation models. The encoder also facilitates the generation of typical operational scenarios that conform to historical data distributions while ensuring the integrity of grid operational snapshots. Case studies demonstrate that the proposed method extracted new fine-grained power system dispatch schemes and outperformed the latest high-dimensional feature-screening methods. In addition, experiments with different new energy access ratios were conducted to verify the robustness of the proposed method. DTSAs enable dispatchers to master the operational experience of the power system in advance and to actively respond to dynamic changes in the operational scenarios under a high access rate of new energy.
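The Gramian angular summation field (GASF) encoding mentioned above is a standard construction: rescale the series to [-1, 1], map each value to an angle via arccos, and sum angles pairwise. A minimal sketch (not the DTSAs implementation, just the textbook GASF transform):

```python
import math

def gasf(series):
    """Gramian angular summation field: rescale the series to [-1, 1],
    map each value to an angle phi = arccos(x), and form the image
    G[i][j] = cos(phi_i + phi_j)."""
    lo, hi = min(series), max(series)
    scaled = [2.0 * (v - lo) / (hi - lo) - 1.0 for v in series]
    # clamp guards against tiny floating-point overshoot outside [-1, 1]
    phi = [math.acos(max(-1.0, min(1.0, x))) for x in scaled]
    n = len(phi)
    return [[math.cos(phi[i] + phi[j]) for j in range(n)] for i in range(n)]

img = gasf([3.0, 1.0, 4.0, 1.5, 5.0])   # 5x5 "image" of a length-5 snapshot
```

The resulting matrix is symmetric, and its diagonal preserves the original values (G[i][i] = 2x̃ᵢ² − 1), which is why such images can feed convolutional feature extractors without losing the sequence information.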
Anomaly detection (AD) in time series data is widely applied across various industries for monitoring and security applications, emerging as a key research focus within the field of deep learning. While many methods based on different normality assumptions perform well in specific scenarios, they often neglect the overall normality issue. Some feature extraction methods incorporate pre-training processes, but these may not be suitable for time series anomaly detection, leading to decreased performance. Additionally, real-world time series samples are rarely free from noise, making them susceptible to outliers, which further impacts detection accuracy. To address these challenges, we propose a novel anomaly detection method called Robust One-Class Classification Detection (ROC). This approach utilizes an autoencoder (AE) to learn features while constraining the context vectors from the AE within a sufficiently small hypersphere, akin to One-Class Classification (OC) methods. By simultaneously optimizing two hypothetical objective functions, ROC captures various aspects of normality. We categorize the input raw time series into clean and outlier sequences, reducing the impact of outliers on the compressed feature representation. Experimental results on public datasets indicate that our approach outperforms existing baseline methods and substantially improves model robustness.
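The hypersphere constraint above follows the usual one-class idea: normal feature vectors should lie close to a common centre, and distance from that centre serves as the anomaly score. A minimal sketch of that scoring step (the centre, data, and threshold rule here are illustrative, not ROC's actual training objective):

```python
def oc_scores(vectors):
    """One-class scoring: anomaly score = squared distance of each
    feature vector to the hypersphere centre (mean of all vectors)."""
    dim = len(vectors[0])
    centre = [sum(v[d] for v in vectors) / len(vectors) for d in range(dim)]
    return [sum((v[d] - centre[d]) ** 2 for d in range(dim)) for v in vectors]

# toy "context vectors": points near the origin plus one far outlier
data = [[0.1, 0.0], [0.0, 0.2], [-0.1, 0.1], [0.05, -0.1], [5.0, 5.0]]
scores = oc_scores(data)
flagged = max(range(len(scores)), key=scores.__getitem__)  # highest score
```

In a trained deep one-class model the centre is fixed (or learned) and the network weights are optimized to shrink the sphere around normal data; here the mean stands in for that centre purely for illustration.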
The natural visibility graph method has been widely used in physiological signal analysis, but it fails to accurately handle signals with data points below the baseline. Such signals are common across various physiological measurements, including electroencephalography (EEG) and functional magnetic resonance imaging (fMRI), and are crucial for insights into physiological phenomena. This study introduces a novel method, the baseline perspective visibility graph (BPVG), which can analyze time series by accurately capturing connectivity across data points both above and below the baseline. We present the BPVG construction process and validate its performance using simulated signals. Results demonstrate that BPVG accurately translates periodic, random, and fractal signals into regular, random, and scale-free networks, respectively, exhibiting diverse degree distribution traits. Furthermore, we apply BPVG to classify Alzheimer's disease (AD) patients from healthy controls using EEG data and to identify non-demented adults at varying dementia risk using resting-state fMRI (rs-fMRI) data. Utilizing degree distribution entropy derived from BPVG networks, our results exceed the best accuracy benchmark (77.01%) in EEG analysis, especially at channels F4 (78.46%) and O1 (81.54%). Additionally, our rs-fMRI analysis achieves a statistically significant classification accuracy of 76.74%. These findings highlight the effectiveness of BPVG in distinguishing various time series types and its practical utility in EEG and rs-fMRI analysis for early AD detection and dementia risk assessment. In conclusion, BPVG's validation across both simulated and real data confirms its capability to capture comprehensive information from time series, irrespective of baseline constraints, providing a novel method for studying neural physiological signals.
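For context, the standard natural visibility graph that BPVG extends links two samples whenever the straight line between them clears every intermediate sample. A minimal sketch of that baseline construction (BPVG itself adds baseline-aware handling not shown here):

```python
def visibility_edges(y):
    """Natural visibility graph: nodes are samples; (a, b) are linked
    if every intermediate sample lies strictly below the straight line
    joining (a, y[a]) and (b, y[b])."""
    n, edges = len(y), set()
    for a in range(n):
        for b in range(a + 1, n):
            if all(y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                   for c in range(a + 1, b)):
                edges.add((a, b))
    return edges

dip = visibility_edges([1.0, 0.2, 1.0])    # valley: endpoints see each other
peak = visibility_edges([0.0, 1.0, 0.0])   # peak blocks the line of sight
```

This O(n²) form makes the limitation the paper targets concrete: visibility is judged only from above, so structure in sub-baseline excursions is treated the same as any other dip.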
Funding: This research was funded by the Science and Technology Project of State Grid Corporation of China under grant number 5200-202319382A-2-3-XG.
Abstract: Iced transmission line galloping poses a significant threat to the safety and reliability of power systems, leading directly to line tripping, disconnections, and power outages. Existing early warning methods for iced transmission line galloping suffer from issues such as reliance on a single data source, neglect of irregular time series, and lack of attention-based closed-loop feedback, resulting in high rates of missed and false alarms. To address these challenges, we propose an Internet of Things (IoT) empowered early warning method for transmission line galloping that integrates time series data from optical fiber sensing and weather forecasts. Initially, the method applies a primary adaptive weighted fusion to the IoT-empowered optical fiber real-time sensing data and weather forecast data, followed by a secondary fusion based on a Back Propagation (BP) neural network, and uses the K-medoids algorithm to cluster the fused data. Furthermore, an adaptive irregular time series perception adjustment module is introduced into the traditional Gated Recurrent Unit (GRU) network, and closed-loop feedback based on an attention mechanism is employed to update network parameters through gradient feedback of the loss function, enabling closed-loop training and time series prediction by the GRU network model. Subsequently, considering various types of prediction data and the duration of icing, an iced transmission line galloping risk coefficient is established, and warnings are categorized based on this coefficient. Finally, using an IoT-driven realistic dataset of iced transmission line galloping, the effectiveness of the proposed method is validated through multi-dimensional simulation scenarios.
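K-medoids clustering, used above to group the fused sensing data, differs from k-means in that each cluster centre must be an actual data point, which makes it more robust to outliers. A minimal alternating-update sketch (deterministic first-k initialization is a simplification; the paper does not specify its initialization):

```python
def kmedoids(points, k, dist, iters=20):
    """Alternating K-medoids: assign each point to its nearest medoid,
    then move each medoid to the cluster member minimizing the
    intra-cluster distance sum. Initialized with the first k points."""
    medoids = points[:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: dist(p, medoids[i]))].append(p)
        new_medoids = [min(c, key=lambda m: sum(dist(m, q) for q in c))
                       if c else medoids[i]
                       for i, c in enumerate(clusters)]
        if new_medoids == medoids:   # converged
            break
        medoids = new_medoids
    return medoids

# toy 1-D "fused measurements" forming two well-separated groups
data = [1.0, 1.2, 0.8, 10.0, 10.2, 9.8]
med = sorted(kmedoids(data, 2, lambda a, b: abs(a - b)))
```

Because medoids are real observations, each cluster representative corresponds to an actually measured operating condition, which is convenient when the clusters feed a downstream warning rule.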
Funding: Supported by the Basic Science Center Project of the National Natural Science Foundation of China (42388102), the National Natural Science Foundation of China (42174030), the Special Fund of Hubei Luojia Laboratory (220100020), the Major Science and Technology Program for Hubei Province (2022AAA002), and the Fundamental Research Funds for the Central Universities of China (2042022dx0001 and 2042023kfyq01).
Abstract: Nonlinear variations in the coordinate time series of global navigation satellite system (GNSS) reference stations are strongly correlated with surface displacements caused by environmental loading effects, including atmospheric, hydrological, and nontidal ocean loading. Continuous improvements in the accuracy of surface mass loading products, the performance of Earth models, and precise data-processing technologies have significantly advanced research on the effects of environmental loading on nonlinear variations in GNSS coordinate time series. However, owing to theoretical limitations, the lack of high spatiotemporal resolution surface mass observations, and the coupling of GNSS technology-related systematic errors, environmental loading and nonlinear GNSS reference station displacements remain inconsistent. The applicability and capability of these loading products across different regions also require further evaluation. This paper outlines methods for modeling environmental loading, surface mass loading products, and service organizations. In addition, it summarizes recent advances in applying environmental loading to address nonlinear variations in global and regional GNSS coordinate time series. Moreover, the scientific questions of existing studies are summarized, and insights into future research directions are provided. The complex nonlinear motion of reference stations is a major factor limiting the accuracy of the current terrestrial reference frame.
Further refining the environmental load modeling method, establishing a surface mass distribution model with high spatiotemporal resolution and reliability, exploring other environmental load factors such as ice sheet and artificial mass-change effects, and developing an optimal data-processing model and strategy for reprocessing global reference station data consistently could contribute to the development of a millimeter-level nonlinear motion model for GNSS reference stations with actual physical significance and provide theoretical support for establishing a terrestrial reference frame with 1 mm accuracy by 2050.
Funding: Jointly supported by the International Research Center of Big Data for Sustainable Development Goals (Grant No. CBAS2022GSP02) and the National Natural Science Foundation of China (Grant Nos. 42072320 and 42372264).
Abstract: Active landslides pose a significant threat globally, endangering lives and property. Effective monitoring and forecasting of displacements are essential for timely warnings and mitigation of these events. Interferometric synthetic aperture radar (InSAR) stands out as an efficient and prevalent tool for monitoring landslide deformation and offers new prospects for displacement prediction. However, challenges such as the inherent limitations of satellite viewing geometry, long revisit cycles, and limited data volume hinder its application in displacement forecasting, notably for landslides with near-north-south deformation that is less detectable by InSAR. To address these issues, we propose a novel strategy for predicting three-dimensional (3D) landslide displacement, integrating InSAR and global navigation satellite system (GNSS) measurements with machine learning (ML). This framework first synergizes InSAR line-of-sight (LOS) results with GNSS horizontal data to reconstruct 3D displacement time series. It then employs ML models to capture complex nonlinear relationships between external triggers, landslide evolutionary states, and 3D displacements, thus enabling accurate future deformation predictions. Utilizing four advanced ML algorithms, i.e., random forest (RF), support vector machine (SVM), long short-term memory (LSTM), and gated recurrent unit (GRU), with Bayesian optimization (BO) for hyperparameter tuning, we applied this innovative approach to the north-facing, slow-moving Xinpu landslide in the Three Gorges Reservoir Area (TGRA) of China. Leveraging over 6.5 years of Sentinel-1 satellite data and GNSS measurements, our framework demonstrates satisfactory and robust prediction performance, with an average root mean square deviation (RMSD) of 9.62 mm and a correlation coefficient (CC) of 0.996. This study presents a promising strategy for 3D displacement prediction, illustrating the efficacy of integrating InSAR monitoring with ML forecasting in enhancing landslide early warning capabilities.
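The LOS-plus-GNSS reconstruction step rests on the fact that an InSAR observation is a scalar projection of the 3D displacement onto the satellite line of sight; once the horizontal components are known from GNSS, the vertical component can be solved from a single LOS value. A minimal sketch (the sign convention of the LOS unit vector varies between processors, so the one below is an assumption, not the paper's exact formulation):

```python
import math

def los_unit_vector(incidence_deg, heading_deg):
    """LOS unit vector in (east, north, up) for a given radar incidence
    angle and satellite heading; sign conventions are an assumption here."""
    inc = math.radians(incidence_deg)
    head = math.radians(heading_deg)
    return (-math.sin(inc) * math.cos(head),
            math.sin(inc) * math.sin(head),
            math.cos(inc))

def vertical_from_los(d_los, d_east, d_north, incidence_deg, heading_deg):
    """Recover the vertical component from one LOS observation when the
    horizontal components are known (e.g., from GNSS)."""
    ae, an, au = los_unit_vector(incidence_deg, heading_deg)
    return (d_los - ae * d_east - an * d_north) / au

# round trip: project a known 3-D displacement into LOS, then invert for up
ae, an, au = los_unit_vector(39.0, -12.0)
d_los = ae * 5.0 + an * (-3.0) + au * 10.0
up = vertical_from_los(d_los, 5.0, -3.0, 39.0, -12.0)
```

The same algebra also shows the paper's stated limitation: near-north-south motion contributes little to d_los because the north coefficient is small for near-polar orbits, which is exactly why the GNSS horizontal data are needed.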
Funding: Supported in part by the Interdisciplinary Project of Dalian University (DLUXK-2023-ZD-001).
Abstract: Multivariate time series forecasting is widely used in traffic planning, weather forecasting, and energy consumption. Series decomposition algorithms can help models better understand the underlying patterns of the original series and thereby improve the forecasting accuracy of multivariate time series. However, the decomposition kernel of previous decomposition-based models is fixed, and these models have not considered the differences in frequency fluctuations between components. These problems make it difficult to analyze the intricate temporal variations of real-world time series. In this paper, we propose a series decomposition-based Mamba model, DecMamba, to capture the intricate temporal dependencies and the dependencies among different variables of multivariate time series. A variable-level adaptive kernel combination search module is designed to exchange information on different trends and periods between variables. Two backbone structures are proposed to emphasize the differences in frequency fluctuations of the seasonal and trend components. Mamba, with its superior performance, is used instead of a Transformer in the backbone structures to capture the dependencies among different variables. A new embedding block is designed to better capture the temporal features, especially for the high-frequency seasonal component, whose semantic information is difficult to acquire. A gating mechanism is introduced to the decoder in the seasonal backbone to improve prediction accuracy. A comparison with ten state-of-the-art models on seven real-world datasets demonstrates that DecMamba better models the temporal dependencies and the dependencies among different variables, guaranteeing better prediction performance for multivariate time series.
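The trend/seasonal split that decomposition-based forecasters build on is usually a moving-average decomposition. A minimal classical additive version, assuming an odd period so the moving average stays centred (DecMamba's adaptive kernels generalize this fixed-kernel form):

```python
import math

def decompose(series, period):
    """Classical additive decomposition: centred moving-average trend,
    then per-phase means of the detrended values as the seasonal profile."""
    assert period % 2 == 1, "odd period keeps the moving average centred"
    half, n = period // 2, len(series)
    trend = [None] * n
    for i in range(half, n - half):
        trend[i] = sum(series[i - half:i + half + 1]) / period
    detrended = [(series[i] - trend[i], i % period)
                 for i in range(n) if trend[i] is not None]
    seasonal = []
    for ph in range(period):
        vals = [d for d, p in detrended if p == ph]
        seasonal.append(sum(vals) / len(vals))
    return trend, seasonal

period = 5
series = [0.3 * i + math.sin(2 * math.pi * i / period) for i in range(40)]
trend, seasonal = decompose(series, period)
```

On this toy series the moving average recovers the linear trend exactly in the interior (a full-period window of a zero-mean sinusoid sums to zero), illustrating why the window length must match the seasonal period — and why a single fixed kernel struggles when variables have different periods.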
Funding: Support from the Fundamental Research Funds for Central Public Welfare Research Institutes (SK202324), the Central Guidance on Local Science and Technology Development Fund of Hebei Province (236Z0104G), the National Natural Science Foundation of China (62476078), and the Geological Survey Project of the China Geological Survey (G202304-2).
Abstract: Time series anomaly detection is crucial in finance, healthcare, and industrial monitoring. However, traditional methods often face challenges when handling time series data, such as limited feature extraction capability, poor temporal dependency handling, and suboptimal real-time performance, sometimes even neglecting the temporal relationships between data points. To address these issues and improve anomaly detection performance by better capturing temporal dependencies, we propose an unsupervised time series anomaly detection method, VLT-Anomaly. First, we enhance the Variational Autoencoder (VAE) module by redesigning its network structure to better suit anomaly detection through data reconstruction. We introduce hyperparameters to control the weight of the Kullback-Leibler (KL) divergence term in the Evidence Lower Bound (ELBO), thereby improving the encoder module's decoupling and expressive power in the latent space, which yields more effective latent representations of the data. Next, we incorporate Transformer and Long Short-Term Memory (LSTM) modules to estimate the long-term dependencies of the latent representations, capturing both forward and backward temporal relationships and performing time series forecasting. Finally, we compute the reconstruction error by averaging the predicted results and the decoder reconstruction, and we detect anomalies through a grid search for the optimal threshold value. Experimental results demonstrate that the proposed method achieves superior anomaly detection on multiple public time series datasets, effectively extracting complex time-related features and enabling efficient computation and real-time anomaly detection. It improves detection accuracy and robustness while reducing false positives and false negatives.
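The weighted KL term above is the familiar β-VAE-style reweighting of the ELBO: for a diagonal Gaussian posterior the KL divergence to a standard normal prior has a closed form, and a scalar weight trades reconstruction fidelity against latent regularity. A minimal numeric sketch of that loss (the weight name `beta` is illustrative; the paper only says "hyperparameters"):

```python
import math

def kl_gaussian(mu, log_var):
    """Closed-form KL divergence between a diagonal Gaussian
    N(mu, exp(log_var)) and the standard normal prior:
    0.5 * sum(exp(log_var) + mu^2 - 1 - log_var)."""
    return 0.5 * sum(math.exp(lv) + m * m - 1.0 - lv
                     for m, lv in zip(mu, log_var))

def weighted_elbo_loss(recon_error, mu, log_var, beta):
    """Reconstruction error plus a weighted KL term; beta < 1 relaxes the
    prior (favouring reconstruction), beta > 1 pushes latents toward it."""
    return recon_error + beta * kl_gaussian(mu, log_var)

# posterior exactly matching the prior contributes zero KL penalty
loss_match = weighted_elbo_loss(0.5, [0.0, 0.0], [0.0, 0.0], beta=4.0)
```

Down-weighting the KL term is a common fix when a VAE used for anomaly scoring over-regularizes and reconstructs anomalies too well, which is the decoupling/expressiveness trade-off the abstract refers to.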
Abstract: Time series forecasting is important in the fields of finance, energy, and meteorology, but traditional methods often fail to cope with the complex nonlinear and nonstationary processes of real data. In this paper, we propose the FractalNet-LSTM model, which combines fractal convolutional units with recurrent long short-term memory (LSTM) layers to model time series efficiently. To test the effectiveness of the model, data with complex structures and patterns, in particular with seasonal and cyclical effects, were used. The model's performance is demonstrated on datasets of electricity consumption, sunspot activity, and Spotify stock price. The results showed that the proposed model outperforms traditional approaches at medium forecasting horizons and demonstrates high accuracy for data with long-term and cyclical dependencies. However, for financial data with high volatility, the model's efficiency decreases at long forecasting horizons, indicating the need for further adaptation. The findings suggest that integrating fractal properties into neural network architecture improves the accuracy of time series forecasting and can be useful for developing more accurate and reliable forecasting systems in various industries.
Funding: Part of the Centre for Research-based Innovation SmartForest: Bringing Industry 4.0 to the Norwegian forest sector (NFR SFI project no. 309671, smartforest.no).
Abstract: Site index (SI) is determined from top height development and is a proxy for forest productivity, defined as the expected top height for a given species at a certain index age. In Norway, an index age of 40 years is used. By using bi-temporal airborne laser scanning (ALS) data, SI can be determined using models estimated from SI observed on field plots (the direct approach) or from predicted top heights at two points in time (the height differential approach). Time series of ALS data may enhance SI determination compared to conventional methods used in operational forest inventory by providing more detailed information about top height development. We used longitudinal data comprising spatially consistent field and ALS data collected from training plots in 1999, 2010, and 2022 to determine SI using the direct and height differential approaches with all combinations of years, and we performed an external validation. We also evaluated the use of data assimilation. Values of root mean square error obtained from external validation were in the ranges of 16.3%–21.4% and 12.8%–20.6% of the mean field-registered SI for the direct approach and the height differential approach, respectively. There were no statistically significant effects of time series length or the number of points in time on the obtained accuracies. Data assimilation did not result in any substantial improvement in the obtained accuracies. Although a time series of ALS data did not yield greater accuracies compared to using only two points in time, a larger proportion of the study area could be used in ALS-based determination of SI when a time series was available. This was because areas that were unsuitable for SI determination between two points in time could be subject to SI determination based on data from another part of the time series.
Abstract: To address the global issue of climate change and create focused mitigation plans, accurate CO₂ emissions forecasting is essential. Using CO₂ emissions data from 1990 to 2023, this study assesses the forecasting performance of five sophisticated models: Random Forest (RF), XGBoost, Support Vector Regression (SVR), Long Short-Term Memory networks (LSTM), and ARIMA. To give a thorough evaluation of the models' performance, measures including Mean Absolute Error (MAE), Root Mean Square Error (RMSE), and Mean Absolute Percentage Error (MAPE) are used. To guarantee dependable model implementation, preprocessing procedures such as feature engineering and stationarity tests are carried out. Machine learning models outperform ARIMA in identifying complex patterns and long-term associations, but ARIMA does better with data that exhibit strong linear trends. These results provide important information about how well each model fits various forecasting scenarios, which helps develop data-driven carbon reduction programs. Predictive modeling should be incorporated into sustainable climate policy to encourage the adoption of low-carbon technologies and proactive decision-making. Achieving long-term environmental sustainability requires strengthening carbon trading systems, encouraging clean energy investments, and enacting stronger emission laws. In line with international climate goals, suggestions for lowering CO₂ emissions include switching to renewable energy, increasing energy efficiency, and putting afforestation initiatives into action.
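The three evaluation measures used above (and in several other papers on this page) have simple closed forms. A minimal sketch with toy numbers:

```python
def mae(y, yhat):
    """Mean absolute error: average of |actual - predicted|."""
    return sum(abs(a - b) for a, b in zip(y, yhat)) / len(y)

def rmse(y, yhat):
    """Root mean square error: penalizes large misses quadratically."""
    return (sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y)) ** 0.5

def mape(y, yhat):
    """Mean absolute percentage error, in percent; undefined when an
    actual value is zero, so emissions-style strictly positive data suit it."""
    return 100.0 * sum(abs(a - b) / abs(a) for a, b in zip(y, yhat)) / len(y)

actual = [100.0, 200.0]
predicted = [110.0, 190.0]
```

For these toy values MAE and RMSE are both 10 (equal absolute errors), while MAPE is 7.5% because the same 10-unit miss is relatively smaller on the larger observation — the scale-dependence that makes reporting all three worthwhile.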
Funding: Supported by the National Science and Technology Major Project (2022ZD0115701), the Nanfan Special Project, CAAS (YBXM2305, YBXM2401, YBXM2402, PTXM2402), the National Natural Science Foundation of China (42071426, 42301427), and the Agricultural Science and Technology Innovation Program of the Chinese Academy of Agricultural Sciences.
Abstract: Timely identification and forecasting of the maize tasseling date (TD) are very important for agronomic management, yield prediction, and crop phenotype estimation. Remote sensing-based phenology monitoring has mostly relied on time series spectral index data covering the complete growth season. A recent development in maize phenology detection research is to use canopy height (CH) data instead of spectral indices, but its robustness across multiple treatments and stages has not been confirmed. Meanwhile, because data from a complete growth season are needed, the need for timely in-season TD identification remains unmet. This study proposed an approach to timely identify and forecast the maize TD. We obtained RGB and light detection and ranging (LiDAR) data using an unmanned aerial vehicle platform over plots of different maize varieties under multiple treatments. After CH estimation, the feature points (inflection points) of the logistic curve fitted to the CH time series were extracted as TD. We examined the impact of various independent variables (day of year vs. accumulated growing degree days (AGDD)), sensors (RGB and LiDAR), time series denoising methods, different feature points, and temporal resolution on TD identification. Lastly, we used early CH time series data to predict height growth and further forecast TD. The results showed that using the 99th percentile of the plot-scale digital surface model and the minimum digital terrain model from LiDAR to estimate maize CH was the most stable across treatments and stages (R²: 0.928 to 0.943). For TD identification, the best performance was achieved by using LiDAR data with AGDD as the independent variable, combined with the knee point method, resulting in an RMSE of 2.95 d. The high accuracy was maintained at temporal resolutions as coarse as 14 d. The TD forecast became more accurate as the CH time series extended. The optimal timing for forecasting TD was when the CH exceeded half of its maximum. Using only LiDAR CH data below 1.6 m and empirical growth rate estimates, the forecasted TD showed an RMSE of 3.90 d. In conclusion, this study exploited the growth characteristics of maize height to provide a practical approach for the timely identification and forecast of maize TD.
Abstract: Load time series analysis is critical for resource management and optimization decisions, especially with automated analysis techniques. Existing research has insufficiently interpreted the overall characteristics of samples, leading to significant differences in load level detection conclusions for samples with different characteristics (trend, seasonality, cyclicality). Achieving automated, feature-adaptive, and quantifiable analysis methods remains a challenge. This paper proposes a Threshold Recognition-based Load Level Detection Algorithm (TRLLD), which effectively identifies different load level regions in samples of arbitrary size and distribution type based on sample characteristics. By utilizing distribution density uniformity, the algorithm classifies data points and ultimately obtains normalized load values. In the feature recognition step, the algorithm employs the Density Uniformity Index Based on Differences (DUID), High Load Level Concentration (HLLC), and Low Load Level Concentration (LLLC) to assess sample characteristics; these indices are independent of specific load values, providing a standardized perspective on features and ensuring high efficiency and strong interpretability. Compared to traditional methods, the proposed approach demonstrates better adaptive and real-time analysis capabilities. Experimental results indicate that it can effectively identify high-load and low-load regions in 16 groups of time series samples with different load characteristics, yielding highly interpretable results. The correlation between the DUID and sample density distribution uniformity reaches 98.08%. When introducing 10% MAD-intensity noise, the maximum relative error is 4.72%, showcasing high robustness. Notably, it exhibits significant advantages in general and low-sample scenarios.
Funding: Funded by the Nursing Project "Clinical Ability Improvement Project" at the First Affiliated Hospital with Nanjing Medical University (JSPH-NC-2021-09).
Abstract: Objectives: This study aimed to explore the characteristics of outpatient blood collection center visit fluctuation and nursing workforce allocation based on a time series model, and the application effect was evaluated. Methods: To enhance the efficiency of phlebotomy at the hospital outpatient window and improve patient satisfaction, the First Affiliated Hospital with Nanjing Medical University implemented a time series analysis model in 2024 to optimize nursing staff allocation. The management team was led by a head nurse of the outpatient blood collection department with extensive experience. It included one director of the nursing department, six senior clinical nurses, one informatics expert, and one nursing master's degree holder. Retrospective time series data from the hospital's smart blood collection system (including hourly blood collection volumes and waiting times) were extracted between January 2020 and December 2023. Time series analysis was used to identify annual, seasonal, monthly, and hourly variation patterns in blood collection volumes. Seasonal decomposition and the Autoregressive Integrated Moving Average (ARIMA) model were employed to forecast blood collection fluctuations for 2024 and facilitate dynamic scheduling. A comparison was conducted to evaluate differences in blood collection efficiency and patient satisfaction before (January-June 2023) and after (January-June 2024) implementing the dynamic scheduling model based on the time series analysis and forecasting. Results: Visit volumes showed periodicity and slow growth, peaking in the second and third quarters of each year and daily at 8:00-9:00 a.m. and 2:00-3:00 p.m. The ARIMA model demonstrated a good fit (R²=0.692, mean absolute percentage error=8.28%). After adjusting the nursing staff allocation based on the fluctuation characteristics of hourly phlebotomy volumes in the time series analysis model, at least three nurses, one mobile nurse, and two volunteers were added during the peak period at the blood collection window. The number of phlebotomies per hour increased from 289.74±54.55 to 327.53±37.84 person-times (t=-10.041, P<0.01), waiting time decreased from 5.79±2.68 to 4.01±0.46 min (t=11.531, P<0.01), and satisfaction rose from 92.7% to 97.3% (χ²=6.877, P<0.05). Conclusions: Based on the time series analysis method, mining the characteristic variation patterns of the outpatient blood collection window and predicting future fluctuation trends helps nursing managers accurately allocate human resources and optimize the efficiency of outpatient service resources.
Funding: Funded by the Spanish Government and FEDER funds (AEI/FEDER, UE) under grant PID2021-124502OB-C42 (PRESECREL), and by the predoctoral program "Concepción Arenal del Programa de Personal Investigador en formación Predoctoral" funded by Universidad de Cantabria and Cantabria's Government (BOC 18-10-2021).
Abstract: Predictive maintenance often involves imbalanced multivariate time series datasets with scarce failure events, posing challenges for model training due to the high dimensionality of the data and the need for domain-specific preprocessing, which frequently leads to the development of large and complex models. Inspired by the success of Large Language Models (LLMs), transformer-based foundation models have been developed for time series (TSFM). These models have been proven to reconstruct time series in a zero-shot manner, being able to capture different patterns that effectively characterize time series. This paper proposes the use of TSFM to generate embeddings of the input data space, making it more interpretable for machine learning models. To evaluate the effectiveness of our approach, we trained three classical machine learning algorithms and one neural network using the embeddings generated by the TSFM called Moment to predict the remaining useful life of aircraft engines. We test the models trained with both the full training dataset and only 10% of the training samples. Our results show that training simple models, such as support vector regressors or neural networks, with embeddings generated by Moment not only accelerates the training process but also enhances performance in few-shot learning scenarios, where data is scarce. This suggests a promising alternative to complex deep learning architectures, particularly in industrial contexts with limited labeled data.
Abstract: Accurate forecasting of oil production is essential for optimizing resource management and minimizing operational risks in the energy sector. Traditional time-series forecasting techniques, despite their widespread application, often encounter difficulties in handling the complexities of oil production data, which is characterized by non-linear patterns, skewed distributions, and the presence of outliers. To overcome these limitations, deep learning methods have emerged as more robust alternatives. However, while deep neural networks offer improved accuracy, they demand substantial amounts of data for effective training. Conversely, shallow networks with fewer layers lack the capacity to model complex data distributions adequately. To address these challenges, this study introduces a novel hybrid model called Transfer LSTM to GRU (TLTG), which combines the strengths of deep and shallow networks using transfer learning. The TLTG model integrates Long Short-Term Memory (LSTM) networks and Gated Recurrent Units (GRU) to enhance predictive accuracy while maintaining computational efficiency. Gaussian transformation is applied to the input data to reduce outliers and skewness, creating a more normal-like distribution. The proposed approach is validated on datasets from various wells in the Tahe oil field, China. Experimental results highlight the superior performance of the TLTG model, achieving 100% accuracy and faster prediction times (200 s) compared to eight other approaches, demonstrating its effectiveness and efficiency.
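The "Gaussian transformation" above is not specified in detail; one common choice for suppressing skew and outliers before training is the rank-based inverse normal transform, sketched below as an assumption rather than the TLTG preprocessing itself (ties are not handled in this minimal version):

```python
from statistics import NormalDist

def rank_gauss(values):
    """Rank-based inverse normal transform: map the i-th smallest value
    to the standard normal quantile of (rank + 0.5)/n. Monotone, so the
    ordering is preserved while skew and outliers are flattened."""
    n = len(values)
    order = sorted(range(n), key=values.__getitem__)
    out = [0.0] * n
    for rank, idx in enumerate(order):
        out[idx] = NormalDist().inv_cdf((rank + 0.5) / n)
    return out

raw = [1.0, 2.0, 3.0, 1000.0]          # heavily right-skewed toy data
transformed = rank_gauss(raw)
```

After the transform the extreme value 1000 is mapped to the same quantile it would occupy in a normal sample, so it no longer dominates gradient-based training; the quantiles are symmetric, so the transformed values sum to approximately zero.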
基金funded by Natural Science Foundation of Heilongjiang Province,grant number LH2023F020.
Abstract: Deep learning (DL) has revolutionized time series forecasting (TSF), surpassing traditional statistical methods (e.g., ARIMA) and machine learning techniques in modeling the complex nonlinear dynamics and long-term dependencies prevalent in real-world temporal data. This comprehensive survey reviews state-of-the-art DL architectures for TSF, focusing on four core paradigms: (1) Convolutional Neural Networks (CNNs), adept at extracting localized temporal features; (2) Recurrent Neural Networks (RNNs) and their advanced variants (LSTM, GRU), designed for sequential dependency modeling; (3) Graph Neural Networks (GNNs), specialized for forecasting structured relational data with spatial-temporal dependencies; and (4) Transformer-based models, leveraging self-attention mechanisms to capture global temporal patterns efficiently. We provide a rigorous analysis of the theoretical underpinnings, recent algorithmic advancements (e.g., TCNs, attention mechanisms, hybrid architectures), and practical applications of each framework, supported by extensive benchmark datasets (e.g., ETT, traffic flow, financial indicators) and standardized evaluation metrics (MAE, MSE, RMSE). Critical challenges, including handling irregular sampling intervals, integrating domain knowledge for robustness, and managing computational complexity, are thoroughly discussed. Emerging research directions highlighted include diffusion models for uncertainty quantification, hybrid pipelines combining classical statistical and DL techniques for enhanced interpretability, quantile regression with Transformers for risk-aware forecasting, and optimizations for real-time deployment. This work serves as an essential reference, consolidating methodological innovations, empirical resources, and future trends to bridge the gap between theoretical research and practical implementation for researchers and practitioners in the field.
基金supported in part by the National Natural Science Foundation of China(62176109,62476115)the Fundamental Research Funds for the Central Universities(lzujbky-2023-ey07,lzujbky-2023-it14)+1 种基金the Natural Science Foundation of Gansu Province(24JRRA488)the Supercomputing Center of Lanzhou University
Abstract: As a category of recurrent neural networks, echo state networks (ESNs) have been the topic of in-depth investigation and extensive application in a diverse array of fields, with notable successes achieved. Nevertheless, the traditional ESN and the majority of its variants are devised in light of the second-order statistical information of the data (e.g., variance and covariance), while further information is neglected. In the context of information-theoretic learning, correntropy demonstrates the capacity to capture more information from data. Therefore, under the guidance of the maximum correntropy criterion, this paper proposes a correntropy-based echo state network (CESN) in which first-order and higher-order information of the data is captured, promoting robustness to noise. Furthermore, an incremental learning algorithm for the CESN is presented, which can update the CESN when new data arrive, eliminating the need to retrain the network from scratch. Finally, experiments on benchmark problems and comparisons with existing works are provided to verify the effectiveness and superiority of the proposed CESN.
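The quantity at the heart of the maximum correntropy criterion is the sample correntropy between two signals under a Gaussian kernel. The sketch below shows that estimator only, not the CESN training procedure itself; the kernel width `sigma` is a free parameter:

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample correntropy of x and y with a Gaussian kernel.

    Each pairwise error contributes exp(-e^2 / (2*sigma^2)), so huge
    errors saturate near zero instead of dominating the average --
    the source of the noise robustness cited in the abstract.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return np.mean(np.exp(-((x - y) ** 2) / (2.0 * sigma ** 2)))
```

Maximizing correntropy of the prediction error (equivalently, minimizing `1 - correntropy`) therefore down-weights outlier samples, in contrast to the quadratic growth of a mean-squared-error objective.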
基金supported by the National Natural Science Foundation of China(Grant Nos.42404017,42122025 and 42174030).
Abstract: GNSS time series analysis provides an effective method for research on the Earth's surface deformation, and it can be divided into two parts: deterministic models and stochastic models. The former can be described by several parameters, such as polynomial terms, periodic terms, offsets, and post-seismic models. The latter comprises stochastic noise, whose characterization is affected by how the deterministic parameters are detected. If too few parameters are assumed, modeling errors occur and adversely affect the analysis results. In this study, we propose a processing strategy in which the commonly used first-order polynomial term is replaced with different orders to better fit GNSS time series of Crustal Movement Observation Network of China (CMONOC) stations. Initially, we use the Bayesian Information Criterion (BIC) to identify the best order within the range of 1-4 during the fitting process using the white noise plus power-law noise (WN+PL) model. Then, we compare the first-order and the optimal-order fits in terms of the deterministic models of the GNSS time series, including the velocity and its uncertainty, and the amplitudes and initial phases of the annual signals. The results indicate that a first-order polynomial is not always the appropriate choice for GNSS time series. The root mean square (RMS) reduction rates of almost all station components are positive, which means the new fitting with the optimal-order polynomial helps to reduce the RMS of the residual series. Most stations maintain a velocity difference (VD) within ±1 mm/yr, with percentages of 85.6%, 81.9% and 63.4% in the North, East, and Up components, respectively. As for the annual signals, the numbers of stations with an amplitude difference (AD) within ±0.2 mm are 242, 239, and 200 in the three components, accounting for 99.6%, 98.4%, and 82.3%, respectively. This finding reminds us that detection of the optimal-order polynomial is necessary when we aim to acquire an accurate understanding of crustal movement features.
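The BIC-based order search can be sketched compactly. Note the simplification: the paper scores candidate orders under a WN+PL noise model, whereas the version below assumes i.i.d. Gaussian residuals (plain least squares), which is the textbook BIC form, not the authors' exact estimator:

```python
import numpy as np

def best_poly_order(t, y, orders=range(1, 5)):
    """Select the polynomial order 1..4 that minimizes BIC.

    BIC = n * ln(RSS / n) + k * ln(n), where k counts the fitted
    coefficients; the ln(n) penalty discourages orders that only
    chase noise.
    """
    t = np.asarray(t, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(t)
    best_order, best_bic = None, np.inf
    for p in orders:
        coef = np.polyfit(t, y, p)                     # least-squares fit
        rss = np.sum((y - np.polyval(coef, t)) ** 2)   # residual sum of squares
        bic = n * np.log(rss / n) + (p + 1) * np.log(n)
        if bic < best_bic:
            best_order, best_bic = p, bic
    return best_order
```

A genuinely cubic trend drops the RSS enormously at order 3, while going to order 4 typically buys less than the `ln(n)` penalty, so the criterion settles near the true order.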
基金supported by the National Science and Technology Major Project of the Ministry of Science and Technology of China(2024ZD0608100)the National Natural Science Foundation of China(62332017,U22A2022)
Abstract: In high-risk industrial environments like nuclear power plants, precise defect identification and localization are essential for maintaining production stability and safety. However, the complexity of such harsh environments leads to significant variations in the shape and size of defects. To address this challenge, we propose the multivariate time series segmentation network (MSSN), which adopts a multiscale convolutional network with multi-stage and depth-separable convolutions for efficient feature extraction through variable-length templates. To tackle the classification difficulty caused by structural signal variance, MSSN employs logarithmic normalization to adjust instance distributions. Furthermore, it integrates classification with smoothing loss functions to accurately identify defect segments amid similar structural and defect signal subsequences. Evaluated on both the Mackey-Glass dataset and an industrial dataset, our algorithm achieves over 95% localization and demonstrates its capture capability on a synthetic dataset. On a nuclear plant's heat-transfer-tube dataset, it captures 90% of defect instances with a 75% middle-localization F1 score.
基金The Key R&D Project of Jilin Province,Grant/Award Number:20230201067GX。
Abstract: Extracting typical operational scenarios is essential for making flexible decisions in the dispatch of a new power system. A novel deep time series aggregation scheme (DTSAs) is proposed to generate typical operational scenarios, considering the large amount of historical operational snapshot data. Specifically, DTSAs analyses the intrinsic mechanisms of switching between different scheduling operational scenarios to mathematically represent typical operational scenarios. A Gramian angular summation field-based operational scenario image encoder was designed to convert operational scenario sequences into high-dimensional spaces. This enables DTSAs to fully capture the spatiotemporal characteristics of new power systems using deep feature iterative aggregation models. The encoder also facilitates the generation of typical operational scenarios that conform to historical data distributions while ensuring the integrity of grid operational snapshots. Case studies demonstrate that the proposed method extracted new fine-grained power system dispatch schemes and outperformed the latest high-dimensional feature-screening methods. In addition, experiments with different new-energy access ratios were conducted to verify the robustness of the proposed method. DTSAs enables dispatchers to master power system operating experience in advance and to respond actively to dynamic changes in operational scenarios under a high access rate of new energy.
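The Gramian angular summation field (GASF) at the core of the image encoder has a compact standard form: rescale the series to [-1, 1], take the arccos as a polar angle, and form the matrix of cosines of angle sums. The paper's full encoder is more elaborate; this sketch shows only the basic transform:

```python
import numpy as np

def gasf(x):
    """Gramian angular summation field image of a 1-D series.

    G[i, j] = cos(phi_i + phi_j), where phi = arccos of the series
    rescaled to [-1, 1]; the result is a symmetric n x n "image"
    encoding pairwise temporal correlations.
    """
    x = np.asarray(x, dtype=float)
    x = 2.0 * (x - x.min()) / (x.max() - x.min()) - 1.0  # min-max to [-1, 1]
    phi = np.arccos(np.clip(x, -1.0, 1.0))               # polar angles
    return np.cos(phi[:, None] + phi[None, :])           # outer angle sums
```

Encoding an operational snapshot sequence this way yields a 2-D array that convolutional feature extractors can consume directly, which is the role the encoder plays in DTSAs.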
基金supported by the National Natural Science Foundation(62202118)Guizhou Province Major Project(Qiankehe Major Project[2024]014)+3 种基金Science and Scientific and Technological Research Projects from Guizhou Education Department(Qianiao ji[2023]003)Hundred-level Innovative Talent Project of Guizhou Provincial Science and Technology Department(Qiankehe Platform Talent-GCC[2023]018)Guizhou Province Major Project(Qiankehe Major Project[2024]003)Foundation of Chongqing Key Laboratory of Public Big Data Security Technology(CQKL-QJ202300001).
Abstract: Anomaly detection (AD) in time series data is widely applied across various industries for monitoring and security applications, emerging as a key research focus within the field of deep learning. While many methods based on different normality assumptions perform well in specific scenarios, they often neglect the overall normality issue. Some feature extraction methods incorporate pre-training processes, but these may not be suitable for time series anomaly detection, leading to decreased performance. Additionally, real-world time series samples are rarely free from noise, making them susceptible to outliers, which further impacts detection accuracy. To address these challenges, we propose a novel anomaly detection method called Robust One-Class Classification Detection (ROC). This approach utilizes an autoencoder (AE) to learn features while constraining the context vectors from the AE within a sufficiently small hypersphere, akin to One-Class Classification (OC) methods. By simultaneously optimizing two hypothetical objective functions, ROC captures various aspects of normality. We categorize the input raw time series into clean and outlier sequences, reducing the impact of outliers on the compressed feature representation. Experimental results on public datasets indicate that our approach outperforms existing baseline methods and substantially improves model robustness.
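The hypersphere constraint on the AE's context vectors follows the familiar one-class pattern (as in Deep SVDD): pull normal embeddings toward a center, then score test points by their distance from it. The sketch below is that generic objective, not ROC's exact pair of loss functions:

```python
import numpy as np

def hypersphere_loss(z, c):
    """Mean squared distance of embedding vectors z to center c.

    Minimizing this during training shrinks the hypersphere that
    encloses normal data; it is the one-class term, with the AE's
    reconstruction loss assumed to be handled elsewhere.
    """
    z = np.asarray(z, dtype=float)
    return np.mean(np.sum((z - c) ** 2, axis=1))

def anomaly_scores(z, c):
    """Per-sample squared distance to the center: larger = more anomalous."""
    return np.sum((np.asarray(z, dtype=float) - c) ** 2, axis=1)
```

At inference time a threshold on `anomaly_scores` separates normal windows (inside the sphere) from anomalous ones (far outside).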
基金supported by the National Key Research and Development Program of China(Grant No.2023YFF1204803)the Natural Science Foundation of Jiangsu Province,China(Grant No.BK20190736)+1 种基金the Fundamental Research Funds for the Central Universities(Grant No.NJ2024029)the National Natural Science Foundation of China(Grant Nos.81701346 and 62201265).
Abstract: The natural visibility graph method has been widely used in physiological signal analysis, but it fails to accurately handle signals with data points below the baseline. Such signals are common across various physiological measurements, including electroencephalography (EEG) and functional magnetic resonance imaging (fMRI), and are crucial for insight into physiological phenomena. This study introduces a novel method, the baseline perspective visibility graph (BPVG), which can analyze time series by accurately capturing connectivity across data points both above and below the baseline. We present the BPVG construction process and validate its performance using simulated signals. Results demonstrate that BPVG accurately translates periodic, random, and fractal signals into regular, random, and scale-free networks, respectively, exhibiting diverse degree-distribution traits. Furthermore, we apply BPVG to classify Alzheimer's disease (AD) patients from healthy controls using EEG data, and to identify non-demented adults at varying dementia risk using resting-state fMRI (rs-fMRI) data. Utilizing degree distribution entropy derived from BPVG networks, our results exceed the best accuracy benchmark (77.01%) in EEG analysis, especially at channels F4 (78.46%) and O1 (81.54%). Additionally, our rs-fMRI analysis achieves a statistically significant classification accuracy of 76.74%. These findings highlight the effectiveness of BPVG in distinguishing various time series types and its practical utility in EEG and rs-fMRI analysis for early AD detection and dementia risk assessment. In conclusion, BPVG's validation across both simulated and real data confirms its capability to capture comprehensive information from time series, irrespective of baseline constraints, providing a novel method for studying neural physiological signals.
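For reference, the standard natural visibility graph that BPVG builds on connects two samples whenever every intermediate sample lies strictly below the line segment joining them. The sketch below is this baseline NVG (with unit time spacing); BPVG's modified handling of points below the baseline is the paper's contribution and is not reproduced here:

```python
import numpy as np

def visibility_edges(y):
    """Edge list of the natural visibility graph of a 1-D series.

    Nodes are sample indices; (i, j) is an edge when every point k
    between them sits strictly below the straight line from
    (i, y[i]) to (j, y[j]) -- i.e. the two samples can "see" each
    other. O(n^2) pairs, fine for illustration.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            if all(y[k] < y[i] + (y[j] - y[i]) * (k - i) / (j - i)
                   for k in range(i + 1, j)):
                edges.append((i, j))
    return edges
```

Network statistics such as the degree-distribution entropy used in the study are then computed on the resulting graph.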