Predicting surface settlement can identify potential risks associated with shield construction. However, when tunneling beneath existing structures, surface settlement is minimal due to the high stiffness of the existing structure, making it unsuitable as a basis for risk assessment. Therefore, interlayer soil settlement was used as an evaluation index in this paper, predicted by the developed multi-parameter time series (MPTS) model. This model establishes a new dataset including time, effective stress ratio (ESR), mechanical fluctuation coefficient (MFC), and interlayer soil settlement, where ESR and MFC account for changing geological conditions. This study proposes a novel MPTS model integrating grid search (GS), nonlinear particle swarm optimization (NPSO), and support vector regression (SVR) algorithms to predict interlayer soil settlement during under-crossing construction; GS and NPSO are used to obtain the optimal hyperparameters for SVR. Sensitivity analysis based on the MPTS model was used to identify important parameters and propose specific improvement measures. A real under-crossing tunnel project was adopted to verify the effectiveness of the MPTS model. The results show that the new input parameters proposed in this paper reduce the mean absolute error (MAE) of the predictions by 20.3% and the mean square error (MSE) by 46.7%. Compared with three other algorithms, GS-NPSO-SVR has better prediction performance. Through Sobol sensitivity analysis, previous settlement, together with ESR and MFC in fully weathered and moderately weathered mudstone, is identified as the primary set of parameters affecting interlayer soil settlement. The improvement measures based on these results reduce the accumulated settlement by 79.97%. The developed MPTS model can accurately predict interlayer soil settlement and provide guidance for water stopping or reinforcement construction.
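A minimal sketch of the GS stage described above: a coarse grid search over SVR hyperparameters (C, epsilon, RBF gamma), standing in for the paper's two-stage GS-NPSO tuning. The synthetic data, grid values, and column roles (time, ESR, MFC, previous settlement) are illustrative assumptions, not the authors' dataset.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVR

rng = np.random.default_rng(0)
# Toy stand-ins for the MPTS inputs: time, ESR, MFC, previous settlement.
X = rng.uniform(0.0, 1.0, size=(120, 4))
# Synthetic interlayer-settlement target with mild noise.
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] - X[:, 2] + 0.1 * rng.standard_normal(120)

# Coarse grid; in the paper this seeds a finer NPSO refinement stage.
grid = {"C": [1.0, 10.0, 100.0], "epsilon": [0.01, 0.1], "gamma": ["scale", 0.5]}
search = GridSearchCV(SVR(kernel="rbf"), grid, cv=3,
                      scoring="neg_mean_absolute_error")
search.fit(X, y)

best_svr = search.best_estimator_
pred = best_svr.predict(X[:5])
```

In the two-stage scheme, the grid winner would initialize the swarm's search region rather than being used directly.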
Hard disk drives (HDDs) serve as the primary storage devices in modern data centers. Once a failure occurs, it often leads to severe data loss, significantly degrading the reliability of storage systems. Numerous studies have proposed machine learning-based HDD failure prediction models. However, the Self-Monitoring, Analysis, and Reporting Technology (SMART) attributes differ across HDD manufacturers. We define hard drives of the same brand and model as homogeneous HDD groups, and those from different brands or models as heterogeneous HDD groups. In practical engineering scenarios, a data center is often composed of a heterogeneous population of HDDs spanning multiple vendors and models. Existing research predominantly focuses on homogeneous datasets, ignoring the model's generalization capability across heterogeneous HDDs. As a result, HDD models with limited samples often suffer from poor training effectiveness and prediction performance. To address this issue, we investigate generalizable SMART predictors across heterogeneous HDD groups. By extracting time-series features within a fixed sliding time window, we propose a Heterogeneous Disk Failure Prediction Method based on Time Series Features (HDFPM). This method is adaptable to HDD models with limited sample sizes, thereby enhancing its applicability and robustness across diverse drive populations. Experimental results show that the proposed model achieves an F1-score of 0.9518 when applied to two different Seagate HDD models, while maintaining the False Positive Rate (FPR) below 1%. After incorporating the Complexity-Ratio Dynamic Time Warping (CDTW) based feature enhancement method, the best prediction model achieves a True Positive Rate (TPR) of up to 0.93 between the two models. For next-day failure prediction across various Seagate models, the model achieves an F1-score of up to 0.8792. Moreover, the experimental results also show that within the same brand, the higher the proportion of shared SMART attributes across different models, the better the prediction performance. In addition, HDFPM demonstrates the best stability and most significant performance in heterogeneous environments.
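The fixed sliding-window feature extraction underlying HDFPM can be sketched as follows. The statistics chosen (per-window mean, standard deviation, and least-squares slope) and the window length are illustrative assumptions; the paper's exact feature set is not specified here.

```python
import numpy as np

def window_features(series, win):
    """Return per-window mean, std, and least-squares slope for a 1-D series."""
    windows = np.lib.stride_tricks.sliding_window_view(series, win)
    means = windows.mean(axis=1)
    stds = windows.std(axis=1)
    t = np.arange(win)
    # Slope of each window against time captures degradation trends.
    slopes = np.array([np.polyfit(t, w, 1)[0] for w in windows])
    return np.column_stack([means, stds, slopes])

# Hypothetical normalized SMART attribute stream for one drive.
smart = np.array([100, 100, 99, 97, 94, 90, 85, 79], dtype=float)
feats = window_features(smart, win=4)  # shape: (5 windows, 3 features)
```

Window-level statistics like these are brand-agnostic, which is what lets a model trained on one HDD population transfer to another.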
To address the insufficient prediction accuracy of multi-state parameters in electro-hydraulic servo material fatigue testing machines under complex loading and nonlinear coupling conditions, this paper proposes a multivariate sequence-to-sequence prediction model integrating a Long Short-Term Memory (LSTM) encoder, a Gated Recurrent Unit (GRU) decoder, and a multi-head attention mechanism. This approach enhances prediction accuracy and robustness across different control modes and load spectra by leveraging multi-channel inputs and cross-variable feature interactions, thereby capturing both short-term high-frequency dynamics and long-term slow drift characteristics. Experiments using long-term data from real test benches demonstrate that the model achieves a stable MSE below 0.01 on the validation set, with MAE and RMSE of approximately 0.018 and 0.052, respectively, and a coefficient of determination reaching 0.98, significantly outperforming traditional identification methods and single RNN models. Sensitivity analysis indicates that a prediction stride of 10 achieves an optimal balance between accuracy and computational overhead. Ablation experiments validated the contribution of the multi-head attention and decoder architecture to cross-variable coupling modeling. The model can be applied to residual-driven early warning in health monitoring, and to risk assessment with scheme optimization in test design. Its near-real-time deployment feasibility provides a practical data-driven pathway for reliability assurance in advanced equipment.
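The multi-head attention mechanism named above can be written out explicitly. This numpy sketch of scaled dot-product attention with head splitting is a generic illustration of the mechanism, not the paper's trained model; the head count, sequence length, and dimensions are arbitrary.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(q, k, v, n_heads):
    """q, k, v: (seq, d_model); d_model is split evenly across heads."""
    seq, d_model = q.shape
    d_head = d_model // n_heads
    outs = []
    for h in range(n_heads):
        s = slice(h * d_head, (h + 1) * d_head)
        scores = q[:, s] @ k[:, s].T / np.sqrt(d_head)  # (seq, seq)
        attn = softmax(scores, axis=-1)                 # rows sum to 1
        outs.append(attn @ v[:, s])
    return np.concatenate(outs, axis=1)                 # (seq, d_model)

rng = np.random.default_rng(1)
x = rng.standard_normal((6, 8))  # 6 encoder time steps, d_model = 8
out = multi_head_attention(x, x, x, n_heads=2)
```

In the encoder-decoder setting, queries would come from the GRU decoder state and keys/values from the LSTM encoder outputs, letting each prediction attend to the relevant input channels and time steps.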
Multivariate time series forecasting plays a crucial role in decision-making for systems like energy grids and transportation networks, where temporal patterns emerge across diverse scales, from short-term fluctuations to long-term trends. However, existing Transformer-based methods often process data at a single resolution or handle multiple scales independently, overlooking critical cross-scale interactions that influence prediction accuracy. To address this gap, we introduce the Hierarchical Attention Transformer (HAT), which enables direct information exchange between temporal hierarchies through a novel cross-scale attention mechanism. HAT extracts multi-scale features using hierarchical convolutional-recurrent blocks, fuses them via temperature-controlled mechanisms, and optimizes gradient flow with residual connections for stable training. Evaluations on eight benchmark datasets show HAT outperforming state-of-the-art baselines, with average reductions of 8.2% in MSE and 7.5% in MAE across horizons, while achieving a 6.1x training speedup over patch-based methods. These advancements highlight HAT's potential for applications requiring multi-resolution temporal modeling.
Predicting the behavior of renewable energy systems requires models capable of generating accurate forecasts from limited historical data, a challenge that becomes especially pronounced when commissioning new facilities. This review aims to synthesize recent progress in data-efficient deep learning approaches for addressing such "cold-start" forecasting problems. It primarily covers three interrelated domains, solar photovoltaic (PV), wind power, and electrical load forecasting, where data scarcity and operational variability are most critical, while also including representative studies on hydropower and carbon emission prediction to provide a broader systems perspective. To this end, we examined trends from over 150 predominantly peer-reviewed studies published between 2019 and mid-2025, highlighting advances in zero-shot and few-shot meta-learning frameworks that enable rapid model adaptation with minimal labeled data. Moreover, transfer learning approaches combined with spatiotemporal graph neural networks have been employed to transfer knowledge from existing energy assets to new, data-sparse environments, effectively capturing hidden dependencies among geographic features, meteorological dynamics, and grid structures. Synthetic data generation has further proven valuable for expanding training samples and mitigating overfitting in cold-start scenarios. In addition, large language models and explainable artificial intelligence (XAI), notably conversational XAI systems, have been used to interpret and communicate complex model behaviors in accessible terms, fostering operator trust from the earliest deployment stages. By consolidating methodological advances, unresolved challenges, and open-source resources, this review provides a coherent overview of deep learning strategies that can shorten the data-sparse ramp-up period of new energy infrastructures and accelerate the transition toward resilient, low-carbon electricity grids.
Temporal alignment of multisensor time series (MTS) is a critical prerequisite for accurate modeling and optimal control in downstream data-driven applications. Nevertheless, many approaches neglect the complex interdependencies between different sensors in MTS, and temporal alignment is typically treated as an isolated task disconnected from the downstream objectives, leading to unsatisfactory performance in follow-up applications. To address these challenges, this paper proposes a novel knowledge graph (KG)-guided iterative-updating graph neural network (GNN) for time-delay estimation (TDE) in MTS. Initially, a domain-specific KG is constructed from domain mechanism knowledge, providing a foundation for the GNN's initialization. Next, capitalizing on the inherent structure of the graph topology, a GNN-based TDE method is developed. Then, a customized loss function is constructed that synthesizes both downstream-task performance and graph-based constraints. Moreover, an innovative algorithm for GNN structure learning and iterative updating is proposed to further refine the graph structure. Finally, experimental results across various regression and classification tasks on numerical simulations, public datasets, and a real blast furnace ironmaking dataset demonstrate that the proposed method achieves accurate temporal alignment of MTS.
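For context on the TDE task the GNN addresses, the classical baseline is to pick, for each sensor pair, the lag that maximizes cross-correlation. This numpy sketch illustrates only the problem setup, not the paper's KG-guided method; the signals and lag are synthetic.

```python
import numpy as np

def estimate_delay(ref, delayed, max_lag):
    """Return the lag (in samples) at which `delayed` best matches `ref`."""
    lags = range(0, max_lag + 1)
    scores = [np.corrcoef(ref[: len(ref) - lag], delayed[lag:])[0, 1]
              for lag in lags]
    return int(np.argmax(scores))

t = np.arange(200)
ref = np.sin(0.1 * t)            # sensor A
delayed = np.roll(ref, 7)        # sensor B lags sensor A by 7 samples
print(estimate_delay(ref, delayed, max_lag=20))  # prints 7
```

Pairwise estimators like this ignore inter-sensor dependencies and the downstream task, which is precisely the gap the KG-guided GNN is designed to close.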
Iced transmission line galloping poses a significant threat to the safety and reliability of power systems, leading directly to line tripping, disconnections, and power outages. Existing early warning methods for iced transmission line galloping suffer from reliance on a single data source, neglect of irregular time series, and lack of attention-based closed-loop feedback, resulting in high rates of missed and false alarms. To address these challenges, we propose an Internet of Things (IoT)-empowered early warning method for transmission line galloping that integrates time series data from optical fiber sensing and weather forecasts. Initially, the method applies a primary adaptive weighted fusion to the IoT-empowered optical fiber real-time sensing data and weather forecast data, followed by a secondary fusion based on a Back Propagation (BP) neural network, and uses the K-medoids algorithm to cluster the fused data. Furthermore, an adaptive irregular time series perception adjustment module is introduced into the traditional Gated Recurrent Unit (GRU) network, and attention-based closed-loop feedback is employed to update network parameters through gradient feedback of the loss function, enabling closed-loop training and time series prediction with the GRU network model. Subsequently, considering the various types of prediction data and the duration of icing, an iced transmission line galloping risk coefficient is established, and warnings are categorized based on this coefficient. Finally, using an IoT-driven realistic dataset of iced transmission line galloping, the effectiveness of the proposed method is validated through multi-dimensional simulation scenarios.
Deep learning (DL) has revolutionized time series forecasting (TSF), surpassing traditional statistical methods (e.g., ARIMA) and machine learning techniques in modeling the complex nonlinear dynamics and long-term dependencies prevalent in real-world temporal data. This comprehensive survey reviews state-of-the-art DL architectures for TSF, focusing on four core paradigms: (1) Convolutional Neural Networks (CNNs), adept at extracting localized temporal features; (2) Recurrent Neural Networks (RNNs) and their advanced variants (LSTM, GRU), designed for sequential dependency modeling; (3) Graph Neural Networks (GNNs), specialized for forecasting structured relational data with spatial-temporal dependencies; and (4) Transformer-based models, leveraging self-attention mechanisms to capture global temporal patterns efficiently. We provide a rigorous analysis of the theoretical underpinnings, recent algorithmic advancements (e.g., TCNs, attention mechanisms, hybrid architectures), and practical applications of each framework, supported by extensive benchmark datasets (e.g., ETT, traffic flow, financial indicators) and standardized evaluation metrics (MAE, MSE, RMSE). Critical challenges, including handling irregular sampling intervals, integrating domain knowledge for robustness, and managing computational complexity, are thoroughly discussed. Emerging research directions highlighted include diffusion models for uncertainty quantification, hybrid pipelines combining classical statistical and DL techniques for enhanced interpretability, quantile regression with Transformers for risk-aware forecasting, and optimizations for real-time deployment. This work serves as an essential reference, consolidating methodological innovations, empirical resources, and future trends to bridge the gap between theoretical research and the practical implementation needs of researchers and practitioners in the field.
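The standardized evaluation metrics the survey lists (MAE, MSE, RMSE) have direct definitions, written out here with illustrative values.

```python
import numpy as np

def mae(y, yhat):
    """Mean absolute error: average magnitude of the residuals."""
    return np.mean(np.abs(y - yhat))

def mse(y, yhat):
    """Mean squared error: penalizes large residuals quadratically."""
    return np.mean((y - yhat) ** 2)

def rmse(y, yhat):
    """Root mean squared error: MSE restored to the data's units."""
    return np.sqrt(mse(y, yhat))

y = np.array([3.0, 5.0, 2.0, 7.0])
yhat = np.array([2.5, 5.0, 3.0, 6.0])
print(mae(y, yhat), mse(y, yhat), rmse(y, yhat))  # 0.625 0.5625 0.75
```

Because MSE squares residuals, a model tuned for MSE behaves differently near outliers than one tuned for MAE, which is why surveys typically report both.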
Multivariate time series forecasting is widely used in traffic planning, weather forecasting, and energy consumption. Series decomposition algorithms can help models better understand the underlying patterns of the original series and improve forecasting accuracy for multivariate time series. However, the decomposition kernel of previous decomposition-based models is fixed, and these models have not considered the differences in frequency fluctuations between components. These problems make it difficult to analyze the intricate temporal variations of real-world time series. In this paper, we propose a series decomposition-based Mamba model, DecMamba, to capture the intricate temporal dependencies and the dependencies among different variables of multivariate time series. A variable-level adaptive kernel combination search module is designed to exchange information on the different trends and periods between variables. Two backbone structures are proposed to emphasize the differences in frequency fluctuations between the seasonal and trend components. Mamba, with its superior performance, is used instead of a Transformer in the backbone structures to capture the dependencies among different variables. A new embedding block is designed to better capture temporal features, especially for the high-frequency seasonal component, whose semantic information is difficult to acquire. A gating mechanism is introduced into the decoder of the seasonal backbone to improve prediction accuracy. A comparison with ten state-of-the-art models on seven real-world datasets demonstrates that DecMamba better models the temporal dependencies and the dependencies among different variables, guaranteeing better prediction performance for multivariate time series.
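The fixed-kernel decomposition that DecMamba generalizes is the classical moving-average trend/seasonal split, sketched here. The kernel length is a fixed assumption in this sketch; the paper's contribution is precisely to search kernel combinations adaptively per variable.

```python
import numpy as np

def decompose(series, kernel):
    """Split a series into a moving-average trend and a seasonal remainder."""
    pad = kernel // 2
    padded = np.pad(series, pad, mode="edge")  # edge-pad to keep length
    trend = np.convolve(padded, np.ones(kernel) / kernel, mode="valid")
    trend = trend[: len(series)]
    return trend, series - trend

t = np.arange(48, dtype=float)
series = 0.5 * t + np.sin(2 * np.pi * t / 12)  # linear trend + period-12 season
trend, seasonal = decompose(series, kernel=12)
```

With the kernel matched to the season length, the moving average cancels the periodic component and the remainder isolates it; a mismatched fixed kernel leaks season into trend, which motivates learning the kernel instead.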
Timely identification and forecasting of the maize tasseling date (TD) are very important for agronomic management, yield prediction, and crop phenotype estimation. Remote sensing-based phenology monitoring has mostly relied on time series spectral index data spanning the complete growth season. A recent development in maize phenology detection is the use of canopy height (CH) data instead of spectral indices, but its robustness across multiple treatments and stages has not been confirmed. Meanwhile, because data from a complete growth season are needed, the need for timely in-season TD identification remains unmet. This study proposes an approach to timely identify and forecast the maize TD. We obtained RGB and light detection and ranging (LiDAR) data using an unmanned aerial vehicle platform over plots of different maize varieties under multiple treatments. After CH estimation, the feature points (inflection points) of the logistic curve fitted to the CH time series were extracted as the TD. We examined the impact on TD identification of the independent variable (day of year vs. accumulated growing degree days (AGDD)), sensor (RGB vs. LiDAR), time series denoising method, choice of feature point, and temporal resolution. Lastly, we used early CH time series data to predict height growth and further forecast the TD. The results showed that using the 99th percentile of the plot-scale digital surface model and the minimum digital terrain model from LiDAR to estimate maize CH was the most stable across treatments and stages (R²: 0.928 to 0.943). For TD identification, the best performance was achieved using LiDAR data with AGDD as the independent variable, combined with the knee point method, resulting in an RMSE of 2.95 d. The high accuracy was maintained at temporal resolutions as coarse as 14 d. The TD forecast became more accurate as the CH time series extended. The optimal timing for forecasting the TD was when the CH exceeded half of its maximum. Using only LiDAR CH data below 1.6 m and empirical growth rate estimates, the forecasted TD showed an RMSE of 3.90 d. In conclusion, this study exploited the growth characteristics of maize height to provide a practical approach for the timely identification and forecasting of the maize TD.
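The TD-extraction step above can be sketched as fitting a logistic curve to a CH time series and reading off its inflection point. The synthetic AGDD axis, parameter values, and noise level are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, k, x0, r):
    """Standard logistic growth curve; x0 is the inflection (midpoint)."""
    return k / (1.0 + np.exp(-r * (x - x0)))

agdd = np.linspace(0, 1000, 60)
true_k, true_x0, true_r = 2.6, 520.0, 0.012  # max CH (m), inflection, rate
ch = logistic(agdd, true_k, true_x0, true_r)
ch += 0.02 * np.random.default_rng(2).standard_normal(agdd.size)  # sensor noise

popt, _ = curve_fit(logistic, agdd, ch, p0=[2.0, 400.0, 0.01])
inflection_agdd = popt[1]  # the logistic midpoint, used as the TD estimate
```

The study's knee-point variant picks a different feature point on the same fitted curve; the fitting machinery is identical.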
Tunnel surrounding rock (TSR) deformation exhibits time- and space-dependent behavior, making it challenging for a single prediction model to capture these characteristics over extended periods. Utilizing 8 years of TSR deformation data from the Beishan exploration tunnel (BET) test platform, the metaheuristic crested porcupine optimizer (CPO) was applied for the first time to optimize the time series modeling of TSR deformation, and an integrated model incorporating a convolutional neural network (CNN), long short-term memory network (LSTM), and attention mechanism (ATT) was proposed. This model integrates the strong feature extraction capability of the CNN, the superior sequence prediction performance of the LSTM, and the effective focus of the attention mechanism. The results show that during blasting excavation, the internal displacement of the TSR exhibits a stepwise change pattern. After excavation, the internal displacement enters a phase of gradual increase, ultimately reaching a stable convergence stage. The CPO-CNN-LSTM-ATT (CPO-CLA) integrated model demonstrated excellent predictive accuracy and stability across various evaluation metrics, achieving a coefficient of determination (R²) of 0.985. Compared to the CNN-LSTM-ATT (CLA) model, the CPO-CLA model showed a 14.1% increase in R², a 61.5% decrease in root mean square error (RMSE), and a 72.9% decrease in mean absolute error (MAE). Compared with current mainstream metaheuristic integrated models, the CPO-CLA model is better suited to predicting long-term TSR deformation, offering high computational efficiency, accurate predictions, and strong optimization performance on large datasets.
Nonlinear variations in the coordinate time series of global navigation satellite system (GNSS) reference stations are strongly correlated with surface displacements caused by environmental loading effects, including atmospheric, hydrological, and nontidal ocean loading. Continuous improvements in the accuracy of surface mass loading products, the performance of Earth models, and precise data-processing technologies have significantly advanced research on the effects of environmental loading on nonlinear variations in GNSS coordinate time series. However, owing to theoretical limitations, the lack of high spatiotemporal resolution surface mass observations, and the coupling of GNSS technology-related systematic errors, environmental loading and nonlinear GNSS reference station displacements remain inconsistent. The applicability and capability of these loading products across different regions also require further evaluation. This paper outlines methods for modeling environmental loading, surface mass loading products, and service organizations. In addition, it summarizes recent advances in applying environmental loading to address nonlinear variations in global and regional GNSS coordinate time series. Moreover, the scientific questions raised by existing studies are summarized, and insights into future research directions are provided. The complex nonlinear motion of reference stations is a major factor limiting the accuracy of the current terrestrial reference frame. Further refining environmental load modeling methods, establishing a surface mass distribution model with high spatiotemporal resolution and reliability, exploring other environmental load factors such as ice sheet and artificial mass-change effects, and developing an optimal data-processing model and strategy for consistently reprocessing global reference station data could contribute to the development of a millimeter-level nonlinear motion model for GNSS reference stations with real physical significance, and provide theoretical support for establishing a terrestrial reference frame with 1 mm accuracy by 2050.
Active landslides pose a significant threat globally, endangering lives and property. Effective monitoring and forecasting of displacements are essential for timely warnings and the mitigation of these events. Interferometric synthetic aperture radar (InSAR) stands out as an efficient and prevalent tool for monitoring landslide deformation and offers new prospects for displacement prediction. However, challenges such as the inherent limitations of satellite viewing geometry, long revisit cycles, and limited data volume hinder its application in displacement forecasting, notably for landslides with near-north-south deformation, which is less detectable by InSAR. To address these issues, we propose a novel strategy for predicting three-dimensional (3D) landslide displacement, integrating InSAR and global navigation satellite system (GNSS) measurements with machine learning (ML). This framework first synergizes InSAR line-of-sight (LOS) results with GNSS horizontal data to reconstruct 3D displacement time series. It then employs ML models to capture the complex nonlinear relationships between external triggers, landslide evolutionary states, and 3D displacements, enabling accurate predictions of future deformation. Using four advanced ML algorithms, i.e., random forest (RF), support vector machine (SVM), long short-term memory (LSTM), and gated recurrent unit (GRU), with Bayesian optimization (BO) for hyperparameter tuning, we applied this innovative approach to the north-facing, slow-moving Xinpu landslide in the Three Gorges Reservoir Area (TGRA) of China. Leveraging over 6.5 years of Sentinel-1 satellite data and GNSS measurements, our framework demonstrates satisfactory and robust prediction performance, with an average root mean square deviation (RMSD) of 9.62 mm and a correlation coefficient (CC) of 0.996. This study presents a promising strategy for 3D displacement prediction, illustrating the efficacy of integrating InSAR monitoring with ML forecasting to enhance landslide early warning capabilities.
Time series anomaly detection is crucial in finance, healthcare, and industrial monitoring. However, traditional methods often face challenges when handling time series data, such as limited feature extraction capability, poor handling of temporal dependencies, and suboptimal real-time performance, sometimes even neglecting the temporal relationships between data points. To address these issues and improve anomaly detection by better capturing temporal dependencies, we propose an unsupervised time series anomaly detection method, VLT-Anomaly. First, we enhance the Variational Autoencoder (VAE) module by redesigning its network structure to better suit anomaly detection through data reconstruction. We introduce hyperparameters to control the weight of the Kullback-Leibler (KL) divergence term in the Evidence Lower Bound (ELBO), improving the encoder module's disentanglement and expressive power in the latent space and yielding more effective latent representations of the data. Next, we incorporate Transformer and Long Short-Term Memory (LSTM) modules to estimate the long-term dependencies of the latent representations, capturing both forward and backward temporal relationships and performing time series forecasting. Finally, we compute the reconstruction error by averaging the predicted results with the decoder reconstruction, and detect anomalies through a grid search for the optimal threshold. Experimental results demonstrate that the proposed method delivers superior anomaly detection on multiple public time series datasets, effectively extracting complex time-related features and enabling efficient computation and real-time anomaly detection. It improves detection accuracy and robustness while reducing false positives and false negatives.
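The final detection step of reconstruction-based methods like the one above can be sketched simply: score each point by reconstruction error and flag scores above a threshold. Here a moving-average smoother stands in for the averaged VAE/forecast reconstruction, and the mean-plus-k-sigma threshold rule is an assumption (the paper grid-searches the threshold).

```python
import numpy as np

def detect(series, reconstruction, k=3.0):
    """Flag points whose error exceeds mean + k * std of all errors."""
    err = np.abs(series - reconstruction)
    thresh = err.mean() + k * err.std()
    return err > thresh

rng = np.random.default_rng(3)
series = np.sin(np.linspace(0, 6 * np.pi, 300)) + 0.05 * rng.standard_normal(300)
series[150] += 2.0  # injected point anomaly
recon = np.convolve(series, np.ones(5) / 5, mode="same")  # crude stand-in
flags = detect(series, recon)
```

A learned reconstruction replaces the smoother in practice: a model trained only on normal data reconstructs normal points well and anomalous points poorly, so the error itself becomes the anomaly score.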
Time series forecasting is important in finance, energy, and meteorology, but traditional methods often fail to cope with the complex nonlinear and nonstationary processes in real data. In this paper, we propose the FractalNet-LSTM model, which combines fractal convolutional units with recurrent long short-term memory (LSTM) layers to model time series efficiently. To test the effectiveness of the model, data with complex structures and patterns, in particular seasonal and cyclical effects, were used. Model performance was demonstrated on datasets of electricity consumption, sunspot activity, and Spotify stock price. The results showed that the proposed model outperforms traditional approaches at medium forecasting horizons and demonstrates high accuracy for data with long-term and cyclical dependencies. However, for financial data with high volatility, the model's efficiency decreases at long forecasting horizons, indicating the need for further adaptation. The findings suggest that integrating fractal properties into neural network architectures improves the accuracy of time series forecasting and can be useful for developing more accurate and reliable forecasting systems in various industries.
Anomaly detection (AD) in time series data is widely applied across various industries for monitoring and security applications, emerging as a key research focus within the field of deep learning. While many methods based on different normality assumptions perform well in specific scenarios, they often neglect the overall normality issue. Some feature extraction methods incorporate pre-training processes, but these may not be suitable for time series anomaly detection, leading to decreased performance. Additionally, real-world time series samples are rarely free from noise, making them susceptible to outliers, which further impacts detection accuracy. To address these challenges, we propose a novel anomaly detection method called Robust One-Class Classification Detection (ROC). This approach utilizes an autoencoder (AE) to learn features while constraining the context vectors from the AE within a sufficiently small hypersphere, akin to One-Class Classification (OC) methods. By simultaneously optimizing two hypothetical objective functions, ROC captures various aspects of normality. We categorize the input raw time series into clean and outlier sequences, reducing the impact of outliers on the compressed feature representation. Experimental results on public datasets indicate that our approach outperforms existing baseline methods and substantially improves model robustness.
Objectives:This study aimed to explore the characteristics of outpatient blood collection center visit fluctuation and nursing workforce allocation based on a time series model,and the application effect was evaluated.Methods:To enhance the efficiency of phlebotomy at the hospital outpatient window and improve patient satisfaction,the First Affiliated Hospital with Nanjing Medical University implemented a time series analysis model in 2024 to optimize nursing staff allocation.The management team was led by a head nurse of the outpatient blood collection department with extensive experience.It included one director of the nursing department,six senior clinical nurses,one informatics expert,and one nursing master's degree holder.Retrospective time-series data from the hospital's smart blood collection system(including hourly blood collection volumes and waiting times)were extracted between January 2020 and December 2023.Time series analysis was used to identify annual,seasonal,monthly,and hourly variation patterns in blood collection volumes.Seasonal decomposition and the Autoregressive Integrated Moving Average(ARIMA)model were employed to forecast blood collection fluctuations for 2024 and facilitate dynamic scheduling.A comparison was conducted to evaluate differences in blood collection efficiency and patient satisfaction before(January-June 2023)and after(January-June 2024)implementing the dynamic scheduling model based on the time series analysis and forecasting.Results:Visit volumes showed periodicity and slow growth,peaking every second and third quarter of the year and daily at 8:00-9:00 a.m.and 2:00-3:00 p.m.The ARIMA model demonstrated a good fit(R^(2)=0.692,mean absolute percentage error=8.28%).After adjusting the nursing staff allocation based on the fluctuation characteristics of the hourly number of phlebotomies in the time series analysis model,at least three nurses,one mobile nurse,and two volunteers were added at the peak period of the blood collection window.The number of phlebotomies per hour increased from 289.74±54.55 to 327.53±37.84 person-times(t=-10.041,P<0.01),waiting time decreased from 5.79±2.68 to 4.01±0.46 min(t=11.531,P<0.01),and satisfaction rose from 92.7%to 97.3%(χ^(2)=6.877,P<0.05).Conclusions:By mining the characteristic change patterns of the outpatient blood collection window and predicting future fluctuation trends,the time series analysis method helps nursing managers accurately allocate human resources and optimize the efficiency of outpatient service resources.
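The seasonal decomposition step mentioned above can be sketched in plain Python: an illustrative additive decomposition that estimates the trend with a centered moving average (odd period assumed) and the seasonal effects as zero-centered phase averages. This is a generic sketch, not the hospital's actual forecasting pipeline.

```python
def seasonal_decompose_additive(series, period):
    """Additive decomposition: trend via a centered moving average (odd
    period), seasonal effects as zero-centered averages per phase."""
    n, half = len(series), period // 2
    trend = [None] * n
    for i in range(half, n - half):
        trend[i] = sum(series[i - half:i + half + 1]) / period
    sums, counts = [0.0] * period, [0] * period
    for i in range(n):
        if trend[i] is not None:  # detrended value contributes to its phase
            sums[i % period] += series[i] - trend[i]
            counts[i % period] += 1
    seasonal = [s / c if c else 0.0 for s, c in zip(sums, counts)]
    mean_s = sum(seasonal) / period
    return trend, [s - mean_s for s in seasonal]  # center seasonal at zero
```

On hourly phlebotomy counts, the seasonal component would expose the recurring morning and afternoon peaks that drive the staffing adjustments described in the study.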
Predictive maintenance often involves imbalanced multivariate time series datasets with scarce failure events,posing challenges for model training due to the high dimensionality of the data and the need for domain-specific preprocessing,which frequently leads to the development of large and complex models.Inspired by the success of Large Language Models(LLMs),transformer-based foundation models have been developed for time series(TSFM).These models have been proven to reconstruct time series in a zero-shot manner,being able to capture different patterns that effectively characterize time series.This paper proposes the use of TSFM to generate embeddings of the input data space,making them more interpretable for machine learning models.To evaluate the effectiveness of our approach,we trained three classical machine learning algorithms and one neural network using the embeddings generated by the TSFM called Moment for predicting the remaining useful life of aircraft engines.We test the models trained with both the full training dataset and only 10%of the training samples.Our results show that training simple models,such as support vector regressors or neural networks,with embeddings generated by Moment not only accelerates the training process but also enhances performance in few-shot learning scenarios,where data is scarce.This suggests a promising alternative to complex deep learning architectures,particularly in industrial contexts with limited labeled data.
As a category of recurrent neural networks,echo state networks(ESNs)have been the topic of in-depth investigations and extensive applications in a diverse array of fields,with spectacular triumphs achieved.Nevertheless,the traditional ESN and the majority of its variants are devised in the light of the second-order statistical information of data(e.g.,variance and covariance),while more information is neglected.In the context of information theoretic learning,correntropy demonstrates the capacity to grab more information from data.Therefore,under the guidelines of the maximum correntropy criterion,this paper proposes a correntropy-based echo state network(CESN)in which the first-order and higher-order information of data is captured,promoting robustness to noise.Furthermore,an incremental learning algorithm for the CESN is presented,which has the expertise to update the CESN when new data arrives,eliminating the need to retrain the network from scratch.Finally,experiments on benchmark problems and comparisons with existing works are provided to verify the effectiveness and superiority of the proposed CESN.
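The correntropy idea can be illustrated with a small sketch: under a Gaussian kernel, the sample correntropy of prediction errors saturates for large errors, which is why a maximum-correntropy criterion is robust to outliers where a squared-error criterion is not. This is a generic estimator, not the CESN training code.

```python
import math


def correntropy(errors, sigma=1.0):
    """Sample correntropy of errors under a Gaussian kernel of width sigma.
    Each term lies in (0, 1]; a huge outlier contributes ~0 instead of
    dominating, unlike its squared error."""
    return sum(
        math.exp(-(e ** 2) / (2.0 * sigma ** 2)) for e in errors
    ) / len(errors)
```

Maximizing this quantity over model parameters (the maximum correntropy criterion) therefore down-weights outlier samples automatically, with `sigma` controlling how aggressively large errors are suppressed.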
Funding: supported by the National Natural Science Foundation of China(Grant No.51578263).
Abstract: Predicting surface settlement can identify potential risks associated with shield construction.However,in the construction of undercrossing existing structures,the surface settlement is minimal due to the high stiffness of the existing structure,making it unsuitable as a basis for risk assessment.Therefore,interlayer soil settlement was used as an evaluation index in this paper,which was predicted by the developed multi-parameter time series(MPTS)model.This model establishes a new dataset,including time,effective stress ratio(ESR),mechanical fluctuation coefficient(MFC),and interlayer soil settlement,where ESR and MFC take into account the changing geological conditions.This study proposes a novel MPTS model,integrating grid search(GS),nonlinear particle swarm optimization(NPSO),and support vector regression(SVR)algorithms to predict interlayer soil settlement during under-crossing construction.It utilizes GS and NPSO to obtain the optimal hyperparameters for SVR.Sensitivity analysis based on the MPTS model was used to identify important parameters and propose specific improvement measures.A real under-crossing tunnel project was adopted to verify the effectiveness of the MPTS model.The results show that the new input parameters proposed in this paper reduce the mean absolute error(MAE)of prediction results by 20.3%and the mean square error(MSE)by 46.7%.Compared with the other three algorithms,GS-NPSO-SVR has better prediction performance.Through Sobol sensitivity analysis,previous settlement,ESR,and MFC in fully weathered mudstone and moderately weathered mudstone are identified as the primary parameters affecting interlayer soil settlement.The improvement measures based on the analysis results reduce the accumulated settlement by 79.97%.The developed MPTS model can accurately predict the interlayer soil settlement and provide guidance for water stopping or reinforcement construction.
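As a hedged illustration of the NPSO component, the sketch below runs a one-dimensional particle swarm with a nonlinearly decaying inertia weight on a toy objective. The decay schedule, acceleration coefficients, and function names are assumptions for illustration, not the paper's exact NPSO settings; in the MPTS model the objective would be SVR validation error over its hyperparameters.

```python
import random


def npso_minimize(f, bounds, n_particles=30, iters=100,
                  w_max=0.9, w_min=0.4, seed=0):
    """PSO with a nonlinearly (quadratically) decaying inertia weight,
    the 'nonlinear' ingredient of NPSO. Illustrative 1-D version."""
    rng = random.Random(seed)
    lo, hi = bounds
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest, pbest_f = xs[:], [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g], pbest_f[g]
    for t in range(iters):
        w = w_max - (w_max - w_min) * (t / iters) ** 2  # nonlinear schedule
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vs[i] = (w * vs[i] + 2.0 * r1 * (pbest[i] - xs[i])
                     + 2.0 * r2 * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))  # clamp to the bounds
            fx = f(xs[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = xs[i], fx
                if fx < gbest_f:
                    gbest, gbest_f = xs[i], fx
    return gbest, gbest_f
```

A coarse grid search can seed the swarm near promising regions, which is the division of labor between GS and NPSO that the abstract describes.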
Funding: supported by the Tianjin Manufacturing High Quality Development Special Foundation(No.20232185)and the Roycom Foundation(No.70306901).
Abstract: Hard disk drives(HDDs)serve as the primary storage devices in modern data centers.Once a failure occurs,it often leads to severe data loss,significantly degrading the reliability of storage systems.Numerous studies have proposed machine learning-based HDD failure prediction models.However,the Self-Monitoring,Analysis,and Reporting Technology(SMART)attributes differ across HDD manufacturers.We define hard drives of the same brand and model as homogeneous HDD groups,and those from different brands or models as heterogeneous HDD groups.In practical engineering scenarios,a data center is often composed of a heterogeneous population of HDDs,spanning multiple vendors and models.Existing research predominantly focuses on homogeneous datasets,ignoring the model’s generalization capability across heterogeneous HDDs.As a result,HDD models with limited samples often suffer from poor training effectiveness and prediction performance.To address this issue,we investigate generalizable SMART predictors across heterogeneous HDD groups.By extracting time-series features within a fixed sliding time window,we propose a Heterogeneous Disk Failure Prediction Method based on Time Series Features(HDFPM)framework.This method is adaptable to HDD models with limited sample sizes,thereby enhancing its applicability and robustness across diverse drive populations.Experimental results show that the proposed model achieves an F1-score of 0.9518 when applied to two different Seagate HDD models,while maintaining the False Positive Rate(FPR)below 1%.After incorporating the Complexity-Ratio Dynamic Time Warping(CDTW)based feature enhancement method,the best prediction model achieves a True Positive Rate(TPR)of up to 0.93 between the two models.For next-day failure prediction across various Seagate models,the model achieves an F1-score of up to 0.8792.Moreover,the experimental results also show that within the same brand,the higher the proportion of shared SMART attributes across different 
models,the better the prediction performance.In addition,HDFPM demonstrates the best stability and most significant performance in heterogeneous environments.
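The fixed sliding-window feature extraction described above can be illustrated generically: per window, compute the mean, standard deviation, and least-squares slope of a SMART attribute sequence. This particular feature set is an assumption for illustration, not the exact HDFPM feature definition.

```python
import statistics


def window_features(series, width):
    """For each sliding window, return (mean, std, least-squares slope),
    a common trio of time-series features for SMART attributes."""
    feats = []
    for start in range(len(series) - width + 1):
        w = series[start:start + width]
        mean = statistics.fmean(w)
        std = statistics.pstdev(w)
        # least-squares slope of the window against t = 0..width-1
        tbar = (width - 1) / 2
        num = sum((t - tbar) * (x - mean) for t, x in enumerate(w))
        den = sum((t - tbar) ** 2 for t in range(width))
        feats.append((mean, std, num / den))
    return feats
```

Because these features are defined per attribute rather than per manufacturer, they transfer across heterogeneous drive models as long as the attribute itself is shared, which matches the abstract's observation about shared SMART attributes.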
Funding: supported by the Natural Science Foundation of China(NSFC),Grant number 5247052693.
Abstract: To address the insufficient prediction accuracy of multi-state parameters in electro-hydraulic servo material fatigue testing machines under complex loading and nonlinear coupling conditions,this paper proposes a multivariate sequence-to-sequence prediction model integrating a Long Short-Term Memory(LSTM)encoder,a Gated Recurrent Unit(GRU)decoder,and a multi-head attention mechanism.This approach enhances prediction accuracy and robustness across different control modes and load spectra by leveraging multi-channel inputs and cross-variable feature interactions,thereby capturing both short-term high-frequency dynamics and long-term slow drift characteristics.Experiments using long-term data from real test benches demonstrate that the model achieves a stable MSE below 0.01 on the validation set,with MAE and RMSE of approximately 0.018 and 0.052,respectively,and a coefficient of determination reaching 0.98.This significantly outperforms traditional identification methods and single RNN models.Sensitivity analysis indicates that a prediction stride of 10 achieves an optimal balance between accuracy and computational overhead.Ablation experiments validated the contribution of multi-head attention and decoder architecture to enhancing cross-variable coupling modeling capabilities.This model can be applied to residual-driven early warning in health monitoring,and risk assessment with scheme optimization in test design.It is also feasible for near-real-time deployment,providing a practical data-driven technical pathway for reliability assurance in advanced equipment.
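The attention component can be illustrated with a minimal scaled dot-product attention in plain Python: a single head with no learned projections, which is the building block that the multi-head mechanism above replicates across subspaces. This is a didactic sketch, not the paper's implementation.

```python
import math


def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]


def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V,
    with Q, K, V given as lists of row vectors."""
    d = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, V))
                    for j in range(len(V[0]))])
    return out
```

In the encoder-decoder model, queries come from the GRU decoder state and keys/values from the LSTM encoder outputs, letting each prediction step attend to the most relevant channels and time steps.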
Abstract: Multivariate time series forecasting plays a crucial role in decision-making for systems like energy grids and transportation networks,where temporal patterns emerge across diverse scales from short-term fluctuations to long-term trends.However,existing Transformer-based methods often process data at a single resolution or handle multiple scales independently,overlooking critical cross-scale interactions that influence prediction accuracy.To address this gap,we introduce the Hierarchical Attention Transformer(HAT),which enables direct information exchange between temporal hierarchies through a novel cross-scale attention mechanism.HAT extracts multi-scale features using hierarchical convolutional-recurrent blocks,fuses them via temperature-controlled mechanisms,and optimizes gradient flow with residual connections for stable training.Evaluations on eight benchmark datasets show HAT outperforming state-of-the-art baselines,with average reductions of 8.2%in MSE and 7.5%in MAE across horizons,while achieving a 6.1× training speedup over patch-based methods.These advancements highlight HAT’s potential for applications requiring multi-resolution temporal modeling.
Abstract: Predicting the behavior of renewable energy systems requires models capable of generating accurate forecasts from limited historical data,a challenge that becomes especially pronounced when commissioning new facilities where operational records are scarce.This review aims to synthesize recent progress in data-efficient deep learning approaches for addressing such“cold-start”forecasting problems.It primarily covers three interrelated domains—solar photovoltaic(PV),wind power,and electrical load forecasting—where data scarcity and operational variability are most critical,while also including representative studies on hydropower and carbon emission prediction to provide a broader systems perspective.To this end,we examined trends from over 150 predominantly peer-reviewed studies published between 2019 and mid-2025,highlighting advances in zero-shot and few-shot meta-learning frameworks that enable rapid model adaptation with minimal labeled data.Moreover,transfer learning approaches combined with spatiotemporal graph neural networks have been employed to transfer knowledge from existing energy assets to new,data-sparse environments,effectively capturing hidden dependencies among geographic features,meteorological dynamics,and grid structures.Synthetic data generation has further proven valuable for expanding training samples and mitigating overfitting in cold-start scenarios.In addition,large language models and explainable artificial intelligence(XAI)—notably conversational XAI systems—have been used to interpret and communicate complex model behaviors in accessible terms,fostering operator trust from the earliest deployment stages.By consolidating methodological advances,unresolved challenges,and open-source resources,this review provides a coherent overview of deep learning strategies that can shorten the data-sparse ramp-up period of new energy infrastructures and accelerate the transition toward resilient,low-carbon electricity grids.
Funding: supported by the Young Scientists Fund of the National Natural Science Foundation of China(62303491),the Major Program of Xiangjiang Laboratory(22XJ01005),the Science and Technology Innovation Program of Hunan Province(2024RC1007),and the Natural Science Foundation of Hunan Province(2025JJ10007).
Abstract: Temporal alignment of multisensor time series(MTS)is a critical prerequisite for accurate modeling and optimal control in subsequent data-driven applications.Nevertheless,many approaches frequently neglect to consider the complex interdependencies between different sensors in MTS,and temporal alignment in many methods is typically treated as an isolated task disconnected from the downstream objectives,leading to unsatisfactory performances in follow-up applications.To address these challenges,this paper proposes a novel knowledge graph(KG)-guided iterative-updating graph neural network(GNN)for time-delay estimation(TDE)in MTS.Initially,a domain-specific KG is constructed from domain mechanism knowledge,providing a foundation for the GNN's initialization.Next,capitalizing on the inherent structure of the graph topology,a GNN-based TDE method is developed.Then,a customized loss function is constructed,which synthesizes both the performances of downstream tasks and graph-based constraints.Moreover,an innovative algorithm for GNN structure learning and iterative updating is proposed to renovate the graph structure further.Finally,experimental results across various regression and classification tasks on numerical simulation,public datasets,and the real blast furnace ironmaking dataset demonstrate that the proposed method can achieve accurate temporal alignment of MTS.
Funding: this research was funded by the Science and Technology Project of State Grid Corporation of China under grant number 5200-202319382A-2-3-XG.
Abstract: Iced transmission line galloping poses a significant threat to the safety and reliability of power systems,leading directly to line tripping,disconnections,and power outages.Existing early warning methods of iced transmission line galloping suffer from issues such as reliance on a single data source,neglect of irregular time series,and lack of attention-based closed-loop feedback,resulting in high rates of missed and false alarms.To address these challenges,we propose an Internet of Things(IoT)empowered early warning method of transmission line galloping that integrates time series data from optical fiber sensing and weather forecast.Initially,the method applies a primary adaptive weighted fusion to the IoT empowered optical fiber real-time sensing data and weather forecast data,followed by a secondary fusion based on a Back Propagation(BP)neural network,and uses the K-medoids algorithm for clustering the fused data.Furthermore,an adaptive irregular time series perception adjustment module is introduced into the traditional Gated Recurrent Unit(GRU)network,and closed-loop feedback based on an attention mechanism is employed to update network parameters through gradient feedback of the loss function,enabling closed-loop training and time series data prediction of the GRU network model.Subsequently,considering various types of prediction data and the duration of icing,an iced transmission line galloping risk coefficient is established,and warnings are categorized based on this coefficient.Finally,using an IoT-driven realistic dataset of iced transmission line galloping,the effectiveness of the proposed method is validated through multi-dimensional simulation scenarios.
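The "adaptive weighted fusion" step can be sketched generically as inverse-variance weighting, where more reliable sources receive larger weights. This is an illustrative assumption about the primary fusion, not the paper's exact rule; the abstract's secondary fusion would then feed the fused values into a BP neural network.

```python
def adaptive_weighted_fusion(values, variances):
    """Fuse readings from several sources by inverse-variance weighting:
    sources with lower variance (more reliable) get larger weights.
    Returns the fused value and the normalized weights."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    weights = [w / total for w in weights]
    fused = sum(w * x for w, x in zip(weights, values))
    return fused, weights
```

Here the weights adapt per sample as the observed variances of the optical-fiber and weather-forecast channels change, which is what makes the fusion "adaptive" rather than fixed.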
Funding: funded by the Natural Science Foundation of Heilongjiang Province,grant number LH2023F020.
Abstract: Deep learning(DL)has revolutionized time series forecasting(TSF),surpassing traditional statistical methods(e.g.,ARIMA)and machine learning techniques in modeling complex nonlinear dynamics and long-term dependencies prevalent in real-world temporal data.This comprehensive survey reviews state-of-the-art DL architectures for TSF,focusing on four core paradigms:(1)Convolutional Neural Networks(CNNs),adept at extracting localized temporal features;(2)Recurrent Neural Networks(RNNs)and their advanced variants(LSTM,GRU),designed for sequential dependency modeling;(3)Graph Neural Networks(GNNs),specialized for forecasting structured relational data with spatial-temporal dependencies;and(4)Transformer-based models,leveraging self-attention mechanisms to capture global temporal patterns efficiently.We provide a rigorous analysis of the theoretical underpinnings,recent algorithmic advancements(e.g.,TCNs,attention mechanisms,hybrid architectures),and practical applications of each framework,supported by extensive benchmark datasets(e.g.,ETT,traffic flow,financial indicators)and standardized evaluation metrics(MAE,MSE,RMSE).Critical challenges,including handling irregular sampling intervals,integrating domain knowledge for robustness,and managing computational complexity,are thoroughly discussed.Emerging research directions highlighted include diffusion models for uncertainty quantification,hybrid pipelines combining classical statistical and DL techniques for enhanced interpretability,quantile regression with Transformers for risk-aware forecasting,and optimizations for real-time deployment.This work serves as an essential reference,consolidating methodological innovations,empirical resources,and future trends to bridge the gap between theoretical research and practical implementation needs for researchers and practitioners in the field.
Funding: supported in part by the Interdisciplinary Project of Dalian University(DLUXK-2023-ZD-001).
Abstract: Multivariate time series forecasting is widely used in traffic planning,weather forecasting,and energy consumption.Series decomposition algorithms can help models better understand the underlying patterns of the original series to improve the forecasting accuracy of multivariate time series.However,the decomposition kernel of previous decomposition-based models is fixed,and these models have not considered the differences in frequency fluctuations between components.These problems make it difficult to analyze the intricate temporal variations of real-world time series.In this paper,we propose a series decomposition-based Mamba model,DecMamba,to obtain the intricate temporal dependencies and the dependencies among different variables of multivariate time series.A variable-level adaptive kernel combination search module is designed to interact with information on different trends and periods between variables.Two backbone structures are proposed to emphasize the differences in frequency fluctuations of seasonal and trend components.Mamba with superior performance is used instead of a Transformer in backbone structures to capture the dependencies among different variables.A new embedding block is designed to capture the temporal features better,especially for the high-frequency seasonal component whose semantic information is difficult to acquire.A gating mechanism is introduced to the decoder in the seasonal backbone to improve the prediction accuracy.A comparison with ten state-of-the-art models on seven real-world datasets demonstrates that DecMamba can better model the temporal dependencies and the dependencies among different variables,guaranteeing better prediction performance for multivariate time series.
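The idea of combining decomposition kernels per variable can be sketched as a weighted mixture of moving-average trends of different widths (odd kernel sizes, edge padding). The fixed mixture weights below stand in for the learned kernel-combination search in DecMamba; this is an illustration of the concept, not the model itself.

```python
def moving_average(series, k):
    """'Same'-length moving average with edge padding: one fixed
    decomposition kernel of odd width k."""
    half = k // 2
    padded = [series[0]] * half + list(series) + [series[-1]] * half
    return [sum(padded[i:i + k]) / k for i in range(len(series))]


def mixed_trend(series, kernels, weights):
    """Weighted combination of moving-average kernels; the weights play
    the role of an (adaptive) kernel-combination for one variable."""
    mas = [moving_average(series, k) for k in kernels]
    return [sum(w * ma[i] for w, ma in zip(weights, mas))
            for i in range(len(series))]
```

The seasonal component is then the residual `series - trend`, and each variable can receive its own kernel weights, which is the "variable-level" aspect the abstract emphasizes.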
Funding: supported by the National Science and Technology Major Project(2022ZD0115701),the Nanfan Special Project,CAAS(YBXM2305,YBXM2401,YBXM2402,PTXM2402),the National Natural Science Foundation of China(42071426,42301427),and the Agricultural Science and Technology Innovation Program of the Chinese Academy of Agricultural Sciences.
Abstract: Timely identification and forecast of maize tasseling date(TD)are very important for agronomic management,yield prediction,and crop phenotype estimation.Remote sensing-based phenology monitoring has mostly relied on time series spectral index data of the complete growth season.A recent development in maize phenology detection research is to use canopy height(CH)data instead of spectral indices,but its robustness in multiple treatments and stages has not been confirmed.Meanwhile,because data of a complete growth season are needed,the need for timely in-season TD identification remains unmet.This study proposed an approach to timely identify and forecast the maize TD.We obtained RGB and light detection and ranging(LiDAR)data using the unmanned aerial vehicle platform over plots of different maize varieties under multiple treatments.After CH estimation,the feature points(inflection points)from the Logistic curve of the CH time series were extracted as TD.We examined the impact of various independent variables(day of year vs.accumulated growing degree days(AGDD)),sensors(RGB and LiDAR),time series denoise methods,different feature points,and temporal resolution on TD identification.Lastly,we used early CH time series data to predict height growth and further forecast TD.The results showed that using the 99th percentile of the plot scale digital surface model and the minimum digital terrain model from LiDAR to estimate maize CH was the most stable across treatments and stages(R^(2):0.928 to 0.943).For TD identification,the best performance was achieved by using LiDAR data with AGDD as the independent variable,combined with the knee point method,resulting in an RMSE of 2.95 d.The high accuracy was maintained at temporal resolutions as coarse as 14 d.TD forecast got more accurate as the CH time series extended.The optimal timing for forecasting TD was when the CH exceeded half of its maximum.Using only LiDAR CH data below 1.6 m and empirical growth rate estimates,the 
forecasted TD showed an RMSE of 3.90 d.In conclusion,this study exploited the growth characteristics of maize height to provide a practical approach for the timely identification and forecast of maize TD.
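The TD extraction described above can be illustrated with a small sketch: evaluate a three-parameter logistic canopy-height curve and take the interval of maximum discrete growth increment as an inflection proxy. The parameter values below are synthetic, and this simple proxy stands in for the study's actual curve-fitting and knee-point procedure.

```python
import math


def logistic(t, K, r, t0):
    """Three-parameter logistic growth curve: asymptote K, rate r,
    inflection time t0 (growth rate is maximal at t0)."""
    return K / (1.0 + math.exp(-r * (t - t0)))


def inflection_time(ts, heights):
    """Midpoint of the interval with the largest height increment,
    a discrete proxy for the logistic inflection (tasseling date)."""
    gains = [heights[i + 1] - heights[i] for i in range(len(heights) - 1)]
    i = max(range(len(gains)), key=lambda j: gains[j])
    return (ts[i] + ts[i + 1]) / 2
```

With AGDD on the time axis instead of day of year, the same extraction becomes temperature-driven, which is the configuration the study found most accurate.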
Funding: supported by the China Atomic Energy Authority(CAEA)under China’s URL Development Program and the Geological Disposal Program(Grant No.FZ2105),and the National Natural Science Foundation of China(Grant No.52278420).
Abstract: Tunnel surrounding rock(TSR)deformation exhibits time-and space-dependent behavior,making it challenging for a single prediction model to capture these characteristics over extended periods.Utilizing 8 years of TSR deformation data from the Beishan exploration tunnel(BET)test platform,the metaheuristic algorithm crested porcupine optimizer(CPO)was applied for the first time to optimize the time series of TSR deformation,and an integrated model incorporating convolutional neural network(CNN),long short-term memory network(LSTM),and attention mechanism(ATT)was proposed.This model integrates the strong feature extraction capabilities of CNN,the superior sequence prediction performance of LSTM,and the effective attention mechanism of ATT.The results show that during blasting excavation,the internal displacement of TSR exhibits a stepwise change pattern.After excavation,the internal displacement enters a phase of gradual increase,ultimately reaching a stable convergence stage.The CPO-CNN-LSTM-ATT(CPO-CLA)integrated model demonstrated excellent predictive accuracy and stability across various evaluation metrics,achieving a determination coefficient(R^(2))of 0.985.Compared to the CNN-LSTM-ATT(CLA)model,the CPO-CLA model showed a 14.1%increase in R^(2),a 61.5%decrease in root mean square error(RMSE),and a 72.9%decrease in mean absolute error(MAE).In comparison with current mainstream metaheuristic integrated models,the CPO-CLA model is better suited for predicting long-term TSR deformation.It offers high computational efficiency,accurate predictions,and strong capability in handling large datasets.
Funding: supported by the Basic Science Center Project of the National Natural Science Foundation of China(42388102),the National Natural Science Foundation of China(42174030),the Special Fund of Hubei Luojia Laboratory(220100020),the Major Science and Technology Program for Hubei Province(2022AAA002),and the Fundamental Research Funds for the Central Universities of China(2042022dx0001 and 2042023kfyq01).
Abstract: Nonlinear variations in the coordinate time series of global navigation satellite system(GNSS) reference stations are strongly correlated with surface displacements caused by environmental loading effects,including atmospheric, hydrological, and nontidal ocean loading. Continuous improvements in the accuracy of surface mass loading products, performance of Earth models, and precise data-processing technologies have significantly advanced research on the effects of environmental loading on nonlinear variations in GNSS coordinate time series. However, owing to theoretical limitations, the lack of high spatiotemporal resolution surface mass observations, and the coupling of GNSS technology-related systematic errors, environmental loading and nonlinear GNSS reference station displacements remain inconsistent. The applicability and capability of these loading products across different regions also require further evaluation. This paper outlines methods for modeling environmental loading, surface mass loading products, and service organizations. In addition, it summarizes recent advances in applying environmental loading to address nonlinear variations in global and regional GNSS coordinate time series. Moreover, the scientific questions of existing studies are summarized, and insights into future research directions are provided. The complex nonlinear motion of reference stations is a major factor limiting the accuracy of the current terrestrial reference frame. 
Further refining the environmental load modeling method, establishing a surface mass distribution model with high spatiotemporal resolution and reliability, exploring other environmental load factors such as ice sheet and artificial mass-change effects, and developing an optimal data-processing model and strategy for reprocessing global reference station data consistently could contribute to the development of a millimeter-level nonlinear motion model for GNSS reference stations with actual physical significance and provide theoretical support for establishing a terrestrial reference frame with 1 mm accuracy by 2050.
Funding: jointly supported by the International Research Center of Big Data for Sustainable Development Goals(Grant No.CBAS2022GSP02)and the National Natural Science Foundation of China(Grant Nos.42072320 and 42372264).
Abstract: Active landslides pose a significant threat globally,endangering lives and property.Effective monitoring and forecasting of displacements are essential for the timely warnings and mitigation of these events.Interferometric synthetic aperture radar(InSAR)stands out as an efficient and prevalent tool for monitoring landslide deformation and offers new prospects for displacement prediction.However,challenges such as the inherent limitation of satellite viewing geometry,long revisit cycles,and limited data volume hinder its application in displacement forecasting,notably for landslides with near-north-south deformation less detectable by InSAR.To address these issues,we propose a novel strategy for predicting three-dimensional(3D)landslide displacement,integrating InSAR and global navigation satellite system(GNSS)measurements with machine learning(ML).This framework first synergizes InSAR line-of-sight(LOS)results with GNSS horizontal data to reconstruct 3D displacement time series.It then employs ML models to capture complex nonlinear relationships between external triggers,landslide evolutionary states,and 3D displacements,thus enabling accurate future deformation predictions.Utilizing four advanced ML algorithms,i.e.,random forest(RF),support vector machine(SVM),long short-term memory(LSTM),and gated recurrent unit(GRU),with Bayesian optimization(BO)for hyperparameter tuning,we applied this innovative approach to the north-facing,slow-moving Xinpu landslide in the Three Gorges Reservoir Area(TGRA)of China.Leveraging over 6.5 years of Sentinel-1 satellite data and GNSS measurements,our framework demonstrates satisfactory and robust prediction performance,with an average root mean square deviation(RMSD)of 9.62 mm and a correlation coefficient(CC)of 0.996.This study presents a promising strategy for 3D displacement prediction,illustrating the efficacy of integrating InSAR monitoring with ML forecasting in enhancing landslide early warning capabilities.
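The geometric link between LOS measurements and 3D displacement that such reconstructions rest on can be sketched directly: the LOS observation is the dot product of the ENU displacement with the unit LOS vector, so given GNSS horizontal components the vertical follows by simple algebra. The unit vector below is illustrative, not a real Sentinel-1 geometry.

```python
def los_projection(d_enu, u_enu):
    """Project a 3D displacement (east, north, up) onto the satellite
    line of sight given the unit LOS vector in ENU coordinates."""
    return sum(d * u for d, u in zip(d_enu, u_enu))


def vertical_from_los(d_los, east, north, u_enu):
    """Recover the vertical component from an InSAR LOS measurement and
    GNSS-derived horizontal components (one-equation inversion)."""
    ue, un, uu = u_enu
    return (d_los - east * ue - north * un) / uu
```

Because the north component of the LOS unit vector is small for near-polar orbits, near-north-south motion contributes little to LOS, which is exactly why the GNSS horizontal data are needed to close the system.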
Funding: supported by the Fundamental Research Funds for Central Public Welfare Research Institutes(SK202324),the Central Guidance on Local Science and Technology Development Fund of Hebei Province(236Z0104G),the National Natural Science Foundation of China(62476078),and the Geological Survey Project of China Geological Survey(G202304-2).
Abstract: Time series anomaly detection is crucial in finance,healthcare,and industrial monitoring.However,traditional methods often face challenges when handling time series data,such as limited feature extraction capability,poor temporal dependency handling,and suboptimal real-time performance,sometimes even neglecting the temporal relationships between data.To address these issues and improve anomaly detection performance by better capturing temporal dependencies,we propose an unsupervised time series anomaly detection method,VLT-Anomaly.First,we enhance the Variational Autoencoder(VAE)module by redesigning its network structure to better suit anomaly detection through data reconstruction.We introduce hyperparameters to control the weight of the Kullback-Leibler(KL)divergence term in the Evidence Lower Bound(ELBO),thereby improving the encoder module’s decoupling and expressive power in the latent space,which yields more effective latent representations of the data.Next,we incorporate transformer and Long Short-Term Memory(LSTM)modules to estimate the long-term dependencies of the latent representations,capturing both forward and backward temporal relationships and performing time series forecasting.Finally,we compute the reconstruction error by averaging the predicted results and decoder reconstruction and detect anomalies through grid search for optimal threshold values.Experimental results demonstrate that the proposed method achieves superior anomaly detection performance on multiple public time series datasets,effectively extracting complex time-related features and enabling efficient computation and real-time anomaly detection.It improves detection accuracy and robustness while reducing false positives and false negatives.
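The weighted KL term described above can be sketched in closed form for a diagonal Gaussian posterior against a standard normal prior (a beta-VAE-style ELBO; the weighting symbol and function names are illustrative, not the paper's notation):

```python
import math


def kl_diag_gaussian(mu, logvar):
    """KL( N(mu, diag(exp(logvar))) || N(0, I) ), summed over latent
    dimensions: 0.5 * sum(mu^2 + sigma^2 - 1 - log sigma^2)."""
    return 0.5 * sum(
        m * m + math.exp(lv) - 1.0 - lv for m, lv in zip(mu, logvar)
    )


def weighted_elbo(recon_log_lik, mu, logvar, beta=1.0):
    """ELBO with a tunable KL weight: larger beta enforces a more
    factorized, standard-normal-like latent space."""
    return recon_log_lik - beta * kl_diag_gaussian(mu, logvar)
```

Raising the KL weight trades reconstruction fidelity for latent-space regularity, which is the decoupling/expressiveness balance the abstract describes tuning.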
Abstract: Time series forecasting is important in the fields of finance,energy,and meteorology,but traditional methods often fail to cope with the complex nonlinear and nonstationary processes of real data.In this paper,we propose the FractalNet-LSTM model,which combines fractal convolutional units with recurrent long short-term memory(LSTM)layers to model time series efficiently.To test the effectiveness of the model,data with complex structures and patterns,in particular,with seasonal and cyclical effects,were used.To better demonstrate the obtained results and the formed conclusions,the model performance was shown on the datasets of electricity consumption,sunspot activity,and Spotify stock price.The result showed that the proposed model outperforms traditional approaches at medium forecasting horizons and demonstrates high accuracy for data with long-term and cyclical dependencies.However,for financial data with high volatility,the model’s efficiency decreases at long forecasting horizons,indicating the need for further adaptation.The findings suggest that integrating fractal properties into neural network architecture improves the accuracy of time series forecasting and can be useful for developing more accurate and reliable forecasting systems in various industries.
Funding: supported by the National Natural Science Foundation (62202118); the Guizhou Province Major Project (Qiankehe Major Project [2024]014); Scientific and Technological Research Projects from the Guizhou Education Department (Qianiao ji [2023]003); the Hundred-level Innovative Talent Project of the Guizhou Provincial Science and Technology Department (Qiankehe Platform Talent-GCC [2023]018); the Guizhou Province Major Project (Qiankehe Major Project [2024]003); and the Foundation of the Chongqing Key Laboratory of Public Big Data Security Technology (CQKL-QJ202300001).
Abstract: Anomaly detection (AD) in time series data is widely applied across industries for monitoring and security applications, and has emerged as a key research focus within deep learning. While many methods based on different normality assumptions perform well in specific scenarios, they often neglect the overall normality issue. Some feature-extraction methods incorporate pre-training processes that may not suit time series anomaly detection, leading to decreased performance. Additionally, real-world time series samples are rarely free from noise, making them susceptible to outliers, which further impacts detection accuracy. To address these challenges, we propose a novel anomaly detection method called Robust One-Class Classification Detection (ROC). This approach uses an autoencoder (AE) to learn features while constraining the context vectors from the AE within a sufficiently small hypersphere, akin to One-Class Classification (OC) methods. By simultaneously optimizing two hypothesized objective functions, ROC captures various aspects of normality. We categorize the input raw time series into clean and outlier sequences, reducing the impact of outliers on the compressed feature representation. Experimental results on public datasets indicate that our approach outperforms existing baseline methods and substantially improves model robustness.
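The two ingredients named above, a hypersphere constraint on the AE's context vectors and a clean/outlier split of the input sequences, can be sketched as follows. This is a Deep SVDD-style illustration under assumed names and a simple error threshold, not the paper's exact formulation:

```python
import numpy as np

def hypersphere_loss(z, c):
    """Mean squared Euclidean distance of context vectors z from the center c."""
    return np.mean(np.sum((z - c) ** 2, axis=1))

def split_clean_outlier(recon_errors, threshold):
    """Partition sample indices into clean and outlier sets by reconstruction error."""
    e = np.asarray(recon_errors)
    return np.where(e <= threshold)[0], np.where(e > threshold)[0]

z = np.array([[0.1, 0.0], [0.0, 0.2]])
print(round(hypersphere_loss(z, np.zeros(2)), 3))  # 0.025
```

Minimizing `hypersphere_loss` alongside the reconstruction objective is what pulls normal context vectors into a small region, so anomalous inputs stand out by their distance from the center.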
Funding: funded by the Nursing Project "Clinical Ability Improvement Project" of the First Affiliated Hospital with Nanjing Medical University (JSPH-NC-2021-09).
Abstract: Objectives: This study aimed to explore the fluctuation characteristics of outpatient blood collection center visits and nursing workforce allocation based on a time series model, and to evaluate the effect of its application. Methods: To enhance the efficiency of phlebotomy at the hospital outpatient window and improve patient satisfaction, the First Affiliated Hospital with Nanjing Medical University implemented a time series analysis model in 2024 to optimize nursing staff allocation. The management team was led by a head nurse of the outpatient blood collection department with extensive experience; it included one director of the nursing department, six senior clinical nurses, one informatics expert, and one nursing master's degree holder. Retrospective time series data from the hospital's smart blood collection system (including hourly blood collection volumes and waiting times) were extracted for January 2020 to December 2023. Time series analysis was used to identify annual, seasonal, monthly, and hourly variation patterns in blood collection volumes. Seasonal decomposition and the Autoregressive Integrated Moving Average (ARIMA) model were employed to forecast blood collection fluctuations for 2024 and facilitate dynamic scheduling. A comparison was conducted to evaluate differences in blood collection efficiency and patient satisfaction before (January-June 2023) and after (January-June 2024) implementing the dynamic scheduling model based on the time series analysis and forecasting. Results: Visit volumes showed periodicity and slow growth, peaking in the second and third quarters of each year and daily at 8:00-9:00 a.m. and 2:00-3:00 p.m. The ARIMA model demonstrated a good fit (R²=0.692, mean absolute percentage error=8.28%). After adjusting the nursing staff allocation based on the hourly phlebotomy fluctuations identified by the time series model, at least three nurses, one mobile nurse, and two volunteers were added at the peak period of the blood collection window. The number of phlebotomies per hour increased from 289.74±54.55 to 327.53±37.84 person-times (t=-10.041, P<0.01), waiting time decreased from 5.79±2.68 to 4.01±0.46 min (t=11.531, P<0.01), and satisfaction rose from 92.7% to 97.3% (χ²=6.877, P<0.05). Conclusions: Time series analysis helps nursing managers allocate human resources accurately and optimize the efficiency of outpatient service resources by mining the variation patterns of the outpatient blood collection window and predicting future fluctuation trends.
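The fit statistics quoted in the abstract (R²=0.692, MAPE=8.28%) are standard forecast-error metrics. For reference, they can be computed as follows (illustrative helpers and toy numbers, not the study's code or data):

```python
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    a, f = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(np.abs((a - f) / a))

def r_squared(actual, forecast):
    """Coefficient of determination of a forecast against observations."""
    a, f = np.asarray(actual, float), np.asarray(forecast, float)
    return 1.0 - np.sum((a - f) ** 2) / np.sum((a - a.mean()) ** 2)

print(round(mape([100, 200], [90, 210]), 2))  # 7.5
```

A MAPE below roughly 10% is commonly read as a good fit for operational forecasting, which is consistent with how the study interprets its ARIMA results.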
Funding: funded by the Spanish Government and FEDER funds (AEI/FEDER, UE) under grant PID2021-124502OB-C42 (PRESECREL), and by the predoctoral program "Concepción Arenal del Programa de Personal Investigador en formación Predoctoral" funded by the Universidad de Cantabria and Cantabria's Government (BOC 18-10-2021).
Abstract: Predictive maintenance often involves imbalanced multivariate time series datasets with scarce failure events, posing challenges for model training due to the high dimensionality of the data and the need for domain-specific preprocessing, which frequently leads to large and complex models. Inspired by the success of Large Language Models (LLMs), transformer-based foundation models have been developed for time series (TSFM). These models have been shown to reconstruct time series in a zero-shot manner, capturing diverse patterns that effectively characterize time series. This paper proposes using TSFM to generate embeddings of the input data space, making them more interpretable for machine learning models. To evaluate the effectiveness of our approach, we trained three classical machine learning algorithms and one neural network on embeddings generated by the TSFM called Moment to predict the remaining useful life of aircraft engines. We tested models trained with both the full training dataset and with only 10% of the training samples. Our results show that training simple models, such as support vector regressors or neural networks, on embeddings generated by Moment not only accelerates the training process but also enhances performance in few-shot learning scenarios where data is scarce. This suggests a promising alternative to complex deep learning architectures, particularly in industrial contexts with limited labeled data.
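The core idea, training a simple regressor on frozen foundation-model embeddings, can be sketched with closed-form ridge regression standing in for the SVR. All names and data here are illustrative; the paper uses Moment embeddings and real engine data:

```python
import numpy as np

def ridge_fit(E, y, lam=1e-2):
    """Closed-form ridge regression on an embedding matrix E (n_samples x dim)."""
    d = E.shape[1]
    return np.linalg.solve(E.T @ E + lam * np.eye(d), E.T @ y)

rng = np.random.default_rng(0)
E = rng.normal(size=(40, 8))      # stand-in for foundation-model embeddings
w_true = rng.normal(size=8)
y = E @ w_true                    # synthetic "remaining useful life" targets
w = ridge_fit(E, y, lam=1e-6)
print(np.allclose(w, w_true, atol=1e-4))  # True
```

Because the embeddings are low-dimensional and fixed, fitting such a model is nearly instantaneous, which is the training-speed advantage the abstract highlights for the few-shot setting.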
Funding: supported in part by the National Natural Science Foundation of China (62176109, 62476115); the Fundamental Research Funds for the Central Universities (lzujbky-2023-ey07, lzujbky-2023-it14); the Natural Science Foundation of Gansu Province (24JRRA488); and the Supercomputing Center of Lanzhou University.
Abstract: As a category of recurrent neural networks, echo state networks (ESNs) have been the subject of in-depth investigation and extensive application across a diverse array of fields, with notable successes. Nevertheless, the traditional ESN and the majority of its variants are designed around the second-order statistical information of the data (e.g., variance and covariance), while other information is neglected. In the context of information-theoretic learning, correntropy can capture more information from data. Therefore, guided by the maximum correntropy criterion, this paper proposes a correntropy-based echo state network (CESN) in which first-order and higher-order information of the data is captured, promoting robustness to noise. Furthermore, an incremental learning algorithm for the CESN is presented, which can update the CESN when new data arrive, eliminating the need to retrain the network from scratch. Finally, experiments on benchmark problems and comparisons with existing works verify the effectiveness and superiority of the proposed CESN.
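Correntropy replaces the squared-error criterion with a Gaussian-kernel similarity, which saturates for large errors and therefore damps outliers, the source of the noise robustness claimed above. A minimal illustration (not the paper's CESN implementation; the kernel width `sigma` is an assumed free parameter):

```python
import numpy as np

def correntropy(errors, sigma=1.0):
    """Empirical correntropy of an error vector under a Gaussian kernel."""
    e = np.asarray(errors, float)
    return np.mean(np.exp(-e ** 2 / (2.0 * sigma ** 2)))

# Zero error gives the maximum value 1; a single huge outlier barely moves it,
# whereas the same outlier would dominate a mean-squared-error criterion.
print(correntropy(np.zeros(10)))                   # 1.0
print(correntropy(np.append(np.zeros(9), 100.0)))  # 0.9
```

Maximizing correntropy of the training errors (the maximum correntropy criterion) thus weights small errors like MSE does while nearly ignoring gross outliers.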