AIM:To report and analyze cases of sterile intraocular inflammation(IOI)following intravitreal faricimab injections in patients treated for neovascular age-related macular degeneration(nAMD)and diabetic macular edema(...AIM:To report and analyze cases of sterile intraocular inflammation(IOI)following intravitreal faricimab injections in patients treated for neovascular age-related macular degeneration(nAMD)and diabetic macular edema(DME).METHODS:This double-center case series included nine eyes of six patients who developed uveitis after faricimab therapy.Comprehensive clinical evaluation was performed,including slit-lamp examination,intraocular pressure(IOP)measurement,fluorescein and indocyanine green angiography(ICGA),and laboratory tests.Inflammatory responses were treated with topical or systemic corticosteroids,and patients were monitored for visual acuity and inflammatory activity.RESULTS:The incidence of IOI was 0.8%per patient(Innsbruck)and 0.23%(Czechia),with inflammation typically occurring between the third and sixth injection(mean interval:10d post-injection).Inflammator y presentations ranged from anterior uveitis to posterior segment involvement.One notable case demonstrated novel choroidal hypofluorescent lesions on angiography,suggesting deeper ocular involvement.The mean patient age was 76y;five of six affected patients were female.All cases responded to local and systemic corticosteroids,with full recovery of initial visual acuity.CONCLUSION:Sterile IOI after faricimab appears to be a rare but relevant adverse event.Although the incidence falls within expected ranges for anti-vascular endothelial growth factor(anti-VEGF)agents,the observed choroidal involvement represents a potentially new safety signal.Prompt diagnosis and corticosteroid therapy are effective in all cases.Our findings support the need for vigilant post-marketing surveillance and further studies to better understand the underlying mechanisms and risk factors of faricimab-associated inflammation.展开更多
Iced transmission line galloping poses a significant threat to the safety and reliability of power systems,leading directly to line tripping,disconnections,and power outages.Existing early warning methods of iced tran...Iced transmission line galloping poses a significant threat to the safety and reliability of power systems,leading directly to line tripping,disconnections,and power outages.Existing early warning methods of iced transmission line galloping suffer from issues such as reliance on a single data source,neglect of irregular time series,and lack of attention-based closed-loop feedback,resulting in high rates of missed and false alarms.To address these challenges,we propose an Internet of Things(IoT)empowered early warning method of transmission line galloping that integrates time series data from optical fiber sensing and weather forecast.Initially,the method applies a primary adaptive weighted fusion to the IoT empowered optical fiber real-time sensing data and weather forecast data,followed by a secondary fusion based on a Back Propagation(BP)neural network,and uses the K-medoids algorithm for clustering the fused data.Furthermore,an adaptive irregular time series perception adjustment module is introduced into the traditional Gated Recurrent Unit(GRU)network,and closed-loop feedback based on attentionmechanism is employed to update network parameters through gradient feedback of the loss function,enabling closed-loop training and time series data prediction of the GRU network model.Subsequently,considering various types of prediction data and the duration of icing,an iced transmission line galloping risk coefficient is established,and warnings are categorized based on this coefficient.Finally,using an IoT-driven realistic dataset of iced transmission line galloping,the effectiveness of the proposed method is validated through multi-dimensional simulation scenarios.展开更多
Multivariate time series forecasting iswidely used in traffic planning,weather forecasting,and energy consumption.Series decomposition algorithms can help models better understand the underlying patterns of the origin...Multivariate time series forecasting iswidely used in traffic planning,weather forecasting,and energy consumption.Series decomposition algorithms can help models better understand the underlying patterns of the original series to improve the forecasting accuracy of multivariate time series.However,the decomposition kernel of previous decomposition-based models is fixed,and these models have not considered the differences in frequency fluctuations between components.These problems make it difficult to analyze the intricate temporal variations of real-world time series.In this paper,we propose a series decomposition-based Mamba model,DecMamba,to obtain the intricate temporal dependencies and the dependencies among different variables of multivariate time series.A variable-level adaptive kernel combination search module is designed to interact with information on different trends and periods between variables.Two backbone structures are proposed to emphasize the differences in frequency fluctuations of seasonal and trend components.Mamba with superior performance is used instead of a Transformer in backbone structures to capture the dependencies among different variables.A new embedding block is designed to capture the temporal features better,especially for the high-frequency seasonal component whose semantic information is difficult to acquire.A gating mechanism is introduced to the decoder in the seasonal backbone to improve the prediction accuracy.A comparison with ten state-of-the-art models on seven real-world datasets demonstrates that DecMamba can better model the temporal dependencies and the dependencies among different variables,guaranteeing better prediction performance for multivariate time series.展开更多
Nonlinear variations in the coordinate time series of global navigation satellite system(GNSS) reference stations are strongly correlated with surface displacements caused by environmental loading effects,including at...Nonlinear variations in the coordinate time series of global navigation satellite system(GNSS) reference stations are strongly correlated with surface displacements caused by environmental loading effects,including atmospheric, hydrological, and nontidal ocean loading. Continuous improvements in the accuracy of surface mass loading products, performance of Earth models, and precise data-processing technologies have significantly advanced research on the effects of environmental loading on nonlinear variations in GNSS coordinate time series. However, owing to theoretical limitations, the lack of high spatiotemporal resolution surface mass observations, and the coupling of GNSS technology-related systematic errors, environmental loading and nonlinear GNSS reference station displacements remain inconsistent. The applicability and capability of these loading products across different regions also require further evaluation. This paper outlines methods for modeling environmental loading, surface mass loading products, and service organizations. In addition, it summarizes recent advances in applying environmental loading to address nonlinear variations in global and regional GNSS coordinate time series. Moreover, the scientific questions of existing studies are summarized, and insights into future research directions are provided. The complex nonlinear motion of reference stations is a major factor limiting the accuracy of the current terrestrial reference frame. Further refining the environmental load modeling method, establishing a surface mass distribution model with high spatiotemporal resolution and reliability, exploring other environmental load factors such as ice sheet and artificial mass-change effects, and developing an optimal data-processing model and strategy for reprocessing global reference station data consistently could contribute to the development of a millimeter-level nonlinear motion model for GNSS reference stations with actual physical significance and provide theoretical support for establishing a terrestrial reference frame with 1 mm accuracy by 2050.展开更多
The level 3 case for Ramanujan-type series has been considered as the most mysterious and the most challenging,out of all possible levels for Ramanujan-type series.This motivates the development of new techniques for ...The level 3 case for Ramanujan-type series has been considered as the most mysterious and the most challenging,out of all possible levels for Ramanujan-type series.This motivates the development of new techniques for constructing Ramanujan-type series of level 3.Chan and Liaw introduced an alternating analogue of the Borwein brothers’identity for Ramanujan-type series of level 3;subsequently,Chan,Liaw,and Tian formulated another proof of the Chan–Liaw identity,via the use of Ramanujan’s class invariant.Using the elliptic lambda function and the elliptic alpha function,we prove,via a limiting case of the Kummer–Goursat transformation,a new identity for evaluating the summands for alternating Ramanujan-type series of level 3,and we apply this new identity to prove three conjectured formulas for quadratic-irrational,Ramanujan-type series that had been discovered via numerical experiments with Maple in 2012 by Aldawoud.We also apply our identity to prove a new Ramanujan-type series of level 3 with a quartic convergence rate and quartic coefficients.展开更多
Active landslides pose a significant threat globally,endangering lives and property.Effective monitoring and forecasting of displacements are essential for the timely warnings and mitigation of these events.Interferom...Active landslides pose a significant threat globally,endangering lives and property.Effective monitoring and forecasting of displacements are essential for the timely warnings and mitigation of these events.Interferometric synthetic aperture radar(InSAR)stands out as an efficient and prevalent tool for monitoring landslide deformation and offers new prospects for displacement prediction.However,challenges such as inherent limitation of satellite viewing geometry,long revisit cycles,and limited data volume hinder its application in displacement forecasting,notably for landslides with near-north-south deformation less detectable by InSAR.To address these issues,we propose a novel strategy for predicting three-dimensional(3D)landslide displacement,integrating InSAR and global navigation satellite system(GNSS)measurements with machine learning(ML).This framework first synergizes InSAR line-of-sight(LOS)results with GNSS horizontal data to reconstruct 3D displacement time series.It then employs ML models to capture complex nonlinear relationships between external triggers,landslide evolutionary states,and 3D displacements,thus enabling accurate future deformation predictions.Utilizing four advanced ML algorithms,i.e.random forest(RF),support vector machine(SVM),long short-term memory(LSTM),and gated recurrent unit(GRU),with Bayesian optimization(BO)for hyperparameter tuning,we applied this innovative approach to the north-facing,slow-moving Xinpu landslide in the Three Gorges Reservoir Area(TGRA)of China.Leveraging over 6.5 years of Sentinel-1 satellite data and GNSS measurements,our framework demonstrates satisfactory and robust prediction performance,with an average root mean square deviation(RMSD)of 9.62 mm and a correlation coefficient(CC)of 0.996.This study presents a promising strategy for 3D displacement prediction,illustrating the efficacy of integrating InSAR monitoring with ML forecasting in enhancing landslide early warning capabilities.展开更多
Time series anomaly detection is crucial in finance,healthcare,and industrial monitoring.However,traditional methods often face challenges when handling time series data,such as limited feature extraction capability,p...Time series anomaly detection is crucial in finance,healthcare,and industrial monitoring.However,traditional methods often face challenges when handling time series data,such as limited feature extraction capability,poor temporal dependency handling,and suboptimal real-time performance,sometimes even neglecting the temporal relationships between data.To address these issues and improve anomaly detection performance by better capturing temporal dependencies,we propose an unsupervised time series anomaly detection method,VLT-Anomaly.First,we enhance the Variational Autoencoder(VAE)module by redesigning its network structure to better suit anomaly detection through data reconstruction.We introduce hyperparameters to control the weight of the Kullback-Leibler(KL)divergence term in the Evidence Lower Bound(ELBO),thereby improving the encoder module’s decoupling and expressive power in the latent space,which yields more effective latent representations of the data.Next,we incorporate transformer and Long Short-Term Memory(LSTM)modules to estimate the long-term dependencies of the latent representations,capturing both forward and backward temporal relationships and performing time series forecasting.Finally,we compute the reconstruction error by averaging the predicted results and decoder reconstruction and detect anomalies through grid search for optimal threshold values.Experimental results demonstrate that the proposed method performs superior anomaly detection on multiple public time series datasets,effectively extracting complex time-related features and enabling efficient computation and real-time anomaly detection.It improves detection accuracy and robustness while reducing false positives and false negatives.展开更多
Time series forecasting is important in the fields of finance,energy,and meteorology,but traditional methods often fail to cope with the complex nonlinear and nonstationary processes of real data.In this paper,we prop...Time series forecasting is important in the fields of finance,energy,and meteorology,but traditional methods often fail to cope with the complex nonlinear and nonstationary processes of real data.In this paper,we propose the FractalNet-LSTM model,which combines fractal convolutional units with recurrent long short-term memory(LSTM)layers to model time series efficiently.To test the effectiveness of the model,data with complex structures and patterns,in particular,with seasonal and cyclical effects,were used.To better demonstrate the obtained results and the formed conclusions,the model performance was shown on the datasets of electricity consumption,sunspot activity,and Spotify stock price.The result showed that the proposed model outperforms traditional approaches at medium forecasting horizons and demonstrates high accuracy for data with long-term and cyclical dependencies.However,for financial data with high volatility,the model’s efficiency decreases at long forecasting horizons,indicating the need for further adaptation.The findings suggest further adaptation.The findings suggest that integrating fractal properties into neural network architecture improves the accuracy of time series forecasting and can be useful for developing more accurate and reliable forecasting systems in various industries.展开更多
We evaluate some series with summands involving a single binomial coefficient(^6k 3k).For example,we prove that■Motivated by Galois theory,we introduce the so-called Duality Principle for irrational series of Ramanu...We evaluate some series with summands involving a single binomial coefficient(^6k 3k).For example,we prove that■Motivated by Galois theory,we introduce the so-called Duality Principle for irrational series of Ramanujan’s type or Zeilberger’s type,and apply it to find 26 new irrational series identities.For example,we conjecture that■where ■for any integer d≡0,1 (mod 4) with (d/k) the Kronecker symbol.展开更多
BACKGROUND In critical care practice,difficult airway management poses a substantial challenge,necessitating urgent intervention to ensure patient safety and optimize outcomes.Extracorporeal membrane oxygenation(ECMO)...BACKGROUND In critical care practice,difficult airway management poses a substantial challenge,necessitating urgent intervention to ensure patient safety and optimize outcomes.Extracorporeal membrane oxygenation(ECMO)is a potential rescue tool in patients with severe airway compromise,although evidence of its efficacy and safety remains limited.AIM To review the local experience of using ECMO support in patients with difficult airway management.METHODS This retrospective case series study includes patients with difficult airway management who required ECMO support at a tertiary hospital in a Middle Eastern country.RESULTS Between 2016 and 2023,a total of 13 patients required ECMO support due to challenging airway patency in the operating room.Indications for ECMO encompassed various diagnoses,including tracheal stenosis,external tracheal compression,and subglottic stenosis.Surgical interventions such as tracheal resection and anastomosis often necessitated ECMO support to maintain adequate oxygenation and hemodynamic stability.The duration of ECMO support ranged from standby mode(ECMO implantation is readily available)to several days,with relatively infrequent complications observed.Despite the challenges encountered,most patients survived hospital discharge,highlighting the effectiveness of ECMO in managing difficult airways.CONCLUSION This study underscores the crucial role of ECMO as a life-saving intervention in selected cases of difficult airway management.Further research is warranted to refine the understanding of optimal management strategies and improve outcomes in this challenging patient population.展开更多
The El Pintado 1 Silurian section in Seville Province,Spain,described by Loydell et al.(2015),has been ratified by the IUGS as the replacement GSSP for the base of the Telychian Stage,to replace the Cefn Cerig quarry ...The El Pintado 1 Silurian section in Seville Province,Spain,described by Loydell et al.(2015),has been ratified by the IUGS as the replacement GSSP for the base of the Telychian Stage,to replace the Cefn Cerig quarry section in the Llandovery area of Wales,which was found to be within a sedimentary mélange and therefore not a continuous section.No section other than El Pintado 1 has been found to be continuously fossiliferous across the Aeronian/Telychian boundary.展开更多
Supervised learning classification has arisen as a powerful tool to perform data-driven fault diagnosis in dynamical systems,achieving astonishing results.This approach assumes the availability of extensive,diverse an...Supervised learning classification has arisen as a powerful tool to perform data-driven fault diagnosis in dynamical systems,achieving astonishing results.This approach assumes the availability of extensive,diverse and labeled data corpora for train-ing.However,in some applications it may be difficult or not feasible to obtain a large and balanced dataset including enough representative instances of the fault behaviors of interest.This fact leads to the issues of data scarcity and class imbalance,greatly affecting the performance of supervised learning classifiers.Datasets from railway systems are usually both,scarce and imbalanced,turning supervised learning-based fault diagnosis into a highly challenging task.This article addresses time-series data augmentation for fault diagnosis purposes and presents two application cases in the context of railway track.The case studies employ generative adversarial networks(GAN)schemes to produce realistic synthetic samples of geometrical and structural track defects.The goal is to generate samples that enhance fault diagnosis performance;therefore,major attention was paid not only in the generation process,but also in the synthesis quality assessment,to guarantee the suitability of the samples for training of supervised learning classification models.In the first application,a convolutional classifier achieved a test accuracy of 87.5%for the train on synthetic,test on real(TSTR)scenario,while,in the second application,a fully-connected classifier achieved 96.18%in test accuracy for TSTR.The results indicate that the proposed augmentation approach produces samples having equivalent statistical characteristics and leading to a similar classification behavior as real data.展开更多
With the rapid development of the nuclear power industry on a global scale,the discharge of radioactive e uents from nuclear power plants and their impact on the environment have become important issues in radioactive...With the rapid development of the nuclear power industry on a global scale,the discharge of radioactive e uents from nuclear power plants and their impact on the environment have become important issues in radioactive waste management,radiation protection,and environmental impact assessments.-detection of nuclides requires tedious processes,such as waiting for the radioactive balance of the sample and pretreatment separation,and there is an urgent need for a method specifically designed for mixing rapid energy spectrum measurement method for nuclide samples.The analysis of hybrid-energy spectrum is proposed in this study as a new algorithm,which takes advantage of the spectral analysis of-logarithmic energy spectrum and fitting ability of Fourier series.The logarithmic energy spectrum is obtained by logarithmic conversion of the hybrid linear energy spectrum.The Fourier fitting interpolation method is used to fit the logarithmic energy spectrum numerically.Next,the interpolation points for the‘e ective high-energy window’and‘e ective low-energy window’corresponding to the highest E_(m)nuclide in the hybrid logarithmic fitted energy spectrum are set,and spline interpolation is performed three times to obtain the logarithmic fitted energy spectrum of the highest E_(m)nuclide.Finally,the logarithmic fitted spectrum of the highest E_(m)nuclide is subtracted from the hybrid logarithmic fitted spectrum to obtain a logarithmic fitted spectrum comprised of the remaining lower E_(m)nuclides.The aforementioned process is iterated in a loop to resolve the logarithmic spectra of each nuclide in the original hybrid logarithmic spectra.Then,the radioactivity of E_(m)nuclides to be measured is calculated.In the experimental tests,^(14)C,^(90)Sr,and ^(90)Y spectra,which are obtained using the Fourier fitting interpolation method are compared with the original simulated ^(14)C,^(90)Sr,and ^(90)Y spectra of GEANT4.The measured liquid scintillator data of ^(90)Sr∕^(90)Y sample source and simulated data from GEANT4 are then analyzed.Analysis of the experimental results indicates that the Fourier fitting interpolation method accurately solves ^(14)C,^(90)Sr,and ^(90)Y energy spectra,which is in good agreement with the original GEANT4 simulation.The error in ^(90)Y activity,calculated using the actual detection e ciency,is less than 10%and less than 5%when using the simulated full-spectrum detection e ciency,satisfying the experimental expectations.展开更多
Timely identification and forecast of maize tasseling date(TD)are very important for agronomic management,yield prediction,and crop phenotype estimation.Remote sensing-based phenology monitoring has mostly relied on t...Timely identification and forecast of maize tasseling date(TD)are very important for agronomic management,yield prediction,and crop phenotype estimation.Remote sensing-based phenology monitoring has mostly relied on time series spectral index data of the complete growth season.A recent development in maize phenology detection research is to use canopy height(CH)data instead of spectral indices,but its robustness in multiple treatments and stages has not been confirmed.Meanwhile,because data of a complete growth season are needed,the need for timely in-season TD identification remains unmet.This study proposed an approach to timely identify and forecast the maize TD.We obtained RGB and light detection and ranging(Li DAR)data using the unmanned aerial vehicle platform over plots of different maize varieties under multiple treatments.After CH estimation,the feature points(inflection point)from the Logistic curve of the CH time series were extracted as TD.We examined the impact of various independent variables(day of year vs.accumulated growing degree days(AGDD)),sensors(RGB and Li DAR),time series denoise methods,different feature points,and temporal resolution on TD identification.Lastly,we used early CH time series data to predict height growth and further forecast TD.The results showed that using the 99th percentile of plot scale digital surface model and the minimum digital terrain model from Li DAR to estimate maize CH was the most stable across treatments and stages(R~2:0.928 to0.943).For TD identification,the best performance was achieved by using Li DAR data with AGDD as the independent variable,combined with the knee point method,resulting in RMSE of 2.95 d.The high accuracy was maintained at temporal resolutions as coarse as 14 d.TD forecast got more accurate as the CH time series extended.The optimal timing for forecasting TD was when the CH exceeded half of its maximum.Using only Li DAR CH data below 1.6 m and empirical growth rate estimates,the forecasted TD showed an RMSE of 3.90 d.In conclusion,this study exploited the growth characteristics of maize height to provide a practical approach for the timely identification and forecast of maize TD.展开更多
Site index(SI)is determined from the top height development and is a proxy for forest productivity,defined as the expected top height for a given species at a certain index age.In Norway,an index age of 40 years is us...Site index(SI)is determined from the top height development and is a proxy for forest productivity,defined as the expected top height for a given species at a certain index age.In Norway,an index age of 40 years is used.By using bi-temporal airborne laser scanning(ALS)data,SI can be determined using models estimated from SI observed on field plots(the direct approach)or from predicted top heights at two points in time(the height differential approach).Time series of ALS data may enhance SI determination compared to conventional methods used in operational forest inventory by providing more detailed information about the top height development.We used longitudinal data comprising spatially consistent field and ALS data collected from training plots in 1999,2010,and 2022 to determine SI using the direct and height differential approaches using all combinations of years and performed an external validation.We also evaluated the use of data assimilation.Values of root mean square error obtained from external validation were in the ranges of 16.3%–21.4%and 12.8%–20.6%of the mean fieldregistered SI for the direct approach and the height differential approach,respectively.There were no statistically significant effects of time series length or the number of points in time on the obtained accuracies.Data assimilation did not result in any substantial improvement in the obtained accuracies.Although a time series of ALS data did not yield greater accuracies compared to using only two points in time,a larger proportion of the study area could be used in ALS-based determination of SI when a time series was available.This was because areas that were unsuitable for SI determination between two points in time could be subject to SI determination based on data from another part of the time series.展开更多
Predictive maintenance often involves imbalanced multivariate time series datasets with scarce failure events,posing challenges for model training due to the high dimensionality of the data and the need for domain-spe...Predictive maintenance often involves imbalanced multivariate time series datasets with scarce failure events,posing challenges for model training due to the high dimensionality of the data and the need for domain-specific preprocessing,which frequently leads to the development of large and complex models.Inspired by the success of Large Language Models(LLMs),transformer-based foundation models have been developed for time series(TSFM).These models have been proven to reconstruct time series in a zero-shot manner,being able to capture different patterns that effectively characterize time series.This paper proposes the use of TSFM to generate embeddings of the input data space,making them more interpretable for machine learning models.To evaluate the effectiveness of our approach,we trained three classical machine learning algorithms and one neural network using the embeddings generated by the TSFM called Moment for predicting the remaining useful life of aircraft engines.We test the models trained with both the full training dataset and only 10%of the training samples.Our results show that training simple models,such as support vector regressors or neural networks,with embeddings generated by Moment not only accelerates the training process but also enhances performance in few-shot learning scenarios,where data is scarce.This suggests a promising alternative to complex deep learning architectures,particularly in industrial contexts with limited labeled data.展开更多
Objectives:This study aimed to explore the characteristics of outpatient blood collection center visit fluctuation and nursing workforce allocation based on a time series model,and the application effect was evaluated...Objectives:This study aimed to explore the characteristics of outpatient blood collection center visit fluctuation and nursing workforce allocation based on a time series model,and the application effect was evaluated.Methods:To enhance the efficiency of phlebotomy at the hospital outpatient window and improve patient satisfaction,the First Affliated Hospital with Nanjing Medical University implemented a time series analysis model in 2024 to optimize nursing staff allocation.The management team was led by a head nurse of the outpatient blood collection department with extensive experience.It included one director of the nursing department,six senior clinical nurses,one informatics expert,and one nursing master's degree holder.Retrospective time-series data from the hospital's smart blood collection system(including hourly blood collection volumes and waiting times)were extracted between January 2020 and December 2023.Time series analysis was used to identify annual,seasonal,monthly,and hourly variation patterns in blood collection volumes.Seasonal decomposition and the Autoregressive Integrated Moving Average Model(ARIMA)were employed to forecast blood collection fluctuations for 2024 and facilitate dynamic scheduling.A comparison was conducted to evaluate differences in blood collection efficiency and patient satisfaction before(January-June 2023)and after(January-June 2024)implementing the dynamic scheduling model based on the time series analysis and forecasting.Results:Visit volumes showed periodicity and slow growth,peaking every second and third quarter of the year and daily at 8:00-9:00 a.m.and 2:00-3:00 p.m.The ARIMA model demonstrated a good fit(R2=0.692,mean absolute percentage error=8.28%).After adjusting the nursing staff allocation based on the fluctuation characteristics of the number of phlebotomy per hour in the time series analysis model,at the peak period of the blood collection window,at least three nurses,one mobile nurse and two volunteers were added.The number of phlebotomy per hour increased from 289.74±54.55 to 327.53±37.84 person-time(t=-10.041,P<0.01),waiting time decreased from 5.79±2.68 to 4.01±0.46 min(t=11.531,P<0.01),and satisfaction rose from 92.7%to 97.3%(χ^(2)=6.877,P<0.05).Conclusions:Based on the time series analysis method,it is helpful for nursing managers to accurately allocate human resources and optimize the efficiency of outpatient service resources by mining the special change rule of the outpatient blood collection window and predicting the future fluctuation trend.展开更多
Load time series analysis is critical for resource management and optimization decisions,especially automated analysis techniques.Existing research has insufficiently interpreted the overall characteristics of samples...Load time series analysis is critical for resource management and optimization decisions,especially automated analysis techniques.Existing research has insufficiently interpreted the overall characteristics of samples,leading to significant differences in load level detection conclusions for samples with different characteristics(trend,seasonality,cyclicality).Achieving automated,feature-adaptive,and quantifiable analysis methods remains a challenge.This paper proposes a Threshold Recognition-based Load Level Detection Algorithm(TRLLD),which effectively identifies different load level regions in samples of arbitrary size and distribution type based on sample characteristics.By utilizing distribution density uniformity,the algorithm classifies data points and ultimately obtains normalized load values.In the feature recognition step,the algorithm employs the Density Uniformity Index Based on Differences(DUID),High Load Level Concentration(HLLC),and Low Load Level Concentration(LLLC)to assess sample characteristics,which are independent of specific load values,providing a standardized perspective on features,ensuring high efficiency and strong interpretability.Compared to traditional methods,the proposed approach demonstrates better adaptive and real-time analysis capabilities.Experimental results indicate that it can effectively identify high load and low load regions in 16 groups of time series samples with different load characteristics,yielding highly interpretable results.The correlation between the DUID and sample density distribution uniformity reaches 98.08%.When introducing 10% MAD intensity noise,the maximum relative error is 4.72%,showcasing high robustness.Notably,it exhibits significant advantages in general and low sample scenarios.展开更多
To address the global issue of climate change and create focused mitigation plans,accurate CO_(2)emissions forecasting is essential.Using CO_(2)emissions data from 1990 to 2023,this study assesses the predicting perfo...To address the global issue of climate change and create focused mitigation plans,accurate CO_(2)emissions forecasting is essential.Using CO_(2)emissions data from 1990 to 2023,this study assesses the predicting performance of five sophisticated models:Random Forest(RF),XGBoost,Support Vector Regression(SVR),Long Short-Term Memory networks(LSTM),and ARIMA To give a thorough evaluation of the models’performance,measures including Mean Absolute Error(MAE),Root Mean Square Error(RMSE),and Mean Absolute Percentage Error(MAPE)are used.To guarantee dependable model implementation,preprocessing procedures are carried out,such as feature engineering and stationarity tests.Machine learning models outperform ARIMA in identifying complex patterns and long-term associations,but ARIMA does better with data that exhibits strong linear trends.These results provide important information about how well the model fits various forecasting scenarios,which helps develop data-driven carbon reduction programs.Predictive modeling should be incorporated into sustainable climate policy to encourage the adoption of low-carbon technologies and proactive decisionmaking.Achieving long-term environmental sustainability requires strengthening carbon trading systems,encouraging clean energy investments,and enacting stronger emission laws.In line with international climate goals,suggestions for lowering CO_(2)emissions include switching to renewable energy,increasing energy efficiency,and putting afforestation initiatives into action.展开更多
Tendon-driven robots have distinct advantages in high-dynamic performance motion and high-degree-of-freedom manipulation.However,these robots face challenges related to control complexity,intricate tendon drive paths,...Tendon-driven robots have distinct advantages in high-dynamic performance motion and high-degree-of-freedom manipulation.However,these robots face challenges related to control complexity,intricate tendon drive paths,and tendon slackness.In this study,the authors present a novel modular tendon-driven actuator design that integrates a series elastic element.The actuator incorporates a unique magnetic position sensing technology that enables observation of the length and tension of the tendon and features an exceptionally compact design.The modular architecture of the tendon-driven actuator addresses the complexity of tendon drive paths,while the tension observation functionality mitigates slackness issues.The design and modeling of the actuator are described in this paper,and a series of tests are conducted to validate the simulation model and to test the performance of the proposed actuator.The model can be used for training robot control neural networks based on simulation,thereby overcoming the challenges associated with controlling tendon-driven robots.展开更多
文摘AIM:To report and analyze cases of sterile intraocular inflammation(IOI)following intravitreal faricimab injections in patients treated for neovascular age-related macular degeneration(nAMD)and diabetic macular edema(DME).METHODS:This double-center case series included nine eyes of six patients who developed uveitis after faricimab therapy.Comprehensive clinical evaluation was performed,including slit-lamp examination,intraocular pressure(IOP)measurement,fluorescein and indocyanine green angiography(ICGA),and laboratory tests.Inflammatory responses were treated with topical or systemic corticosteroids,and patients were monitored for visual acuity and inflammatory activity.RESULTS:The incidence of IOI was 0.8%per patient(Innsbruck)and 0.23%(Czechia),with inflammation typically occurring between the third and sixth injection(mean interval:10d post-injection).Inflammator y presentations ranged from anterior uveitis to posterior segment involvement.One notable case demonstrated novel choroidal hypofluorescent lesions on angiography,suggesting deeper ocular involvement.The mean patient age was 76y;five of six affected patients were female.All cases responded to local and systemic corticosteroids,with full recovery of initial visual acuity.CONCLUSION:Sterile IOI after faricimab appears to be a rare but relevant adverse event.Although the incidence falls within expected ranges for anti-vascular endothelial growth factor(anti-VEGF)agents,the observed choroidal involvement represents a potentially new safety signal.Prompt diagnosis and corticosteroid therapy are effective in all cases.Our findings support the need for vigilant post-marketing surveillance and further studies to better understand the underlying mechanisms and risk factors of faricimab-associated inflammation.
基金research was funded by Science and Technology Project of State Grid Corporation of China under grant number 5200-202319382A-2-3-XG.
文摘Iced transmission line galloping poses a significant threat to the safety and reliability of power systems,leading directly to line tripping,disconnections,and power outages.Existing early warning methods of iced transmission line galloping suffer from issues such as reliance on a single data source,neglect of irregular time series,and lack of attention-based closed-loop feedback,resulting in high rates of missed and false alarms.To address these challenges,we propose an Internet of Things(IoT)empowered early warning method of transmission line galloping that integrates time series data from optical fiber sensing and weather forecast.Initially,the method applies a primary adaptive weighted fusion to the IoT empowered optical fiber real-time sensing data and weather forecast data,followed by a secondary fusion based on a Back Propagation(BP)neural network,and uses the K-medoids algorithm for clustering the fused data.Furthermore,an adaptive irregular time series perception adjustment module is introduced into the traditional Gated Recurrent Unit(GRU)network,and closed-loop feedback based on attentionmechanism is employed to update network parameters through gradient feedback of the loss function,enabling closed-loop training and time series data prediction of the GRU network model.Subsequently,considering various types of prediction data and the duration of icing,an iced transmission line galloping risk coefficient is established,and warnings are categorized based on this coefficient.Finally,using an IoT-driven realistic dataset of iced transmission line galloping,the effectiveness of the proposed method is validated through multi-dimensional simulation scenarios.
基金supported in part by the Interdisciplinary Project of Dalian University(DLUXK-2023-ZD-001).
文摘Multivariate time series forecasting iswidely used in traffic planning,weather forecasting,and energy consumption.Series decomposition algorithms can help models better understand the underlying patterns of the original series to improve the forecasting accuracy of multivariate time series.However,the decomposition kernel of previous decomposition-based models is fixed,and these models have not considered the differences in frequency fluctuations between components.These problems make it difficult to analyze the intricate temporal variations of real-world time series.In this paper,we propose a series decomposition-based Mamba model,DecMamba,to obtain the intricate temporal dependencies and the dependencies among different variables of multivariate time series.A variable-level adaptive kernel combination search module is designed to interact with information on different trends and periods between variables.Two backbone structures are proposed to emphasize the differences in frequency fluctuations of seasonal and trend components.Mamba with superior performance is used instead of a Transformer in backbone structures to capture the dependencies among different variables.A new embedding block is designed to capture the temporal features better,especially for the high-frequency seasonal component whose semantic information is difficult to acquire.A gating mechanism is introduced to the decoder in the seasonal backbone to improve the prediction accuracy.A comparison with ten state-of-the-art models on seven real-world datasets demonstrates that DecMamba can better model the temporal dependencies and the dependencies among different variables,guaranteeing better prediction performance for multivariate time series.
基金supported by the Basic Science Center Project of the National Natural Science Foundation of China(42388102)the National Natural Science Foundation of China(42174030)+2 种基金the Special Fund of Hubei Luojia Laboratory(220100020)the Major Science and Technology Program for Hubei Province(2022AAA002)the Fundamental Research Funds for the Central Universities of China(2042022dx0001 and 2042023kfyq01)。
文摘Nonlinear variations in the coordinate time series of global navigation satellite system(GNSS) reference stations are strongly correlated with surface displacements caused by environmental loading effects,including atmospheric, hydrological, and nontidal ocean loading. Continuous improvements in the accuracy of surface mass loading products, performance of Earth models, and precise data-processing technologies have significantly advanced research on the effects of environmental loading on nonlinear variations in GNSS coordinate time series. However, owing to theoretical limitations, the lack of high spatiotemporal resolution surface mass observations, and the coupling of GNSS technology-related systematic errors, environmental loading and nonlinear GNSS reference station displacements remain inconsistent. The applicability and capability of these loading products across different regions also require further evaluation. This paper outlines methods for modeling environmental loading, surface mass loading products, and service organizations. In addition, it summarizes recent advances in applying environmental loading to address nonlinear variations in global and regional GNSS coordinate time series. Moreover, the scientific questions of existing studies are summarized, and insights into future research directions are provided. The complex nonlinear motion of reference stations is a major factor limiting the accuracy of the current terrestrial reference frame. Further refining the environmental load modeling method, establishing a surface mass distribution model with high spatiotemporal resolution and reliability, exploring other environmental load factors such as ice sheet and artificial mass-change effects, and developing an optimal data-processing model and strategy for reprocessing global reference station data consistently could contribute to the development of a millimeter-level nonlinear motion model for GNSS reference stations with actual physical significance and provide theoretical support for establishing a terrestrial reference frame with 1 mm accuracy by 2050.
基金supported by a Killam Postdoctoral Fellowship from the Killam Trusts.
文摘The level 3 case for Ramanujan-type series has been considered as the most mysterious and the most challenging,out of all possible levels for Ramanujan-type series.This motivates the development of new techniques for constructing Ramanujan-type series of level 3.Chan and Liaw introduced an alternating analogue of the Borwein brothers’identity for Ramanujan-type series of level 3;subsequently,Chan,Liaw,and Tian formulated another proof of the Chan–Liaw identity,via the use of Ramanujan’s class invariant.Using the elliptic lambda function and the elliptic alpha function,we prove,via a limiting case of the Kummer–Goursat transformation,a new identity for evaluating the summands for alternating Ramanujan-type series of level 3,and we apply this new identity to prove three conjectured formulas for quadratic-irrational,Ramanujan-type series that had been discovered via numerical experiments with Maple in 2012 by Aldawoud.We also apply our identity to prove a new Ramanujan-type series of level 3 with a quartic convergence rate and quartic coefficients.
基金jointly supported by the International Research Center of Big Data for Sustainable Development Goals(Grant No.CBAS2022GSP02)the National Natural Science Foundation of China(Grant Nos.42072320 and 42372264).
文摘Active landslides pose a significant threat globally,endangering lives and property.Effective monitoring and forecasting of displacements are essential for the timely warnings and mitigation of these events.Interferometric synthetic aperture radar(InSAR)stands out as an efficient and prevalent tool for monitoring landslide deformation and offers new prospects for displacement prediction.However,challenges such as inherent limitation of satellite viewing geometry,long revisit cycles,and limited data volume hinder its application in displacement forecasting,notably for landslides with near-north-south deformation less detectable by InSAR.To address these issues,we propose a novel strategy for predicting three-dimensional(3D)landslide displacement,integrating InSAR and global navigation satellite system(GNSS)measurements with machine learning(ML).This framework first synergizes InSAR line-of-sight(LOS)results with GNSS horizontal data to reconstruct 3D displacement time series.It then employs ML models to capture complex nonlinear relationships between external triggers,landslide evolutionary states,and 3D displacements,thus enabling accurate future deformation predictions.Utilizing four advanced ML algorithms,i.e.random forest(RF),support vector machine(SVM),long short-term memory(LSTM),and gated recurrent unit(GRU),with Bayesian optimization(BO)for hyperparameter tuning,we applied this innovative approach to the north-facing,slow-moving Xinpu landslide in the Three Gorges Reservoir Area(TGRA)of China.Leveraging over 6.5 years of Sentinel-1 satellite data and GNSS measurements,our framework demonstrates satisfactory and robust prediction performance,with an average root mean square deviation(RMSD)of 9.62 mm and a correlation coefficient(CC)of 0.996.This study presents a promising strategy for 3D displacement prediction,illustrating the efficacy of integrating InSAR monitoring with ML forecasting in enhancing landslide early warning capabilities.
基金support from the Fundamental Research Funds for Central Public Welfare Research Institutes(SK202324)the Central Guidance on Local Science and Technology Development Fund of Hebei Province(236Z0104G)+1 种基金the National Natural Science Foundation of China(62476078)the Geological Survey Project of China Geological Survey(G202304-2).
文摘Time series anomaly detection is crucial in finance,healthcare,and industrial monitoring.However,traditional methods often face challenges when handling time series data,such as limited feature extraction capability,poor temporal dependency handling,and suboptimal real-time performance,sometimes even neglecting the temporal relationships between data.To address these issues and improve anomaly detection performance by better capturing temporal dependencies,we propose an unsupervised time series anomaly detection method,VLT-Anomaly.First,we enhance the Variational Autoencoder(VAE)module by redesigning its network structure to better suit anomaly detection through data reconstruction.We introduce hyperparameters to control the weight of the Kullback-Leibler(KL)divergence term in the Evidence Lower Bound(ELBO),thereby improving the encoder module’s decoupling and expressive power in the latent space,which yields more effective latent representations of the data.Next,we incorporate transformer and Long Short-Term Memory(LSTM)modules to estimate the long-term dependencies of the latent representations,capturing both forward and backward temporal relationships and performing time series forecasting.Finally,we compute the reconstruction error by averaging the predicted results and decoder reconstruction and detect anomalies through grid search for optimal threshold values.Experimental results demonstrate that the proposed method performs superior anomaly detection on multiple public time series datasets,effectively extracting complex time-related features and enabling efficient computation and real-time anomaly detection.It improves detection accuracy and robustness while reducing false positives and false negatives.
文摘Time series forecasting is important in the fields of finance,energy,and meteorology,but traditional methods often fail to cope with the complex nonlinear and nonstationary processes of real data.In this paper,we propose the FractalNet-LSTM model,which combines fractal convolutional units with recurrent long short-term memory(LSTM)layers to model time series efficiently.To test the effectiveness of the model,data with complex structures and patterns,in particular,with seasonal and cyclical effects,were used.To better demonstrate the obtained results and the formed conclusions,the model performance was shown on the datasets of electricity consumption,sunspot activity,and Spotify stock price.The result showed that the proposed model outperforms traditional approaches at medium forecasting horizons and demonstrates high accuracy for data with long-term and cyclical dependencies.However,for financial data with high volatility,the model’s efficiency decreases at long forecasting horizons,indicating the need for further adaptation.The findings suggest further adaptation.The findings suggest that integrating fractal properties into neural network architecture improves the accuracy of time series forecasting and can be useful for developing more accurate and reliable forecasting systems in various industries.
基金Supported by the National Natural Science Foundation of China(Grant No.12371004)。
文摘We evaluate some series with summands involving a single binomial coefficient(^6k 3k).For example,we prove that■Motivated by Galois theory,we introduce the so-called Duality Principle for irrational series of Ramanujan’s type or Zeilberger’s type,and apply it to find 26 new irrational series identities.For example,we conjecture that■where ■for any integer d≡0,1 (mod 4) with (d/k) the Kronecker symbol.
文摘BACKGROUND In critical care practice,difficult airway management poses a substantial challenge,necessitating urgent intervention to ensure patient safety and optimize outcomes.Extracorporeal membrane oxygenation(ECMO)is a potential rescue tool in patients with severe airway compromise,although evidence of its efficacy and safety remains limited.AIM To review the local experience of using ECMO support in patients with difficult airway management.METHODS This retrospective case series study includes patients with difficult airway management who required ECMO support at a tertiary hospital in a Middle Eastern country.RESULTS Between 2016 and 2023,a total of 13 patients required ECMO support due to challenging airway patency in the operating room.Indications for ECMO encompassed various diagnoses,including tracheal stenosis,external tracheal compression,and subglottic stenosis.Surgical interventions such as tracheal resection and anastomosis often necessitated ECMO support to maintain adequate oxygenation and hemodynamic stability.The duration of ECMO support ranged from standby mode(ECMO implantation is readily available)to several days,with relatively infrequent complications observed.Despite the challenges encountered,most patients survived hospital discharge,highlighting the effectiveness of ECMO in managing difficult airways.CONCLUSION This study underscores the crucial role of ECMO as a life-saving intervention in selected cases of difficult airway management.Further research is warranted to refine the understanding of optimal management strategies and improve outcomes in this challenging patient population.
基金funded by project PDI2021-125585NB-I00 of the Spanish Ministry of Science,Innovation and Universities‒Agencia Estatal de Investigacion.JF thanks the Grant Agency of the Czech Republic for support of his study(GA23-06198S).
文摘The El Pintado 1 Silurian section in Seville Province,Spain,described by Loydell et al.(2015),has been ratified by the IUGS as the replacement GSSP for the base of the Telychian Stage,to replace the Cefn Cerig quarry section in the Llandovery area of Wales,which was found to be within a sedimentary mélange and therefore not a continuous section.No section other than El Pintado 1 has been found to be continuously fossiliferous across the Aeronian/Telychian boundary.
基金supported by the German Research Foundation(DFG)under the project“Efficient Sensor-Based Condition Monitoring Methodology for the Detection and Localization of Faults on the Railway Track(ConMoRAIL)”,Grant No.515687155.
文摘Supervised learning classification has arisen as a powerful tool to perform data-driven fault diagnosis in dynamical systems,achieving astonishing results.This approach assumes the availability of extensive,diverse and labeled data corpora for train-ing.However,in some applications it may be difficult or not feasible to obtain a large and balanced dataset including enough representative instances of the fault behaviors of interest.This fact leads to the issues of data scarcity and class imbalance,greatly affecting the performance of supervised learning classifiers.Datasets from railway systems are usually both,scarce and imbalanced,turning supervised learning-based fault diagnosis into a highly challenging task.This article addresses time-series data augmentation for fault diagnosis purposes and presents two application cases in the context of railway track.The case studies employ generative adversarial networks(GAN)schemes to produce realistic synthetic samples of geometrical and structural track defects.The goal is to generate samples that enhance fault diagnosis performance;therefore,major attention was paid not only in the generation process,but also in the synthesis quality assessment,to guarantee the suitability of the samples for training of supervised learning classification models.In the first application,a convolutional classifier achieved a test accuracy of 87.5%for the train on synthetic,test on real(TSTR)scenario,while,in the second application,a fully-connected classifier achieved 96.18%in test accuracy for TSTR.The results indicate that the proposed augmentation approach produces samples having equivalent statistical characteristics and leading to a similar classification behavior as real data.
基金supported by the National Natural Science Foundation of China(No.12005026)。
文摘With the rapid development of the nuclear power industry on a global scale,the discharge of radioactive e uents from nuclear power plants and their impact on the environment have become important issues in radioactive waste management,radiation protection,and environmental impact assessments.-detection of nuclides requires tedious processes,such as waiting for the radioactive balance of the sample and pretreatment separation,and there is an urgent need for a method specifically designed for mixing rapid energy spectrum measurement method for nuclide samples.The analysis of hybrid-energy spectrum is proposed in this study as a new algorithm,which takes advantage of the spectral analysis of-logarithmic energy spectrum and fitting ability of Fourier series.The logarithmic energy spectrum is obtained by logarithmic conversion of the hybrid linear energy spectrum.The Fourier fitting interpolation method is used to fit the logarithmic energy spectrum numerically.Next,the interpolation points for the‘e ective high-energy window’and‘e ective low-energy window’corresponding to the highest E_(m)nuclide in the hybrid logarithmic fitted energy spectrum are set,and spline interpolation is performed three times to obtain the logarithmic fitted energy spectrum of the highest E_(m)nuclide.Finally,the logarithmic fitted spectrum of the highest E_(m)nuclide is subtracted from the hybrid logarithmic fitted spectrum to obtain a logarithmic fitted spectrum comprised of the remaining lower E_(m)nuclides.The aforementioned process is iterated in a loop to resolve the logarithmic spectra of each nuclide in the original hybrid logarithmic spectra.Then,the radioactivity of E_(m)nuclides to be measured is calculated.In the experimental tests,^(14)C,^(90)Sr,and ^(90)Y spectra,which are obtained using the Fourier fitting interpolation method are compared with the original simulated ^(14)C,^(90)Sr,and ^(90)Y spectra of GEANT4.The measured liquid scintillator data of ^(90)Sr∕^(90)Y sample source and simulated data from GEANT4 are then analyzed.Analysis of the experimental results indicates that the Fourier fitting interpolation method accurately solves ^(14)C,^(90)Sr,and ^(90)Y energy spectra,which is in good agreement with the original GEANT4 simulation.The error in ^(90)Y activity,calculated using the actual detection e ciency,is less than 10%and less than 5%when using the simulated full-spectrum detection e ciency,satisfying the experimental expectations.
基金supported by National Science and Technology Major Project(2022ZD0115701)Nanfan Special Project,CAAS(YBXM2305,YBXM2401,YBXM2402,PTXM2402)+1 种基金National Natural Science Foundation of China(42071426,42301427)the Agricultural Science and Technology Innovation Program of the Chinese Academy of Agricultural Sciences。
文摘Timely identification and forecast of maize tasseling date(TD)are very important for agronomic management,yield prediction,and crop phenotype estimation.Remote sensing-based phenology monitoring has mostly relied on time series spectral index data of the complete growth season.A recent development in maize phenology detection research is to use canopy height(CH)data instead of spectral indices,but its robustness in multiple treatments and stages has not been confirmed.Meanwhile,because data of a complete growth season are needed,the need for timely in-season TD identification remains unmet.This study proposed an approach to timely identify and forecast the maize TD.We obtained RGB and light detection and ranging(Li DAR)data using the unmanned aerial vehicle platform over plots of different maize varieties under multiple treatments.After CH estimation,the feature points(inflection point)from the Logistic curve of the CH time series were extracted as TD.We examined the impact of various independent variables(day of year vs.accumulated growing degree days(AGDD)),sensors(RGB and Li DAR),time series denoise methods,different feature points,and temporal resolution on TD identification.Lastly,we used early CH time series data to predict height growth and further forecast TD.The results showed that using the 99th percentile of plot scale digital surface model and the minimum digital terrain model from Li DAR to estimate maize CH was the most stable across treatments and stages(R~2:0.928 to0.943).For TD identification,the best performance was achieved by using Li DAR data with AGDD as the independent variable,combined with the knee point method,resulting in RMSE of 2.95 d.The high accuracy was maintained at temporal resolutions as coarse as 14 d.TD forecast got more accurate as the CH time series extended.The optimal timing for forecasting TD was when the CH exceeded half of its maximum.Using only Li DAR CH data below 1.6 m and empirical growth rate estimates,the forecasted TD showed an RMSE of 3.90 d.In conclusion,this study exploited the growth characteristics of maize height to provide a practical approach for the timely identification and forecast of maize TD.
基金part of the Centre for Research-based Innovation SmartForest:Bringing Industry 4.0 to the Norwegian forest sector(NFR SFI project no.309671,smartforest.no)。
文摘Site index(SI)is determined from the top height development and is a proxy for forest productivity,defined as the expected top height for a given species at a certain index age.In Norway,an index age of 40 years is used.By using bi-temporal airborne laser scanning(ALS)data,SI can be determined using models estimated from SI observed on field plots(the direct approach)or from predicted top heights at two points in time(the height differential approach).Time series of ALS data may enhance SI determination compared to conventional methods used in operational forest inventory by providing more detailed information about the top height development.We used longitudinal data comprising spatially consistent field and ALS data collected from training plots in 1999,2010,and 2022 to determine SI using the direct and height differential approaches using all combinations of years and performed an external validation.We also evaluated the use of data assimilation.Values of root mean square error obtained from external validation were in the ranges of 16.3%–21.4%and 12.8%–20.6%of the mean fieldregistered SI for the direct approach and the height differential approach,respectively.There were no statistically significant effects of time series length or the number of points in time on the obtained accuracies.Data assimilation did not result in any substantial improvement in the obtained accuracies.Although a time series of ALS data did not yield greater accuracies compared to using only two points in time,a larger proportion of the study area could be used in ALS-based determination of SI when a time series was available.This was because areas that were unsuitable for SI determination between two points in time could be subject to SI determination based on data from another part of the time series.
基金Funded by the Spanish Government and FEDER funds(AEI/FEDER,UE)under grant PID2021-124502OB-C42(PRESECREL)the predoctoral program“Concepción Arenal del Programa de Personal Investigador en formación Predoctoral”funded by Universidad de Cantabria and Cantabria’s Government(BOC 18-10-2021).
文摘Predictive maintenance often involves imbalanced multivariate time series datasets with scarce failure events,posing challenges for model training due to the high dimensionality of the data and the need for domain-specific preprocessing,which frequently leads to the development of large and complex models.Inspired by the success of Large Language Models(LLMs),transformer-based foundation models have been developed for time series(TSFM).These models have been proven to reconstruct time series in a zero-shot manner,being able to capture different patterns that effectively characterize time series.This paper proposes the use of TSFM to generate embeddings of the input data space,making them more interpretable for machine learning models.To evaluate the effectiveness of our approach,we trained three classical machine learning algorithms and one neural network using the embeddings generated by the TSFM called Moment for predicting the remaining useful life of aircraft engines.We test the models trained with both the full training dataset and only 10%of the training samples.Our results show that training simple models,such as support vector regressors or neural networks,with embeddings generated by Moment not only accelerates the training process but also enhances performance in few-shot learning scenarios,where data is scarce.This suggests a promising alternative to complex deep learning architectures,particularly in industrial contexts with limited labeled data.
基金funded by the Nursing project,“Clinical ability improvement project”in the First Affliated Hospital with Nanjing Medical University(JSPH-NC-2021-09).
文摘Objectives:This study aimed to explore the characteristics of outpatient blood collection center visit fluctuation and nursing workforce allocation based on a time series model,and the application effect was evaluated.Methods:To enhance the efficiency of phlebotomy at the hospital outpatient window and improve patient satisfaction,the First Affliated Hospital with Nanjing Medical University implemented a time series analysis model in 2024 to optimize nursing staff allocation.The management team was led by a head nurse of the outpatient blood collection department with extensive experience.It included one director of the nursing department,six senior clinical nurses,one informatics expert,and one nursing master's degree holder.Retrospective time-series data from the hospital's smart blood collection system(including hourly blood collection volumes and waiting times)were extracted between January 2020 and December 2023.Time series analysis was used to identify annual,seasonal,monthly,and hourly variation patterns in blood collection volumes.Seasonal decomposition and the Autoregressive Integrated Moving Average Model(ARIMA)were employed to forecast blood collection fluctuations for 2024 and facilitate dynamic scheduling.A comparison was conducted to evaluate differences in blood collection efficiency and patient satisfaction before(January-June 2023)and after(January-June 2024)implementing the dynamic scheduling model based on the time series analysis and forecasting.Results:Visit volumes showed periodicity and slow growth,peaking every second and third quarter of the year and daily at 8:00-9:00 a.m.and 2:00-3:00 p.m.The ARIMA model demonstrated a good fit(R2=0.692,mean absolute percentage error=8.28%).After adjusting the nursing staff allocation based on the fluctuation characteristics of the number of phlebotomy per hour in the time series analysis model,at the peak period of the blood collection window,at least three nurses,one mobile nurse and two volunteers were added.The number of phlebotomy per hour increased from 289.74±54.55 to 327.53±37.84 person-time(t=-10.041,P<0.01),waiting time decreased from 5.79±2.68 to 4.01±0.46 min(t=11.531,P<0.01),and satisfaction rose from 92.7%to 97.3%(χ^(2)=6.877,P<0.05).Conclusions:Based on the time series analysis method,it is helpful for nursing managers to accurately allocate human resources and optimize the efficiency of outpatient service resources by mining the special change rule of the outpatient blood collection window and predicting the future fluctuation trend.
文摘Load time series analysis is critical for resource management and optimization decisions,especially automated analysis techniques.Existing research has insufficiently interpreted the overall characteristics of samples,leading to significant differences in load level detection conclusions for samples with different characteristics(trend,seasonality,cyclicality).Achieving automated,feature-adaptive,and quantifiable analysis methods remains a challenge.This paper proposes a Threshold Recognition-based Load Level Detection Algorithm(TRLLD),which effectively identifies different load level regions in samples of arbitrary size and distribution type based on sample characteristics.By utilizing distribution density uniformity,the algorithm classifies data points and ultimately obtains normalized load values.In the feature recognition step,the algorithm employs the Density Uniformity Index Based on Differences(DUID),High Load Level Concentration(HLLC),and Low Load Level Concentration(LLLC)to assess sample characteristics,which are independent of specific load values,providing a standardized perspective on features,ensuring high efficiency and strong interpretability.Compared to traditional methods,the proposed approach demonstrates better adaptive and real-time analysis capabilities.Experimental results indicate that it can effectively identify high load and low load regions in 16 groups of time series samples with different load characteristics,yielding highly interpretable results.The correlation between the DUID and sample density distribution uniformity reaches 98.08%.When introducing 10% MAD intensity noise,the maximum relative error is 4.72%,showcasing high robustness.Notably,it exhibits significant advantages in general and low sample scenarios.
文摘To address the global issue of climate change and create focused mitigation plans,accurate CO_(2)emissions forecasting is essential.Using CO_(2)emissions data from 1990 to 2023,this study assesses the predicting performance of five sophisticated models:Random Forest(RF),XGBoost,Support Vector Regression(SVR),Long Short-Term Memory networks(LSTM),and ARIMA To give a thorough evaluation of the models’performance,measures including Mean Absolute Error(MAE),Root Mean Square Error(RMSE),and Mean Absolute Percentage Error(MAPE)are used.To guarantee dependable model implementation,preprocessing procedures are carried out,such as feature engineering and stationarity tests.Machine learning models outperform ARIMA in identifying complex patterns and long-term associations,but ARIMA does better with data that exhibits strong linear trends.These results provide important information about how well the model fits various forecasting scenarios,which helps develop data-driven carbon reduction programs.Predictive modeling should be incorporated into sustainable climate policy to encourage the adoption of low-carbon technologies and proactive decisionmaking.Achieving long-term environmental sustainability requires strengthening carbon trading systems,encouraging clean energy investments,and enacting stronger emission laws.In line with international climate goals,suggestions for lowering CO_(2)emissions include switching to renewable energy,increasing energy efficiency,and putting afforestation initiatives into action.
基金supported in part by the National Key R&D Program of China under Grant 2024YFB4707900the National Natural Science Foundation of China under Grant 91948302 and Grant 52021003.
文摘Tendon-driven robots have distinct advantages in high-dynamic performance motion and high-degree-of-freedom manipulation.However,these robots face challenges related to control complexity,intricate tendon drive paths,and tendon slackness.In this study,the authors present a novel modular tendon-driven actuator design that integrates a series elastic element.The actuator incorporates a unique magnetic position sensing technology that enables observation of the length and tension of the tendon and features an exceptionally compact design.The modular architecture of the tendon-driven actuator addresses the complexity of tendon drive paths,while the tension observation functionality mitigates slackness issues.The design and modeling of the actuator are described in this paper,and a series of tests are conducted to validate the simulation model and to test the performance of the proposed actuator.The model can be used for training robot control neural networks based on simulation,thereby overcoming the challenges associated with controlling tendon-driven robots.