Remaining useful life prediction of rolling bearings is vital for guaranteeing safety and reliability. In engineering scenarios, only a small amount of bearing performance degradation data can be obtained through accelerated life testing. In the absence of lifetime data, the hidden long-term correlations within performance degradation data are challenging to mine effectively, which is the main factor restricting the prediction precision and engineering application of residual life prediction methods. To address this problem, a novel method based on a multi-layer perceptron neural network and a bidirectional long short-term memory network is proposed. Firstly, a nonlinear health indicator (HI) calculation method based on kernel principal component analysis (KPCA) and the exponentially weighted moving average (EWMA) is designed. Then, using the raw vibration data and the HI, a multi-layer perceptron (MLP) neural network is trained to calculate the HI of the online bearing in real time. Furthermore, a bidirectional long short-term memory model (BiLSTM) optimized by particle swarm optimization (PSO) is used to mine the time-series features of the HI and predict the remaining service life. Performance verification and comparative experiments are carried out on the XJTU-SY open bearing dataset. The results indicate that the method has an excellent ability to predict the future HI and remaining life.
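As an illustration of the KPCA-plus-EWMA health indicator step described above, the following minimal Python sketch builds an HI from a matrix of bearing degradation features; the RBF kernel, the gamma value, the smoothing factor, and the feature layout are assumptions made for illustration, not details taken from the paper.

```python
# Sketch: nonlinear health indicator (HI) from degradation features via KPCA + EWMA.
# Kernel choice, gamma, and smoothing factor alpha are illustrative assumptions.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import KernelPCA

def health_indicator(features, alpha=0.2):
    """features: (n_samples, n_features) degradation features ordered in time."""
    scaled = StandardScaler().fit_transform(np.asarray(features, dtype=float))
    # Keep the first kernel principal component as the raw nonlinear indicator.
    kpca = KernelPCA(n_components=1, kernel="rbf", gamma=0.1)
    raw_hi = kpca.fit_transform(scaled).ravel()
    # Exponentially weighted moving average to suppress random fluctuation.
    hi = np.empty_like(raw_hi)
    hi[0] = raw_hi[0]
    for t in range(1, len(raw_hi)):
        hi[t] = alpha * raw_hi[t] + (1 - alpha) * hi[t - 1]
    return hi
```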
Deep learning plays a vital role in real-life applications such as object identification, human face recognition, speech recognition, biometric identification, and short- and long-term forecasting of data. The main objective of our work is to predict the market performance of the Dhaka Stock Exchange (DSE) on the daily closing price using different deep learning techniques. In this study, we used the LSTM (Long Short-Term Memory) network to forecast DSE data for the convenience of shareholders. We trained LSTM networks on the data and forecast the future time series, which was then compared against the test data. We computed the Root Mean Square Error (RMSE) to quantify the discrepancy between the forecast values and the test data, and reduced this error by updating the LSTM networks. As a consequence of this refinement of the network, the LSTM network provides excellent performance, outperforming existing works in predicting stock market prices.
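A minimal sketch of the windowed LSTM forecasting and RMSE evaluation described above is given below; the 30-day window, layer width, and training settings are assumptions, not values reported in the study.

```python
# Sketch: windowed LSTM forecaster for daily closing prices with RMSE evaluation.
# Window length, layer size, and training settings are illustrative assumptions.
import numpy as np
from tensorflow import keras

def make_windows(series, lag=30):
    series = np.asarray(series, dtype=float)
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]
    return X[..., None], y  # X shape: (samples, lag, 1)

def train_and_evaluate(train_prices, test_prices, lag=30):
    X_tr, y_tr = make_windows(train_prices, lag)
    X_te, y_te = make_windows(test_prices, lag)
    model = keras.Sequential([
        keras.layers.LSTM(50, input_shape=(lag, 1)),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X_tr, y_tr, epochs=50, batch_size=32, verbose=0)
    pred = model.predict(X_te, verbose=0).ravel()
    rmse = np.sqrt(np.mean((pred - y_te) ** 2))
    return model, rmse
```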
A fast-charging policy is widely employed to alleviate the inconvenience caused by the extended charging time of electric vehicles. However, fast charging exacerbates battery degradation and shortens battery lifespan. In addition, there is still a lack of tailored health estimation for fast-charging batteries; most existing methods are applicable only at lower charging rates. This paper proposes a novel method for estimating the health of lithium-ion batteries, tailored for multi-stage constant current-constant voltage fast-charging policies. Initially, short charging segments are extracted by monitoring current switches, followed by deriving voltage sequences using interpolation techniques. Subsequently, a graph generation layer is used to transform the voltage sequence into graphical data. Furthermore, the integration of a graph convolution network with a long short-term memory network enables the extraction of information related to inter-node message transmission, capturing the key local and temporal features of the battery degradation process. Finally, the method is validated using aging data from 185 cells and 81 distinct fast-charging policies. A 4-minute charging duration achieves a balance between high accuracy in estimating battery state of health and low data requirements, with mean absolute error and root mean square error of 0.34% and 0.66%, respectively.
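The segment extraction and voltage interpolation step described above can be sketched as follows; the current-switch threshold, the number of resampled points, and the helper name are assumptions, and the output is only a plausible input format for the downstream graph layer.

```python
# Sketch: cut a multi-stage fast-charging record at current switches and resample
# each segment's voltage onto a fixed-length grid. Thresholds are assumptions.
import numpy as np

def extract_voltage_segments(current, voltage, n_points=32, switch_tol=0.5):
    """current, voltage: 1-D arrays sampled at a constant rate."""
    current = np.asarray(current, dtype=float)
    voltage = np.asarray(voltage, dtype=float)
    # A "switch" is any step change in current larger than switch_tol (A).
    switch_idx = np.where(np.abs(np.diff(current)) > switch_tol)[0] + 1
    bounds = np.concatenate(([0], switch_idx, [len(current)]))
    segments = []
    for a, b in zip(bounds[:-1], bounds[1:]):
        if b - a < 2:
            continue
        # Interpolate the segment's voltage onto n_points equally spaced samples.
        grid = np.linspace(0, b - a - 1, n_points)
        segments.append(np.interp(grid, np.arange(b - a), voltage[a:b]))
    return np.array(segments)  # (n_segments, n_points)
```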
Stress changes due to changes in fluid pressure and temperature in a faulted formation may lead to the opening or shearing of the fault. This can be caused by subsurface (geo)engineering activities such as fluid injection and geologic disposal of nuclear waste. Such activities are expected to increase in the future, making it necessary to assess their short- and long-term safety. Here, a new machine learning (ML) approach to model pore pressure and fault displacements in response to high-pressure fluid injection cycles is developed. The focus is on fault behavior near the injection borehole. To capture the temporal dependencies in the data, long short-term memory (LSTM) networks are utilized. To prevent error accumulation within the forecast window, four critical measures for training a robust LSTM model for predicting fault response are highlighted: (i) setting an appropriate value of the LSTM lag, (ii) calibrating the LSTM cell dimension, (iii) reducing the learning rate during weight optimization, and (iv) not adopting an independent injection cycle as a validation set. Several numerical experiments were conducted, which demonstrated that the ML model can capture the peaks in pressure and associated fault displacement that accompany an increase in fluid injection. The model also captured the decay in pressure and displacement during the injection shut-in period. Further, the ability of the ML model to highlight key changes in fault hydromechanical activation processes was investigated, which shows that ML can be used to monitor the risk of fault activation and leakage during high-pressure fluid injections.
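Two of the four measures listed above, the lag window (i) and learning-rate reduction during training (iii), can be illustrated with the hedged Keras sketch below; the lag of 20, cell dimension of 64, and schedule values are assumptions rather than the calibrated values from the study.

```python
# Sketch: lagged-window LSTM for pressure/displacement forecasting with learning-rate
# reduction on plateau. Lag, cell dimension, and schedule values are assumptions.
import numpy as np
from tensorflow import keras

def build_lagged_data(signal, lag=20):
    signal = np.asarray(signal, dtype=float)
    X = np.array([signal[i:i + lag] for i in range(len(signal) - lag)])
    return X[..., None], signal[lag:]

def train_fault_response_model(signal, lag=20, cells=64):
    X, y = build_lagged_data(signal, lag)
    model = keras.Sequential([
        keras.layers.LSTM(cells, input_shape=(lag, 1)),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    # Reduce the learning rate when the training loss plateaus (measure iii).
    reduce_lr = keras.callbacks.ReduceLROnPlateau(monitor="loss", factor=0.5, patience=5)
    model.fit(X, y, epochs=100, batch_size=64, callbacks=[reduce_lr], verbose=0)
    return model
```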
An accurate landslide displacement prediction is an important part of a landslide warning system. Considering the dynamic characteristics of landslide evolution and the shortcomings of traditional static prediction models, this paper proposes a dynamic prediction model of landslide displacement based on singular spectrum analysis (SSA) and a stacked long short-term memory (SLSTM) network. The SSA is used to decompose the accumulated landslide displacement time series into trend-term and periodic-term displacement subsequences. A cubic polynomial function is used to predict the trend-term displacement subsequence, and the SLSTM neural network is used to predict the periodic-term displacement subsequence. The Bayesian optimization algorithm is used to determine that the SLSTM network input sequence length is 12 and the number of hidden layer nodes is 18. The SLSTM network is updated by adding predicted values to the training set to achieve dynamic displacement prediction. Finally, the accumulated landslide displacement is obtained by superimposing the predicted values of the displacement subsequences. The proposed model was verified on the Xintan landslide in Hubei Province, China. The results show that, when predicting the periodic-term displacement, the SLSTM network has higher prediction accuracy than the support vector machine (SVM) and autoregressive integrated moving average (ARIMA) models: the mean relative error (MRE) is reduced by 4.099% and 3.548%, respectively, while the root mean square error (RMSE) is reduced by 5.830 mm and 3.854 mm, respectively. It is concluded that the SLSTM network model can better simulate the dynamic characteristics of landslides.
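The SSA decomposition into trend and periodic terms mentioned above can be sketched as follows; the window length and the grouping of singular components are assumptions, since the abstract does not report them.

```python
# Sketch: singular spectrum analysis (SSA) splitting a displacement series into a
# trend term and a periodic term. Window length and grouping are assumptions.
import numpy as np

def ssa_decompose(series, window=12, n_trend=1):
    series = np.asarray(series, dtype=float)
    N = len(series)
    K = N - window + 1
    # Trajectory (Hankel) matrix and its SVD.
    X = np.column_stack([series[i:i + window] for i in range(K)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)

    def reconstruct(indices):
        Xr = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in indices)
        # Diagonal (Hankel) averaging back to a 1-D series.
        out, counts = np.zeros(N), np.zeros(N)
        for r in range(window):
            for c in range(K):
                out[r + c] += Xr[r, c]
                counts[r + c] += 1
        return out / counts

    trend = reconstruct(range(n_trend))
    periodic = reconstruct(range(n_trend, window))
    return trend, periodic  # trend -> cubic polynomial fit, periodic -> SLSTM
```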
There are two technical challenges in predicting slope deformation. The first is the random displacement, which cannot be decomposed and predicted by numerically resolving the observed accumulated displacement and time series of a landslide. The second is the dynamic evolution of a landslide, which cannot feasibly be simulated by traditional prediction models. In this paper, a dynamic displacement prediction model is introduced for composite landslides based on a combination of empirical mode decomposition with soft screening stop criteria (SSSC-EMD) and a deep bidirectional long short-term memory (DBi-LSTM) neural network. In the proposed model, time series analysis and SSSC-EMD are used to decompose the observed accumulated displacements of a slope into three components, viz. trend displacement, periodic displacement, and random displacement. Then, by analyzing the evolution pattern of a landslide and the key factors triggering landslides, appropriate influencing factors are selected for each displacement component, and a DBi-LSTM neural network is used to carry out multi-data-driven dynamic prediction for each displacement component. The accumulated displacement prediction is obtained by summing the components. For accuracy verification and engineering practicability of the model, field observations from two known landslides in China, the Xintan landslide and the Bazimen landslide, were collected for comparison and evaluation. The case study verified that the model proposed in this paper can better characterize the "stepwise" deformation characteristics of a slope. Compared with the long short-term memory (LSTM) neural network, support vector machine (SVM), and autoregressive integrated moving average (ARIMA) model, the DBi-LSTM neural network has higher accuracy in predicting the periodic displacement of slope deformation, with the mean absolute percentage error reduced by 3.063%, 14.913%, and 13.960%, respectively, and the root mean square error reduced by 1.951 mm, 8.954 mm and 7.790 mm, respectively. Conclusively, this model not only has high prediction accuracy but is also more stable, which can provide new insight for practical landslide prevention and control engineering.
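A hedged sketch of the DBi-LSTM regressor for a single displacement component is shown below; the two-layer depth, layer widths, and the number of influencing factors are assumptions used purely for illustration.

```python
# Sketch: deep bidirectional LSTM (two stacked Bi-LSTM layers) regressing one
# displacement component from several influencing factors. Sizes are assumptions.
from tensorflow import keras

def build_dbilstm(n_steps, n_factors):
    return keras.Sequential([
        keras.layers.Bidirectional(
            keras.layers.LSTM(64, return_sequences=True),
            input_shape=(n_steps, n_factors)),
        keras.layers.Bidirectional(keras.layers.LSTM(32)),
        keras.layers.Dense(1),  # predicted displacement component
    ])

model = build_dbilstm(n_steps=12, n_factors=4)
model.compile(optimizer="adam", loss="mse")
```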
A Holter monitor usually records electrocardiogram (ECG) signals for more than 24 hours to capture short-lived cardiac abnormalities. In view of the large amount of Holter data and the fact that the normal part accounts for the majority, it is reasonable to design an algorithm that can automatically eliminate normal data segments as much as possible without missing any abnormal data segments, and then pass the remaining segments to doctors or computer programs for further diagnosis. In this paper, we propose a preliminary abnormal segment screening method for Holter data. Based on long short-term memory (LSTM) networks, a prediction model is established and trained with the normal data of a monitored subject. Then, on the basis of kernel density estimation, we learn the distribution of prediction errors obtained by applying the trained LSTM model to the normal data. Based on these, a preliminary abnormal ECG segment screening analysis is carried out without R-wave detection. Experiments on the MIT-BIH arrhythmia database show that, under the condition that no abnormal point is missed, 53.89% of normal segments can be effectively eliminated. This work can greatly reduce the workload of subsequent processing.
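The kernel-density step on the prediction errors described above can be sketched as follows; the retained-probability level and the per-segment screening rule are assumptions, not the thresholds used in the paper.

```python
# Sketch: fit a kernel density estimate to LSTM prediction errors on normal ECG
# and derive a screening threshold. The retained-probability level is an assumption.
import numpy as np
from scipy.stats import gaussian_kde

def error_threshold(errors, keep_prob=0.999, grid_size=2000):
    """errors: 1-D prediction errors of the trained LSTM on normal segments."""
    errors = np.asarray(errors, dtype=float)
    kde = gaussian_kde(errors)
    grid = np.linspace(errors.min(), errors.max() * 2, grid_size)
    cdf = np.cumsum(kde(grid))
    cdf /= cdf[-1]
    # Smallest error value whose estimated CDF reaches keep_prob.
    return grid[min(np.searchsorted(cdf, keep_prob), grid_size - 1)]

def screen(segment_errors, threshold):
    # A segment is kept for further diagnosis if any error exceeds the threshold.
    return np.any(np.asarray(segment_errors) > threshold, axis=-1)
```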
Based on data from the Jilin Water Diversion Tunnels from the Songhua River (China), an improved real-time prediction method, optimized by multiple algorithms, for tunnel boring machine (TBM) cutter-head torque is presented. Firstly, a function excluding invalid and abnormal data is established to distinguish the TBM operating state, and a feature selection method based on the SelectKBest algorithm is proposed. Accordingly, the ten features most closely related to the cutter-head torque are selected as input variables, which, in descending order of influence, are the sum of motor torque, cutter-head power, sum of motor power, sum of motor current, advance rate, cutter-head pressure, total thrust force, penetration rate, cutter-head rotational velocity, and field penetration index. Secondly, the structure of a real-time cutter-head torque prediction model is developed, based on a bidirectional long short-term memory (BLSTM) network integrating the dropout algorithm to prevent overfitting. Then, an algorithm to optimize the model hyperparameters based on Bayesian optimization and cross-validation is proposed. Early stopping and checkpoint algorithms are integrated to optimize the training process. Finally, a BLSTM-based real-time cutter-head torque prediction model is developed, which fully utilizes the previous time-series tunneling information. The mean absolute percentage error (MAPE) of the model in the verification section is 7.3%, implying that the presented model is suitable for real-time cutter-head torque prediction. Furthermore, an incremental learning method based on the above base model is introduced to improve the adaptability of the model during TBM tunneling. Comparison of the prediction performance between the base and incremental learning models in the same tunneling section shows that: (1) the MAPE of the predicted results of the BLSTM-based real-time cutter-head torque prediction model remains below 10%, and both the coefficient of determination (R^2) and correlation coefficient (r) between measured and predicted values exceed 0.95; and (2) the incremental learning method is suitable for real-time cutter-head torque prediction and can effectively improve the prediction accuracy and generalization capacity of the model during the excavation process.
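The SelectKBest feature-ranking step described above can be sketched as follows; the f_regression score function and the column names are assumptions for illustration, since the abstract only names the algorithm and the selected features.

```python
# Sketch: rank TBM operating features against cutter-head torque with SelectKBest.
# Using f_regression as the score function is an assumption; column names are hypothetical.
import pandas as pd
from sklearn.feature_selection import SelectKBest, f_regression

def select_torque_features(df: pd.DataFrame, target="cutter_head_torque", k=10):
    X = df.drop(columns=[target])
    y = df[target]
    selector = SelectKBest(score_func=f_regression, k=k).fit(X, y)
    scores = pd.Series(selector.scores_, index=X.columns).sort_values(ascending=False)
    return scores.head(k).index.tolist(), scores
```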
Load forecasting can increase the efficiency of modern energy systems with built-in measuring systems by providing more accurate peak power shaving and thus more reliable control. An analysis of an integrated CO2 heat pump and chiller system with a hot water storage system is presented in this paper. Drastic power fluctuations, which can be reduced with load forecasting, are found in the historical operation records. A model that forecasts the ventilation system heating demand is therefore established on the basis of a long short-term memory (LSTM) network. The model can successfully forecast the power one hour ahead using records of the past 48 h of system operation data and the ambient temperature. The mean absolute percentage error (MAPE) of the forecast results of the LSTM-based model is 10.70%, which is respectively 2.2% and 7.25% better than the MAPEs of the forecast results of the support vector regression based and persistence method based models.
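The 48-hour input window and the MAPE metric used above can be sketched as follows; the two-feature input (power plus ambient temperature) follows the abstract, while hourly sampling of both series is an assumption.

```python
# Sketch: build 48-hour input windows from power and ambient-temperature records
# to forecast the next hour's heating demand, and score the forecast with MAPE.
import numpy as np

def build_windows(power, temperature, lookback=48):
    power = np.asarray(power, dtype=float)
    temperature = np.asarray(temperature, dtype=float)
    X, y = [], []
    for t in range(lookback, len(power)):
        X.append(np.column_stack([power[t - lookback:t],
                                  temperature[t - lookback:t]]))
        y.append(power[t])            # one-hour-ahead target
    return np.array(X), np.array(y)   # X: (samples, 48, 2)

def mape(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))
```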
In this paper, the recurrent neural network structure of a bidirectional long short-term memory network (Bi-LSTM), with special memory cells that store information, is used to characterize the deep features of the variation pattern between logging and seismic data. A mapping relationship model between high-frequency logging data and low-frequency seismic data is established via nonlinear mapping. The seismic waveform is progressively approximated using the logging curve in the low-frequency band to obtain a nonlinear mapping model at this scale, which then approaches the logging curve in the high-frequency band in a stepwise fashion. Finally, a seismic-inversion method of nonlinear-mapping multilevel well-seismic matching based on the Bi-LSTM network is developed. The characteristic of this method is that, by applying the multilevel well-seismic matching process, the seismic data are matched stepwise to the scale range that is consistent with the logging curve. Further, the matching operator at each level can be obtained stably, effectively overcoming the problems that occur in the well-seismic matching process, such as the inconsistent scale of the two types of data, the accuracy of extracting the seismic wavelet from the well-side seismic traces, and the multiplicity of solutions. A model test and practical application demonstrate that this method improves the vertical resolution of the inversion results while the boundary and lateral characteristics of the sand body are well maintained, improving the accuracy of thin-layer sand body prediction and achieving an improved practical application effect.
The problems in equipment fault detection include data dimension explosion, computational complexity, and low detection accuracy. To solve these problems, a device anomaly detection algorithm based on an enhanced long short-term memory (LSTM) network is proposed. The algorithm first reduces the dimensionality of the device sensor data by principal component analysis (PCA), extracting the strongly correlated variables among the multidimensional sensor data with the lowest possible information loss, and then uses the enhanced stacked LSTM to predict the extracted temporal data, thus improving the accuracy of anomaly detection. To improve the efficiency of anomaly detection, a genetic algorithm (GA) is used to adjust the magnitude of the enhancements made to the LSTM model. Validation on actual data from pumps shows that the algorithm significantly improves the recall rate and detection speed of device anomaly detection, with a recall rate of 97.07%, which indicates that the algorithm is effective and efficient for device anomaly detection in an actual production environment.
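The PCA preprocessing step described above can be sketched as follows; the 95% retained-variance threshold and the standardization step are assumptions, since the abstract does not state how many components are kept.

```python
# Sketch: PCA step that compresses multidimensional pump-sensor data before the
# enhanced LSTM. The 95% retained-variance threshold is an assumption.
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

def reduce_sensor_data(sensor_matrix, variance_to_keep=0.95):
    """sensor_matrix: (n_timesteps, n_sensors) raw readings."""
    scaled = StandardScaler().fit_transform(sensor_matrix)
    pca = PCA(n_components=variance_to_keep)   # keep enough PCs for 95% variance
    reduced = pca.fit_transform(scaled)
    return reduced, pca.explained_variance_ratio_
```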
To address the problem of underwater sound speed profile (SSP) inversion in underwater acoustic multipath channels, this paper combines deep learning and ray theory to propose an inversion method using a long short-term memory (LSTM) network. Based on the equidistant characteristics of the horizontal line array, the proposed method takes as input the sensing matrix composed of multi-modal data, such as time difference of arrival and angle of arrival, and utilizes the ability of the LSTM network to process time-series data to mine the correlations between spatially ordered receiving array elements for sound speed profile inversion. On this basis, a time delay estimation method based on hard thresholding and the cross-correlation function is proposed to reduce the measurement errors of the sensing matrix and improve the anti-multipath performance. The feasibility and accuracy of the proposed method are verified through numerical simulations. Compared with a traditional optimization algorithm, the proposed algorithm better captures the nonlinear characteristics of the SSP, with higher inversion accuracy and stronger noise resistance.
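A minimal sketch of the hard-threshold-plus-cross-correlation delay estimate mentioned above is given below; the k-sigma threshold rule is an assumption, as the abstract does not specify how the hard threshold is chosen.

```python
# Sketch: time-delay estimate between two receiving elements using hard-threshold
# denoising followed by cross-correlation. The threshold rule is an assumption.
import numpy as np
from scipy.signal import correlate

def time_delay(sig_a, sig_b, fs, k_sigma=3.0):
    """Returns the delay of sig_b relative to sig_a in seconds."""
    def hard_threshold(x):
        x = np.asarray(x, dtype=float)
        thr = k_sigma * np.std(x)
        out = x.copy()
        out[np.abs(out) < thr] = 0.0   # suppress low-level multipath/noise
        return out
    a, b = hard_threshold(sig_a), hard_threshold(sig_b)
    xcorr = correlate(b, a, mode="full")
    lag = np.argmax(xcorr) - (len(a) - 1)
    return lag / fs
```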
Complicated loads encountered by floating offshore wind turbines (FOWTs) in real sea conditions are crucial for future design optimization, but obtaining data on them directly poses a challenge. To address this issue, we applied machine learning techniques to obtain the hydrodynamic and aerodynamic loads of FOWTs by measuring platform motion responses and wave-elevation sequences. First, a computational fluid dynamics (CFD) simulation model of the floating platform was established based on the dynamic fluid body interaction technique and overset grid technology. Then, a long short-term memory (LSTM) neural network model was constructed and trained to learn the nonlinear relationship between the wave and platform-motion inputs and the hydrodynamic-load outputs. The optimal model was determined after analyzing the sensitivity of parameters such as sample characteristics, network layers, and neuron numbers. Subsequently, the effectiveness of the hydrodynamic load model was validated under different simulation conditions, and the aerodynamic load calculation was completed based on the D'Alembert principle. Finally, we built a hybrid-scale FOWT model, based on the software-in-the-loop strategy, in which the wind turbine was replaced by an actuation system. Model tests were carried out in a wave basin, and the results demonstrated that the root mean square errors of the hydrodynamic and aerodynamic load measurements were 4.20% and 10.68%, respectively.
The increasingly severe coal burst disaster has emerged as a critical factor constraining coal mine safety production, and enhancing the accuracy of coal burst disaster prediction has become a challenging task. To address the insufficient exploration of the spatio-temporal characteristics of microseismic data and the difficulty of selecting the optimal time window size in spatio-temporal prediction, this paper integrates deep learning methods and theory to propose a novel coal burst spatio-temporal prediction method based on a Bidirectional Long Short-Term Memory (Bi-LSTM) network. The method involves three main modules: construction of microseismic spatio-temporal characteristic indicators, a temporal prediction model, and a spatial prediction model. To validate the effectiveness of the proposed method, engineering application tests are conducted at a high-risk working face in the Ordos mining area of Inner Mongolia, focusing on 13 high-energy microseismic events with energy levels greater than 10^5 J. In terms of temporal prediction, the analysis indicates that the results consist of 10 strong predictions and 3 medium predictions, with no false alarm detected throughout the entire testing period. Moreover, compared to the traditional threshold-based coal burst temporal prediction method, the accuracy of the proposed method is increased by 38.5%. In terms of spatial prediction, the distribution of spatial prediction results for high-energy events comprises 6 strong hazard predictions, 3 medium hazard predictions, and 4 weak hazard predictions.
Surface EMG contains a great deal of physiological information reflecting the intention of human movement. Gesture recognition from surface EMG has received wide attention in the fields of human-computer interaction and rehabilitation. At present, most studies on gesture recognition based on surface EMG signals treat gestures as discrete, separated classes, ignoring continuous natural motion. A gesture recognition method for surface EMG based on an improved long short-term memory network is proposed. sEMG sensors are rationally arranged according to physiological structure and muscle function. In this paper, finger curvature is used to describe the gesture state, and the gesture at every moment can be represented by the set of finger curvatures, so as to realize continuous gesture recognition. Finally, the proposed gesture recognition model is tested on Ninapro (a large gesture recognition database). The results show that the proposed method can effectively improve the representation mining ability of the surface EMG signal and provide a reference for deep learning modeling of human gesture recognition.
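The continuous gesture representation described above (a curvature value per finger at every moment) can be sketched as a multi-output LSTM regressor; the window length, channel count, and layer sizes are assumptions, not values from the paper.

```python
# Sketch: LSTM mapping a window of multi-channel sEMG to the curvature of each
# finger, giving a continuous gesture representation. Sizes are assumptions.
from tensorflow import keras

def build_curvature_regressor(window=200, n_channels=8, n_fingers=5):
    model = keras.Sequential([
        keras.layers.LSTM(128, input_shape=(window, n_channels)),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(n_fingers),   # one curvature value per finger
    ])
    model.compile(optimizer="adam", loss="mse")
    return model
```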
Joint operation optimization for electric vehicles (EVs) and on-site or adjacent photovoltaic generation (PVG) is pivotal to maintaining the security and economics of the operation of the power system concerned. Conventional offline optimization algorithms lack real-time applicability due to the uncertainties involved in the charging service of an EV charging station (EVCS). Firstly, an optimization model for a real-time EV charging strategy is proposed to address these challenges, which accounts for the environmental uncertainties of an EVCS, encompassing EV arrivals, charging demands, PVG outputs, and the electricity price. Then, a scenario-based two-stage optimization approach is formulated. The scenarios of the underlying uncertain environmental factors are generated by a Bayesian long short-term memory (B-LSTM) network. Finally, numerical results substantiate the efficacy of the proposed optimization approach and demonstrate superior profitability compared with prevalent approaches.
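The abstract does not say how the B-LSTM is realized; one common approximation, used purely as an assumption in the sketch below, is an LSTM with Monte Carlo dropout, where repeated stochastic forward passes yield a set of scenarios for an uncertain factor such as PVG output.

```python
# Sketch: scenario generation with an approximately Bayesian LSTM via Monte Carlo
# dropout. Treating MC dropout as the "Bayesian" mechanism is an assumption.
import numpy as np
from tensorflow import keras

def build_mc_dropout_lstm(n_steps, n_features):
    inputs = keras.Input(shape=(n_steps, n_features))
    x = keras.layers.LSTM(64)(inputs)
    x = keras.layers.Dropout(0.3)(x, training=True)  # keep dropout active at inference
    outputs = keras.layers.Dense(1)(x)
    return keras.Model(inputs, outputs)

def generate_scenarios(model, history_window, n_scenarios=100):
    # Repeat the same history; each batch element gets an independent dropout mask,
    # so each prediction is one sampled scenario for the next-step value.
    batch = np.repeat(history_window[None, ...], n_scenarios, axis=0)
    return model.predict(batch, verbose=0).ravel()
```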
This paper presents a long short-term memory (LSTM)-based fault detection method to detect multiple open-circuit switch faults of modular multilevel converter (MMC) systems with full-bridge sub-modules (FB-SMs). Eighteen sensor signals of grid voltages, grid currents, and capacitance voltages of the MMC for single- and multi-switch faults are collected as sampling data. The output signal characteristics of four types of single switch faults of an FB-SM, as well as double switch faults in the same and different phases of the MMC, are analyzed under load variations and control command changes. A multi-layer LSTM network is devised to deeply extract the fault characteristics of the MMC under different faults and operating conditions, and a Softmax layer detects the fault types. Simulation results confirm that the proposed LSTM-based method has better detection performance than three other methods: K-nearest neighbor (KNN), Naive Bayes (NB), and recurrent neural network (RNN). In addition, it is highly robust to model uncertainties and Gaussian noise. The validity of the proposed method is further demonstrated by experimental studies conducted on a hardware-in-the-loop (HIL) testing platform.
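The multi-layer LSTM plus Softmax classifier described above can be sketched as follows; the 18-channel input follows the abstract, while the window length, layer widths, and number of fault classes are assumptions.

```python
# Sketch: multi-layer LSTM classifier over the 18 MMC sensor channels with a
# softmax output for fault types. Layer widths and window length are assumptions.
from tensorflow import keras

def build_fault_detector(n_steps=100, n_signals=18, n_fault_types=7):
    model = keras.Sequential([
        keras.layers.LSTM(64, return_sequences=True, input_shape=(n_steps, n_signals)),
        keras.layers.LSTM(32),
        keras.layers.Dense(n_fault_types, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```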
Solar flares are violent solar outbursts that have a great influence on the space environment surrounding Earth, potentially disrupting the ionosphere and interfering with the geomagnetic field, thus causing magnetic storms. Consequently, it is very important to accurately predict the time period of solar flares. This paper proposes a flare prediction model based on physical images of active solar regions. We employ X-ray flux curves recorded directly by the Geostationary Operational Environmental Satellite as input data for the model, allowing us to largely avoid the influence of accidental errors and effectively improve the prediction efficiency of the model. A model based on the X-ray flux curve can predict whether there will be a flare event within 24 hours. Conversely, the peak of the X-ray flux curve can be used to verify whether a flare has occurred within the past 24 hours. The True Positive Rate and False Positive Rate of the prediction model based on physical images of active regions are 0.6070 and 0.2410 respectively, and the accuracy and True Skill Statistic are 0.7590 and 0.5556. Our model can effectively improve prediction efficiency compared with models based on the physical parameters of active regions or magnetic field records, providing a simple method for solar flare prediction.
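For reference, the skill scores quoted above follow directly from the binary confusion matrix; the sketch below computes them for flare/no-flare labels and adds nothing beyond standard definitions.

```python
# Sketch: TPR, FPR, accuracy, and True Skill Statistic from binary flare predictions.
import numpy as np

def flare_scores(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true, bool), np.asarray(y_pred, bool)
    tp = np.sum(y_true & y_pred)
    fn = np.sum(y_true & ~y_pred)
    fp = np.sum(~y_true & y_pred)
    tn = np.sum(~y_true & ~y_pred)
    tpr = tp / (tp + fn)                  # True Positive Rate
    fpr = fp / (fp + tn)                  # False Positive Rate
    acc = (tp + tn) / (tp + tn + fp + fn)
    tss = tpr - fpr                       # True Skill Statistic
    return {"TPR": tpr, "FPR": fpr, "accuracy": acc, "TSS": tss}
```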
Developing efficient neural network (NN) computing systems is crucial in the era of artificial intelligence (AI). Traditional von Neumann architectures suffer from both the "memory wall" and the "power wall", limiting the data transfer between memory and processing units [1,2]. Compute-in-memory (CIM) technologies, particularly analogue CIM with memristor crossbars, are promising because of their high energy efficiency, computational parallelism, and integration density for NN computations [3]. In practical applications, analogue CIM excels in tasks like speech recognition and image classification, revealing its unique advantages. For instance, it efficiently processes vast amounts of audio data in speech recognition, achieving high accuracy with minimal power consumption. In image classification, the high parallelism of analogue CIM significantly speeds up feature extraction and reduces processing time. With the booming development of AI applications, the demands for computational accuracy and task complexity are rising continually. However, analogue CIM systems are limited in handling complex regression tasks that need precise floating-point (FP) calculations; they are primarily suited for classification tasks with low data precision and a limited dynamic range [4].
Intelligent maintenance of roads and highways requires accurate deterioration evaluation and performance prediction of asphalt pavement. To this end, we develop a time-series long short-term memory (LSTM) model to predict key performance indicators (PIs) of pavement, namely the international roughness index (IRI) and rutting depth (RD). Subsequently, we propose a comprehensive performance indicator for the pavement quality index (PQI), which leverages the highway performance assessment standard method, the entropy weight method, and the fuzzy comprehensive evaluation method. This indicator can evaluate the overall performance condition of the pavement. The data used for the model development and analysis are extracted from tests on two full-scale accelerated test tracks, called MnRoad and RIOHTrack. Six variables are used as predictors, including temperature, precipitation, total traffic volume, asphalt surface layer thickness, pavement age, and maintenance condition. Furthermore, wavelet denoising is performed to analyze the impact of missing or abnormal data on the LSTM model accuracy. In comparison to a traditional autoregressive integrated moving average (ARIMAX) model, the proposed LSTM model performs better in terms of PI prediction and resiliency to noise. Finally, the overall prediction accuracy of our proposed performance indicator PQI is 93.8%.
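The entropy weight method used when fusing indicators into the PQI can be sketched as follows; min-max normalization of each indicator column is an assumption, since the abstract does not state the normalization scheme.

```python
# Sketch: entropy weight method for weighting pavement indicators (e.g. IRI, RD)
# before fuzzy comprehensive evaluation. Min-max normalization is an assumption.
import numpy as np

def entropy_weights(indicator_matrix):
    """indicator_matrix: (n_samples, n_indicators) observed indicator values."""
    X = np.asarray(indicator_matrix, dtype=float)
    # Min-max normalize each indicator column to [0, 1].
    X = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0) + 1e-12)
    P = X / (X.sum(axis=0) + 1e-12)                  # column-wise proportions
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)
    entropy = -np.sum(P * logP, axis=0) / np.log(n)  # entropy of each indicator
    d = 1.0 - entropy                                # degree of divergence
    return d / d.sum()                               # entropy weights
```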