Deep learning plays a vital role in real-life applications such as object identification, human face recognition, speech recognition, biometric identification, and short- and long-term data forecasting. The main objective of our work is to predict the market performance of the Dhaka Stock Exchange (DSE) from the day's closing price using different deep learning techniques. In this study, we used an LSTM (Long Short-Term Memory) network to forecast DSE data for the convenience of shareholders. We trained LSTM networks on the data and used them to forecast future time series, which were then compared against the test data. We computed the Root Mean Square Error (RMSE) between the forecasted values and the test data, and reduced this error by updating the LSTM networks. As a result of this refinement, the LSTM network delivers strong performance and outperforms existing work on stock market price prediction.
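The RMSE criterion used above to compare forecasts against test data can be sketched as follows. This is an illustrative computation only, not the paper's implementation, and the price values are hypothetical:

```python
import math

def rmse(forecast, actual):
    """Root Mean Square Error between forecasted and observed values."""
    if len(forecast) != len(actual):
        raise ValueError("series must have the same length")
    n = len(actual)
    return math.sqrt(sum((f - a) ** 2 for f, a in zip(forecast, actual)) / n)

# Hypothetical closing prices; the paper's DSE data is not reproduced here.
predicted = [105.0, 107.5, 106.0, 108.2]
observed = [104.0, 108.0, 106.5, 107.2]
error = rmse(predicted, observed)
```

A smaller RMSE after retraining is the signal the authors use to decide that an updated network configuration is an improvement.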
Solar flares are violent solar outbursts which have a great influence on the space environment surrounding Earth, potentially causing disruption of the ionosphere and interference with the geomagnetic field, thus causing magnetic storms. Consequently, it is very important to accurately predict the time period of solar flares. This paper proposes a flare prediction model based on physical images of active solar regions. We employ X-ray flux curves recorded directly by the Geostationary Operational Environmental Satellite as input data for the model, allowing us to largely avoid the influence of accidental errors and effectively improving the model's prediction efficiency. A model based on the X-ray flux curve can predict whether there will be a flare event within 24 hours. The reverse can also be verified from the peak of the X-ray flux curve to see whether a flare has occurred within the past 24 hours. The True Positive Rate and False Positive Rate of the prediction model based on physical images of active regions are 0.6070 and 0.2410 respectively, and the accuracy and True Skill Statistic are 0.7590 and 0.5556. Our model can effectively improve prediction efficiency compared with models based on the physical parameters of active regions or magnetic field records, providing a simple method for solar flare prediction.
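The evaluation metrics reported above can be derived from a binary confusion matrix. The sketch below uses the common definition TSS = TPR − FPR (definitions of skill scores vary between studies); the counts are hypothetical, not the paper's evaluation set:

```python
def skill_scores(tp, fp, tn, fn):
    """True Positive Rate, False Positive Rate, accuracy and True Skill
    Statistic for a binary flare / no-flare forecast."""
    tpr = tp / (tp + fn)                   # fraction of real flares predicted
    fpr = fp / (fp + tn)                   # fraction of quiet periods falsely flagged
    acc = (tp + tn) / (tp + fp + tn + fn)  # overall accuracy
    tss = tpr - fpr                        # True Skill Statistic (one common form)
    return tpr, fpr, acc, tss

# Hypothetical 24 h forecast outcomes.
tpr, fpr, acc, tss = skill_scores(tp=60, fp=20, tn=80, fn=40)
```

TSS is popular in flare forecasting because, unlike accuracy, it is insensitive to the strong class imbalance between flaring and quiet periods.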
The remaining useful life prediction of rolling bearings is vital for guaranteeing safety and reliability. In engineering scenarios, only a small amount of bearing performance degradation data can be obtained through accelerated life testing. In the absence of lifetime data, the hidden long-term correlation between performance degradation data is challenging to mine effectively, which is the main factor restricting the prediction precision and engineering application of residual life prediction methods. To address this problem, a novel method based on a multi-layer perceptron neural network and a bidirectional long short-term memory network is proposed. Firstly, a nonlinear health indicator (HI) calculation method based on kernel principal component analysis (KPCA) and the exponentially weighted moving average (EWMA) is designed. Then, using the raw vibration data and HI, a multi-layer perceptron (MLP) neural network is trained to calculate the HI of the online bearing in real time. Furthermore, a bidirectional long short-term memory model (BiLSTM) optimized by particle swarm optimization (PSO) is used to mine the time-series features of the HI and predict the remaining service life. Performance verification and comparative experiments are carried out on the XJTU-SY open bearing dataset. The results indicate that this method has an excellent ability to predict future HI and remaining life.
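The EWMA stage of the health-indicator pipeline smooths a noisy raw indicator into a monotonic-looking degradation trend. A minimal sketch of that smoothing step (the smoothing constant `alpha` and the series are illustrative assumptions, not the paper's settings):

```python
def ewma(series, alpha=0.3):
    """Exponentially weighted moving average used to smooth a raw health
    indicator; alpha controls how fast old observations are forgotten."""
    smoothed = []
    current = series[0]  # initialize at the first observation
    for x in series:
        current = alpha * x + (1 - alpha) * current
        smoothed.append(current)
    return smoothed
```

In the paper's pipeline this smoothed HI, rather than the raw KPCA output, is what the MLP learns to reproduce online.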
Complicated loads encountered by floating offshore wind turbines (FOWTs) in real sea conditions are crucial for future design optimization, but obtaining data on them directly poses a challenge. To address this issue, we applied machine learning techniques to obtain hydrodynamic and aerodynamic loads of FOWTs by measuring platform motion responses and wave-elevation sequences. First, a computational fluid dynamics (CFD) simulation model of the floating platform was established based on the dynamic fluid body interaction technique and overset grid technology. Then, a long short-term memory (LSTM) neural network model was constructed and trained to learn the nonlinear relationship between the waves, platform-motion inputs, and hydrodynamic-load outputs. The optimal model was determined after analyzing the sensitivity of parameters such as sample characteristics, network layers, and neuron numbers. Subsequently, the effectiveness of the hydrodynamic load model was validated under different simulation conditions, and the aerodynamic load calculation was completed based on the D'Alembert principle. Finally, we built a hybrid-scale FOWT model, based on the software-in-the-loop strategy, in which the wind turbine was replaced by an actuation system. Model tests were carried out in a wave basin and the results demonstrated that the root mean square errors of the hydrodynamic and aerodynamic load measurements were 4.20% and 10.68%, respectively.
Intelligent maintenance of roads and highways requires accurate deterioration evaluation and performance prediction of asphalt pavement. To this end, we develop a time-series long short-term memory (LSTM) model to predict key performance indicators (PIs) of pavement, namely the international roughness index (IRI) and rutting depth (RD). Subsequently, we propose a comprehensive performance indicator, the pavement quality index (PQI), which leverages the highway performance assessment standard method, the entropy weight method, and the fuzzy comprehensive evaluation method. This indicator can evaluate the overall performance condition of the pavement. The data used for model development and analysis are extracted from tests on two full-scale accelerated test tracks, MnRoad and RIOHTrack. Six variables are used as predictors: temperature, precipitation, total traffic volume, asphalt surface layer thickness, pavement age, and maintenance condition. Furthermore, wavelet denoising is performed to analyze the impact of missing or abnormal data on the LSTM model accuracy. In comparison to a traditional autoregressive integrated moving average with exogenous variables (ARIMAX) model, the proposed LSTM model performs better in terms of PI prediction and resiliency to noise. Finally, the overall prediction accuracy of our proposed performance indicator PQI is 93.8%.
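The entropy weight method mentioned above assigns objective weights to indicators (such as IRI and RD) based on how much their values vary across samples. A minimal sketch under the usual textbook formulation (the data matrix is hypothetical, and real use requires positive, comparably scaled indicator values with at least one varying column):

```python
import math

def entropy_weights(matrix):
    """Entropy weight method: rows are samples (pavement sections), columns are
    indicators; returns one objective weight per indicator column."""
    m, n = len(matrix), len(matrix[0])
    entropies = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]  # normalized share of each sample
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m)
        entropies.append(e)
    d = [1 - e for e in entropies]    # degree of divergence per indicator
    s = sum(d)                        # must be nonzero: some column must vary
    return [di / s for di in d]
```

An indicator whose values are identical across samples carries no discriminating information and receives (near) zero weight.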
This research focuses on fault detection and health monitoring of high-power thyristor converters. Given the critical role of the thyristor converter in a nuclear fusion system, a method based on a long short-term memory (LSTM) neural network model is proposed to monitor the operational state of the converter and accurately detect faults as they occur. By sampling and processing a large volume of thyristor converter operation data, the LSTM model is trained to identify and detect abnormal states, and the power supply health status is monitored. Compared with traditional methods, the LSTM model shows higher accuracy and better abnormal-state detection ability. The experimental results show that this method can effectively improve the reliability and safety of the thyristor converter, and provide a strong guarantee for the stable operation of the nuclear fusion reactor.
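A common way to turn an LSTM predictor into an anomaly detector, as described here, is to flag samples whose prediction residual is unusually large. The sketch below shows one standard thresholding rule (mean plus k standard deviations of the residuals); it is an illustrative assumption, not the paper's exact detection logic:

```python
def detect_anomalies(actual, predicted, k=3.0):
    """Return indices of samples whose |actual - predicted| residual exceeds
    the residual mean plus k standard deviations."""
    residuals = [abs(a - p) for a, p in zip(actual, predicted)]
    n = len(residuals)
    mean = sum(residuals) / n
    std = (sum((r - mean) ** 2 for r in residuals) / n) ** 0.5
    threshold = mean + k * std
    return [i for i, r in enumerate(residuals) if r > threshold]
```

The threshold multiplier `k` trades false alarms against missed faults, which is exactly the accuracy/detection-ability balance the abstract reports.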
A fast-charging policy is widely employed to alleviate the inconvenience caused by the extended charging time of electric vehicles. However, fast charging exacerbates battery degradation and shortens battery lifespan. In addition, there is still a lack of tailored health estimations for fast-charging batteries; most existing methods are applicable only at lower charging rates. This paper proposes a novel method for estimating the health of lithium-ion batteries, tailored for multi-stage constant current-constant voltage fast-charging policies. Initially, short charging segments are extracted by monitoring current switches, followed by deriving voltage sequences using interpolation techniques. Subsequently, a graph generation layer is used to transform the voltage sequence into graphical data. Furthermore, the integration of a graph convolution network with a long short-term memory network enables the extraction of information related to inter-node message transmission, capturing the key local and temporal features of the battery degradation process. Finally, the method is confirmed using aging data from 185 cells and 81 distinct fast-charging policies. A 4-minute charging duration achieves a balance between high accuracy in estimating battery state of health and low data requirements, with mean absolute and root mean square errors of 0.34% and 0.66%, respectively.
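The interpolation step described above resamples an irregularly timed charging segment onto a fixed-length voltage sequence so that every segment has the same shape for the downstream network. A minimal linear-interpolation sketch (a stand-in for whatever interpolation scheme the paper actually uses; the sample values are hypothetical):

```python
def interpolate_voltage(times, volts, num_points):
    """Linearly resample a charging segment (times, volts) onto num_points
    equally spaced time instants between the first and last sample."""
    t0, t1 = times[0], times[-1]
    step = (t1 - t0) / (num_points - 1)
    out = []
    k = 0
    for i in range(num_points):
        t = t0 + i * step
        # advance k so that times[k] .. times[k+1] brackets t
        while k < len(times) - 2 and times[k + 1] < t:
            k += 1
        frac = (t - times[k]) / (times[k + 1] - times[k])
        out.append(volts[k] + frac * (volts[k + 1] - volts[k]))
    return out
```

Fixed-length sequences like this are what the graph generation layer then converts into node features.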
With the advancement of artificial intelligence, traffic forecasting is gaining more and more interest for optimizing route planning and enhancing service quality. Traffic volume is an influential parameter for planning and operating traffic structures. This study proposes an improved ensemble-based deep learning method to solve traffic volume prediction problems. A set of optimal hyperparameters is also applied to the suggested approach to improve the performance of the learning process. The fusion of these methodologies aims to harness ensemble empirical mode decomposition's capacity to discern complex traffic patterns and long short-term memory's proficiency in learning temporal relationships. Firstly, a dataset for automatic vehicle identification is obtained and utilized in the preprocessing stage of the ensemble empirical mode decomposition model. The second aspect involves predicting traffic volume using the long short-term memory algorithm. Next, the study employs a trial-and-error approach to select a set of optimal hyperparameters, including the lookback window, the number of neurons in the hidden layers, and the gradient descent optimizer. Finally, the fusion of the obtained results leads to a final traffic volume prediction. The experimental results show that the proposed method outperforms other benchmarks on various evaluation measures, including mean absolute error, root mean squared error, mean absolute percentage error, and R-squared. The achieved R-squared value reaches an impressive 98%, while the other evaluation indices surpass those of the competing methods. These findings highlight the accuracy of traffic pattern prediction. Consequently, this offers promising prospects for enhancing transportation management systems and urban infrastructure planning.
The increasingly severe state of coal burst disasters has emerged as a critical factor constraining coal mine safety production, and enhancing the accuracy of coal burst disaster prediction has become a challenging task. To address the insufficient exploration of the spatio-temporal characteristics of microseismic data and the challenging selection of the optimal time window size in spatio-temporal prediction, this paper integrates deep learning methods and theory to propose a novel coal burst spatio-temporal prediction method based on a Bidirectional Long Short-Term Memory (Bi-LSTM) network. The method involves three main modules: construction of microseismic spatio-temporal characteristic indicators, a temporal prediction model, and a spatial prediction model. To validate the effectiveness of the proposed method, engineering application tests are conducted at a high-risk working face in the Ordos mining area of Inner Mongolia, focusing on 13 high-energy microseismic events with energy levels greater than 10^5 J. In terms of temporal prediction, the analysis indicates that the results consist of 10 strong predictions and 3 medium predictions, with no false alarm detected throughout the entire testing period. Moreover, compared to the traditional threshold-based coal burst temporal prediction method, the accuracy of the proposed method is increased by 38.5%. In terms of spatial prediction, the distribution of spatial prediction results for high-energy events comprises 6 strong hazard predictions, 3 medium hazard predictions, and 4 weak hazard predictions.
Stress changes due to changes in fluid pressure and temperature in a faulted formation may lead to the opening or shearing of the fault. This can be caused by subsurface (geo)engineering activities such as fluid injections and geologic disposal of nuclear waste. Such activities are expected to rise in the future, making it necessary to assess their short- and long-term safety. Here, a new machine learning (ML) approach to model pore pressure and fault displacements in response to high-pressure fluid injection cycles is developed. The focus is on fault behavior near the injection borehole. To capture the temporal dependencies in the data, long short-term memory (LSTM) networks are utilized. To prevent error accumulation within the forecast window, four critical measures for training a robust LSTM model for predicting fault response are highlighted: (i) setting an appropriate value of the LSTM lag, (ii) calibrating the LSTM cell dimension, (iii) reducing the learning rate during weight optimization, and (iv) not adopting an independent injection cycle as a validation set. Several numerical experiments were conducted, which demonstrated that the ML model can capture the peaks in pressure and associated fault displacement that accompany an increase in fluid injection. The model also captured the decay in pressure and displacement during the injection shut-in period. Further, the ability of the ML model to highlight key changes in fault hydromechanical activation processes was investigated, showing that ML can be used to monitor the risk of fault activation and leakage during high-pressure fluid injections.
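Measure (i), choosing the LSTM lag, amounts to deciding how many past samples form each training input window. The usual windowing transform can be sketched as follows (illustrative only; the paper's preprocessing details are not reproduced):

```python
def make_windows(series, lag):
    """Split a time series (e.g. pore pressure) into (input window, next value)
    pairs; `lag` is the look-back length fed to the LSTM."""
    xs, ys = [], []
    for i in range(len(series) - lag):
        xs.append(series[i:i + lag])  # lag consecutive past samples
        ys.append(series[i + lag])    # the value to predict
    return xs, ys
```

A lag that is too short misses the injection-cycle dynamics, while one that is too long wastes capacity and data, which is why the authors flag it as a critical tuning choice.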
Breast cancer is a significant threat not only to women but to the entire global population. With recent advancements in digital pathology, hematoxylin and eosin images provide enhanced clarity in examining microscopic features of breast tissues based on their staining properties. Early cancer detection accelerates the therapeutic process, thereby increasing survival rates. The analysis performed by medical professionals, especially pathologists, is time-consuming and challenging, so there is a need for automated breast cancer detection systems. Emerging artificial intelligence platforms, especially deep learning models, play an important role in image diagnosis and prediction. Initially, histopathology biopsy images are taken from standard data sources. The gathered images are then given as input to a Multi-Scale Dilated Vision Transformer, where the essential features are acquired. Subsequently, the features are passed to a Bidirectional Long Short-Term Memory (Bi-LSTM) network for classifying the breast cancer disorder. The efficacy of the model is evaluated using diverse metrics. When compared with other methods, the proposed work offers impressive detection results.
Wind power generation is among the most promising and eco-friendly energy sources today. Wind Power Forecasting (WPF) is essential for boosting energy efficiency and maintaining the operational stability of power grids. However, predicting wind power comes with significant challenges, such as weather uncertainties, wind variability, complex terrain, limited data, insufficient measurement infrastructure, intricate interdependencies, and short lead times. These factors make it difficult to accurately forecast wind behavior and respond to sudden power output changes. This study aims to precisely forecast electricity generation from wind turbines, minimize grid operation uncertainties, and enhance grid reliability. It leverages historical wind farm data and Numerical Weather Prediction data, using k-Nearest Neighbors for pre-processing, K-means clustering for categorization, and Long Short-Term Memory (LSTM) networks for training and testing, with model performance evaluated across multiple metrics. The Grey Wolf Optimized (GWO) LSTM classification technique, a deep learning model suited to time series analysis, effectively handles temporal dependencies in input data through memory cells and gradient-based optimization. Inspired by grey wolves’ hunting strategies, GWO is a population-based metaheuristic optimization algorithm known for its strong performance across diverse optimization tasks. The proposed Grey Wolf Optimized Deep Learning model achieves an R-squared value of 0.97279, demonstrating that it explains 97.28% of the variance in wind power data. This model surpasses a reference study that achieved an R-squared value of 0.92 with a hybrid deep learning approach but did not account for outliers or anomalous data.
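The K-means categorization stage described above groups similar operating conditions before forecasting. A bare-bones sketch of k-means on scalar features (illustrative only; the paper's model clusters full multi-variable weather records, and the values and initial centers below are hypothetical):

```python
def kmeans_1d(values, centers, iters=20):
    """Plain k-means on scalar features (e.g. wind speed): assign each value to
    its nearest center, then move each center to its group's mean."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for v in values:
            j = min(range(len(centers)), key=lambda j: abs(v - centers[j]))
            groups[j].append(v)
        # empty groups keep their previous center
        centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
    return centers
```

Training one LSTM per cluster (or conditioning on the cluster label) lets the forecaster specialize for distinct wind regimes.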
Based on data from the Jilin Water Diversion Tunnels from the Songhua River (China), an improved real-time prediction method optimized by multiple algorithms for tunnel boring machine (TBM) cutter-head torque is presented. Firstly, a function excluding invalid and abnormal data is established to distinguish the TBM operating state, and a feature selection method based on the SelectKBest algorithm is proposed. Accordingly, ten features that are most closely related to the cutter-head torque are selected as input variables, which, in descending order of influence, include the sum of motor torque, cutter-head power, sum of motor power, sum of motor current, advance rate, cutter-head pressure, total thrust force, penetration rate, cutter-head rotational velocity, and field penetration index. Secondly, the structure of a real-time cutter-head torque prediction model is developed, based on the bidirectional long short-term memory (BLSTM) network integrating the dropout algorithm to prevent overfitting. Then, an algorithm to optimize the model's hyperparameters based on Bayesian optimization and cross-validation is proposed. Early stopping and checkpoint algorithms are integrated to optimize the training process. Finally, a BLSTM-based real-time cutter-head torque prediction model is developed, which fully utilizes the previous time-series tunneling information. The mean absolute percentage error (MAPE) of the model in the verification section is 7.3%, implying that the presented model is suitable for real-time cutter-head torque prediction. Furthermore, an incremental learning method based on the above base model is introduced to improve the adaptability of the model during TBM tunneling. Comparison of the prediction performance between the base and incremental learning models in the same tunneling section shows that: (1) the MAPE of the predicted results of the BLSTM-based real-time cutter-head torque prediction model remains below 10%, and both the coefficient of determination (R^2) and correlation coefficient (r) between measured and predicted values exceed 0.95; and (2) the incremental learning method is suitable for real-time cutter-head torque prediction and can effectively improve the prediction accuracy and generalization capacity of the model during the excavation process.
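The SelectKBest-style feature selection described above scores each candidate feature against the target and keeps the top k. The sketch below uses absolute Pearson correlation as the score, a simplified stand-in for scikit-learn's scoring functions; the feature values are hypothetical:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def select_k_best(features, target, k):
    """Rank named features by |correlation| with the target (e.g. cutter-head
    torque) and keep the k best names."""
    scored = sorted(features.items(), key=lambda kv: -abs(pearson(kv[1], target)))
    return [name for name, _ in scored[:k]]
```

Ranking by score is also what produces the "descending order of influence" list of ten features quoted in the abstract.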
The fraction defective of semi-finished products is predicted to optimize the process of relay production lines, by which production quality and productivity are increased and costs are decreased. The process parameters of relay production lines are studied based on a long short-term memory (LSTM) network. Then, the Keras deep learning framework is utilized to build a short-term relay quality prediction algorithm for the semi-finished product. A simulation model is used to study the prediction algorithm. The simulation results show that the average absolute prediction error of the fraction defective is less than 5%. This work displays great application potential in relay production lines.
The problems in equipment fault detection include data dimension explosion, computational complexity, low detection accuracy, etc. To solve these problems, a device anomaly detection algorithm based on an enhanced long short-term memory (LSTM) network is proposed. The algorithm first reduces the dimensionality of the device sensor data by principal component analysis (PCA), extracting the strongly correlated variables among the multidimensional sensor data with the lowest possible information loss, and then uses an enhanced stacked LSTM to predict the extracted temporal data, thus improving the accuracy of anomaly detection. To improve the efficiency of the anomaly detection, a genetic algorithm (GA) is used to adjust the magnitude of the enhancements made to the LSTM model. Validation on actual data from pumps shows that the algorithm significantly improves the recall rate and the detection speed of device anomaly detection, with a recall rate of 97.07%, which indicates that the algorithm is effective and efficient for device anomaly detection in an actual production environment.
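The PCA stage finds directions of maximum variance in the sensor data. As a minimal stand-in for a full PCA (not the paper's implementation), the leading principal component of mean-centered data can be found by power iteration on the covariance matrix; the assumption here is that the data are not all identical, so the covariance matrix is nonzero:

```python
def first_pc(data, iters=200):
    """Leading principal component of a list of equal-length rows, via power
    iteration on the sample covariance matrix."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    x = [[row[j] - means[j] for j in range(d)] for row in data]  # center
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]  # renormalize each step
    return v
```

Projecting each sensor vector onto the top few such components yields the low-dimensional sequence that the stacked LSTM then predicts.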
A Long Short-Term Memory (LSTM) Recurrent Neural Network (RNN) has driven tremendous improvements over an acoustic model based on a Gaussian Mixture Model (GMM). However, models based on this hybrid method require a force-aligned Hidden Markov Model (HMM) state sequence obtained from the GMM-based acoustic model, and therefore require a long computation time to train both the GMM-based acoustic model and a deep learning-based acoustic model. To solve this problem, an acoustic model using the Connectionist Temporal Classification (CTC) algorithm is proposed. The CTC algorithm does not require the GMM-based acoustic model because it does not use a force-aligned HMM state sequence. However, previous work on LSTM RNN-based acoustic models using CTC used only small-scale training corpora. In this paper, an LSTM RNN-based acoustic model using CTC is trained on a large-scale training corpus and its performance is evaluated. The implemented acoustic model achieves Word Error Rates (WER) of 6.18% and 15.01% for clean speech and noisy speech, respectively, which is similar to the performance of the acoustic model based on the hybrid method.
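The WER figures quoted above follow the standard definition: the word-level edit distance (substitutions, insertions, deletions) between the recognized hypothesis and the reference transcript, divided by the reference length. A self-contained sketch with example sentences of our own (not from the paper's corpus):

```python
def word_error_rate(reference, hypothesis):
    """WER = (substitutions + insertions + deletions) / reference word count,
    computed with the standard dynamic-programming edit distance over words."""
    r, h = reference.split(), hypothesis.split()
    # dp[i][j]: edit distance between first i reference and first j hypothesis words
    dp = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        dp[i][0] = i
    for j in range(len(h) + 1):
        dp[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution or match
    return dp[len(r)][len(h)] / len(r)
```

Note that WER can exceed 100% when the hypothesis contains many insertions relative to a short reference.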
A correct and timely fault diagnosis is important for improving the safety and reliability of chemical processes. With the advancement of big data technology, data-driven fault diagnosis methods are being extensively used and still have considerable potential. In recent years, methods based on deep neural networks have made significant breakthroughs, and fault diagnosis methods for industrial processes based on deep learning have attracted considerable research attention. Therefore, we propose a fusion deep-learning algorithm based on a fully convolutional neural network (FCN) to extract features and build models that correctly diagnose all types of faults. We use long short-term memory (LSTM) units to extend our proposed FCN so that the model can better extract the time-domain features of chemical process data. We also introduce the attention mechanism into the model, aimed at highlighting the importance of features, which is significant for the fault diagnosis of chemical processes with many features. When applied to the benchmark Tennessee Eastman process, our proposed model exhibits impressive performance, demonstrating the effectiveness of the attention-based LSTM FCN in chemical process fault diagnosis.
To explore new operational wave forecasting methods, a forecasting model for wave heights at three stations in the Bohai Sea has been developed. This model is based on a long short-term memory (LSTM) neural network with sea surface wind and wave heights as training samples. The prediction performance of the model is evaluated, and the error analysis shows that, when using the same set of numerically predicted sea surface wind as input, the prediction error produced by the proposed LSTM model at Sta. N01 is 20%, 18% and 23% lower than that of conventional numerical wave models in terms of the total root mean square error (RMSE), scatter index (SI) and mean absolute error (MAE), respectively. In particular, for significant wave heights in the range of 3-5 m, the prediction accuracy of the LSTM model improves most remarkably, with RMSE, SI and MAE all decreasing by 24%. It is also evident that the number of hidden neurons, the number of buoys used, and the time length of the training samples all affect the prediction accuracy. However, the prediction does not necessarily improve with an increase in the number of hidden neurons or buoys used. The experiment trained on the data with the longest time length is found to perform best overall compared to experiments with shorter training periods. Overall, the long short-term memory neural network proved to be a very promising method for future development and application in wave forecasting.
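The three error measures used above can be computed together. The sketch below uses one common definition of the scatter index, SI = RMSE divided by the mean of the observations (definitions vary slightly across the wave-modelling literature), and hypothetical wave heights:

```python
import math

def wave_errors(pred, obs):
    """Return (RMSE, SI, MAE) for predicted vs. observed wave heights;
    SI here is RMSE normalized by the mean observation."""
    n = len(obs)
    rmse = math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / n)
    si = rmse / (sum(obs) / n)
    mae = sum(abs(p - o) for p, o in zip(pred, obs)) / n
    return rmse, si, mae
```

Because SI is dimensionless, it allows the error comparison across stations with different typical wave heights that the abstract reports.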
Numerical simulation and slope stability prediction are the focus of slope disaster research. Recently, machine learning models have been commonly used for slope stability prediction. However, these machine learning models have some problems, such as poor nonlinear performance, convergence to local optima, and incomplete feature extraction of influencing factors, which can affect the accuracy of slope stability prediction. Therefore, a deep learning algorithm, long short-term memory (LSTM), is proposed here to predict slope stability. Taking Ganzhou City in China as the study area, the landslide inventory and the characteristics of geotechnical parameters, slope height, and slope angle are analyzed. Based on these characteristics, typical soil slopes are constructed using the Geo-Studio software. Five control factors affecting slope stability, including slope height, slope angle, internal friction angle, cohesion, and volumetric weight, are selected to form different slopes and construct the model input variables. Then, the limit equilibrium method is used to calculate the stability coefficients of these typical soil slopes under different control factors. Each slope stability coefficient and its corresponding control factors constitute one slope sample. As a result, a total of 2160 training samples and 450 testing samples are constructed. These sample sets are imported into the LSTM for modelling and compared with the support vector machine (SVM), random forest (RF) and convolutional neural network (CNN). The results show that the LSTM overcomes the difficulty that commonly used machine learning models have in extracting global features. Furthermore, the LSTM has better prediction performance for slope stability than the SVM, RF and CNN models.
文摘Deep learning plays a vital role in real-life applications, for example object identification, human face recognition, speech recognition, biometrics identification, and short and long-term forecasting of data. The main objective of our work is to predict the market performance of the Dhaka Stock Exchange (DSE) on day closing price using different Deep Learning techniques. In this study, we have used the LSTM (Long Short-Term Memory) network to forecast the data of DSE for the convenience of shareholders. We have enforced LSTM networks to train data as well as forecast the future time series that has differentiated with test data. We have computed the Root Mean Square Error (RMSE) value to scrutinize the error between the forecasted value and test data that diminished the error by updating the LSTM networks. As a consequence of the renovation of the network, the LSTM network provides tremendous performance which outperformed the existing works to predict stock market prices.
基金partially supported by the National Key R&D Program of China (2022YFE0133700)the National Natural Science Foundation of China(12273007)+4 种基金the Guizhou Provincial Excellent Young Science and Technology Talent Program (YQK[2023]006)the National SKA Program of China (2020SKA0110300)the National Natural Science Foundation of China(11963003)the Guizhou Provincial Basic Research Program (Natural Science)(ZK[2022]143)the Cultivation project of Guizhou University ([2020]76).
Abstract: Solar flares are violent solar outbursts that strongly influence the space environment surrounding Earth, potentially disrupting the ionosphere and interfering with the geomagnetic field, thereby causing magnetic storms. Consequently, it is very important to accurately predict the timing of solar flares. This paper proposes a flare prediction model based on physical images of active solar regions. We employ X-ray flux curves recorded directly by the Geostationary Operational Environmental Satellite as input data for the model, which largely avoids the influence of accidental errors and effectively improves prediction efficiency. A model based on the X-ray flux curve can predict whether a flare event will occur within 24 hours; conversely, the peak of the X-ray flux curve can be used to verify whether a flare occurred within the past 24 hours. The True Positive Rate and False Positive Rate of the prediction model based on physical images of active regions are 0.6070 and 0.2410, respectively, and the accuracy and True Skill Statistic are 0.7590 and 0.5556. Our model can effectively improve prediction efficiency compared with models based on the physical parameters of active regions or magnetic field records, providing a simple method for solar flare prediction.
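The four scores quoted above are standard functions of the binary confusion matrix. A minimal sketch follows, with the conventional definition of the True Skill Statistic (TPR minus FPR); the confusion-matrix counts below are hypothetical and are not the paper's data.

```python
def flare_scores(tp, fp, tn, fn):
    """Standard skill scores for a binary flare / no-flare forecast."""
    tpr = tp / (tp + fn)                 # True Positive Rate (recall)
    fpr = fp / (fp + tn)                 # False Positive Rate
    acc = (tp + tn) / (tp + fp + tn + fn)  # accuracy
    tss = tpr - fpr                      # True Skill Statistic (conventional form)
    return tpr, fpr, acc, tss

# Hypothetical counts for illustration
tpr, fpr, acc, tss = flare_scores(tp=60, fp=24, tn=76, fn=40)
print(tpr, fpr, acc, tss)
```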
Funding: supported by the National Key Research and Development Project (Grant Number 2023YFB3709601); the National Natural Science Foundation of China (Grant Numbers 62373215, 62373219, 62073193); the Key Research and Development Plan of Shandong Province (Grant Numbers 2021CXGC010204, 2022CXGC020902); the Fundamental Research Funds of Shandong University (Grant Number 2021JCG008); and the Natural Science Foundation of Shandong Province (Grant Number ZR2023MF100).
Abstract: The prediction of the remaining useful life of rolling bearings is vital for guaranteeing safety and reliability. In engineering scenarios, only a small amount of bearing performance degradation data can be obtained through accelerated life testing. In the absence of lifetime data, the hidden long-term correlations in performance degradation data are challenging to mine effectively, which is the main factor restricting the precision and engineering application of residual life prediction methods. To address this problem, a novel method based on a multi-layer perceptron neural network and a bidirectional long short-term memory network is proposed. First, a nonlinear health indicator (HI) calculation method based on kernel principal component analysis (KPCA) and the exponentially weighted moving average (EWMA) is designed. Then, using the raw vibration data and the HI, a multi-layer perceptron (MLP) neural network is trained to compute the HI of the online bearing in real time. Furthermore, a bidirectional long short-term memory model (BiLSTM) optimized by particle swarm optimization (PSO) is used to mine the time-series features of the HI and predict the remaining service life. Performance verification and comparative experiments are carried out on the open XJTU-SY bearing dataset. The results indicate that this method has an excellent ability to predict the future HI and remaining life.
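The EWMA smoothing step in the HI construction can be sketched in a few lines. This is a generic recursive EWMA, not the paper's full KPCA+EWMA pipeline; the smoothing factor `alpha` and the input series are illustrative assumptions.

```python
def ewma(x, alpha=0.3):
    """Exponentially weighted moving average, as used to smooth a
    bearing health indicator. alpha in (0, 1] controls the memory."""
    out = []
    s = x[0]                       # initialize with the first sample
    for v in x:
        s = alpha * v + (1 - alpha) * s
        out.append(s)
    return out

# Illustrative raw health-indicator values
print(ewma([1.0, 2.0, 3.0], alpha=0.5))
```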
Funding: This work is supported by the National Key Research and Development Program of China (No. 2023YFB4203000) and the National Natural Science Foundation of China (No. U22A20178).
Abstract: The complicated loads encountered by floating offshore wind turbines (FOWTs) in real sea conditions are crucial for future design optimization, but obtaining such data directly poses a challenge. To address this issue, we applied machine learning techniques to obtain the hydrodynamic and aerodynamic loads of FOWTs by measuring platform motion responses and wave-elevation sequences. First, a computational fluid dynamics (CFD) simulation model of the floating platform was established based on the dynamic fluid body interaction technique and overset grid technology. Then, a long short-term memory (LSTM) neural network model was constructed and trained to learn the nonlinear relationship between the wave and platform-motion inputs and the hydrodynamic-load outputs. The optimal model was determined after analyzing the sensitivity of parameters such as sample characteristics, network layers, and neuron numbers. Subsequently, the effectiveness of the hydrodynamic load model was validated under different simulation conditions, and the aerodynamic load calculation was completed based on the D'Alembert principle. Finally, we built a hybrid-scale FOWT model based on the software-in-the-loop strategy, in which the wind turbine was replaced by an actuation system. Model tests were carried out in a wave basin, and the results demonstrated that the root mean square errors of the hydrodynamic and aerodynamic load measurements were 4.20% and 10.68%, respectively.
Funding: supported by the National Key Research and Development Program of China (No. 2021YFB2600300).
Abstract: Intelligent maintenance of roads and highways requires accurate deterioration evaluation and performance prediction of asphalt pavement. To this end, we develop a time-series long short-term memory (LSTM) model to predict key performance indicators (PIs) of pavement, namely the international roughness index (IRI) and rutting depth (RD). Subsequently, we propose a comprehensive performance indicator, the pavement quality index (PQI), which leverages the highway performance assessment standard method, the entropy weight method, and the fuzzy comprehensive evaluation method; this indicator evaluates the overall condition of the pavement. The data used for model development and analysis are extracted from tests on two full-scale accelerated test tracks, MnRoad and RIOHTrack. Six variables are used as predictors: temperature, precipitation, total traffic volume, asphalt surface layer thickness, pavement age, and maintenance condition. Furthermore, wavelet denoising is performed to analyze the impact of missing or abnormal data on LSTM model accuracy. Compared with a traditional autoregressive integrated moving average (ARIMAX) model, the proposed LSTM model performs better in terms of PI prediction and resilience to noise. Finally, the overall prediction accuracy of our proposed performance indicator PQI is 93.8%.
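The entropy weight method mentioned above assigns larger weights to indicators with more dispersion across samples. A minimal numpy sketch of the standard formulation follows; the two-indicator matrix is made up for illustration and the fuzzy evaluation stage is omitted.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: rows are samples, columns are indicators.
    Indicators whose values are more dispersed receive larger weights."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                     # column-wise normalization
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)  # treat 0*log(0) as 0
    E = -(P * logP).sum(axis=0) / np.log(n)   # entropy of each indicator
    d = 1.0 - E                               # degree of diversification
    return d / d.sum()

# Column 0 is constant (no information); column 1 varies
print(entropy_weights([[1.0, 1.0], [1.0, 3.0]]))
```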
Funding: supported by the Open Fund of the Magnetic Confinement Fusion Laboratory of Anhui Province (No. 2024AMF04003); the Natural Science Foundation of Anhui Province (No. 228085ME142); and the Comprehensive Research Facility for Fusion Technology (No. 20180000527301001228).
Abstract: This research focuses on fault detection and health monitoring of high-power thyristor converters. Given the critical role of the thyristor converter in nuclear fusion systems, a method based on a long short-term memory (LSTM) neural network model is proposed to monitor the operational state of the converter and accurately detect faults as they occur. By sampling and processing a large volume of thyristor converter operating data, the LSTM model is trained to identify and detect abnormal states, and the health status of the power supply is monitored. Compared with traditional methods, the LSTM model shows higher accuracy and a stronger ability to detect abnormal states. The experimental results show that this method can effectively improve the reliability and safety of the thyristor converter and provide a strong guarantee for the stable operation of nuclear fusion reactors.
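The abstract does not specify how abnormal states are flagged once the LSTM produces predictions; a common scheme is to threshold the prediction residual. The sketch below shows that thresholding step under that assumption, with the `predicted` array standing in for the LSTM output and made-up measurement values.

```python
import numpy as np

def flag_anomalies(measured, predicted, k=1.5):
    """Flag samples whose prediction residual deviates from the mean
    residual by more than k standard deviations (an assumed scheme;
    `predicted` would come from the trained LSTM)."""
    r = np.asarray(measured, dtype=float) - np.asarray(predicted, dtype=float)
    mu, sigma = r.mean(), r.std()
    return np.abs(r - mu) > k * sigma

# Illustrative data: the last sample deviates sharply from the prediction
print(flag_anomalies([1.0, 1.0, 1.0, 10.0], [1.0, 1.0, 1.0, 1.0], k=1.5))
```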
Funding: National Key Research and Development Program of China (Grant No. 2022YFE0102700); National Natural Science Foundation of China (Grant No. 52102420); the research project "Safe Da Batt" (03EMF0409A) funded by the German Federal Ministry of Digital and Transport (BMDV); China Postdoctoral Science Foundation (Grant No. 2023T160085); and the Sichuan Science and Technology Program (Grant No. 2024NSFSC0938).
Abstract: A fast-charging policy is widely employed to alleviate the inconvenience caused by the extended charging time of electric vehicles. However, fast charging exacerbates battery degradation and shortens battery lifespan. In addition, there is still a lack of tailored health estimation for fast-charging batteries; most existing methods are applicable only at lower charging rates. This paper proposes a novel method for estimating the health of lithium-ion batteries, tailored to multi-stage constant current-constant voltage fast-charging policies. Initially, short charging segments are extracted by monitoring current switches, followed by deriving voltage sequences using interpolation techniques. Subsequently, a graph generation layer is used to transform the voltage sequence into graphical data. Furthermore, the integration of a graph convolution network with a long short-term memory network enables the extraction of information related to inter-node message transmission, capturing the key local and temporal features of the battery degradation process. Finally, the method is validated using aging data from 185 cells and 81 distinct fast-charging policies. A 4-minute charging duration achieves a balance between high accuracy in estimating battery state of health and low data requirements, with mean absolute and root mean square errors of 0.34% and 0.66%, respectively.
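The interpolation step that derives fixed-length voltage sequences from irregular charging segments can be sketched with `np.interp`. This is only one plausible reading of that step; the timestamps, voltages, and target length below are illustrative assumptions.

```python
import numpy as np

def resample_voltage(t, v, n=8):
    """Resample an irregularly sampled voltage segment onto n evenly
    spaced time points (a stand-in for the interpolation step)."""
    t = np.asarray(t, dtype=float)
    v = np.asarray(v, dtype=float)
    grid = np.linspace(t[0], t[-1], n)   # uniform time grid over the segment
    return np.interp(grid, t, v)

# Hypothetical segment: timestamps in minutes, terminal voltage in volts
print(resample_voltage([0.0, 1.0, 3.0], [3.0, 3.2, 3.6], n=4))
```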
Abstract: With the advancement of artificial intelligence, traffic forecasting is gaining more and more interest for optimizing route planning and enhancing service quality. Traffic volume is an influential parameter for planning and operating traffic structures. This study proposes an improved ensemble-based deep learning method for traffic volume prediction, with a set of optimal hyperparameters applied to improve the learning process. The fusion of these methodologies harnesses the capacity of ensemble empirical mode decomposition to discern complex traffic patterns and the proficiency of long short-term memory in learning temporal relationships. First, an automatic vehicle identification dataset is obtained and used in the preprocessing stage of the ensemble empirical mode decomposition model. Second, traffic volume is predicted using the long short-term memory algorithm. Next, the study employs a trial-and-error approach to select a set of optimal hyperparameters, including the lookback window, the number of neurons in the hidden layers, and the gradient descent optimizer. Finally, the fusion of the obtained results leads to the final traffic volume prediction. The experimental results show that the proposed method outperforms other benchmarks on various evaluation measures, including mean absolute error, root mean squared error, mean absolute percentage error, and R-squared. The achieved R-squared value reaches an impressive 98%, while the other evaluation indices also surpass those of the competing methods. These findings highlight the accuracy of the traffic pattern prediction and offer promising prospects for enhancing transportation management systems and urban infrastructure planning.
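The four evaluation measures named above have standard definitions, sketched below in numpy; the traffic-volume values are invented for illustration.

```python
import numpy as np

def metrics(y, yhat):
    """MAE, RMSE, MAPE (%), and R-squared for a volume forecast."""
    y = np.asarray(y, dtype=float)
    yhat = np.asarray(yhat, dtype=float)
    mae = np.mean(np.abs(y - yhat))
    rmse = np.sqrt(np.mean((y - yhat) ** 2))
    mape = 100.0 * np.mean(np.abs((y - yhat) / y))   # assumes y has no zeros
    r2 = 1.0 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)
    return mae, rmse, mape, r2

# Hypothetical hourly volumes vs. forecasts
print(metrics([100.0, 200.0, 300.0], [110.0, 190.0, 300.0]))
```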
Funding: supported by the National Research and Development Program (2022YFC3004603); the Jiangsu Province International Collaboration Program, Key National Industrial Technology Research and Development Cooperation Projects (BZ2023050); the Natural Science Foundation of Jiangsu Province (BK20221109); and the National Natural Science Foundation of China (52274098).
Abstract: The increasingly severe coal burst disaster situation has emerged as a critical factor constraining coal mine safety production, and enhancing the accuracy of coal burst disaster prediction has become a challenging task. To address the insufficient exploration of the spatio-temporal characteristics of microseismic data and the challenging selection of the optimal time window size in spatio-temporal prediction, this paper integrates deep learning methods and theory to propose a novel coal burst spatio-temporal prediction method based on a Bidirectional Long Short-Term Memory (Bi-LSTM) network. The method involves three main modules: construction of microseismic spatio-temporal characteristic indicators, a temporal prediction model, and a spatial prediction model. To validate the effectiveness of the proposed method, engineering application tests are conducted at a high-risk working face in the Ordos mining area of Inner Mongolia, focusing on 13 high-energy microseismic events with energies greater than 10^5 J. In terms of temporal prediction, the results comprise 10 strong predictions and 3 medium predictions, with no false alarms detected throughout the entire testing period. Moreover, compared with the traditional threshold-based coal burst temporal prediction method, the accuracy of the proposed method is increased by 38.5%. In terms of spatial prediction, the results for high-energy events comprise 6 strong-hazard predictions, 3 medium-hazard predictions, and 4 weak-hazard predictions.
Funding: supported by the US Department of Energy (DOE), Office of Nuclear Energy, Spent Fuel and Waste Science and Technology Campaign, under Contract Number DE-AC02-05CH11231, and by the National Energy Technology Laboratory under award number FP00013650 at Lawrence Berkeley National Laboratory.
Abstract: Stress changes due to changes in fluid pressure and temperature in a faulted formation may lead to opening or shearing of the fault. This can result from subsurface (geo)engineering activities such as fluid injection and the geologic disposal of nuclear waste. Such activities are expected to increase in the future, making it necessary to assess their short- and long-term safety. Here, a new machine learning (ML) approach to model pore pressure and fault displacements in response to high-pressure fluid injection cycles is developed, focusing on fault behavior near the injection borehole. To capture the temporal dependencies in the data, long short-term memory (LSTM) networks are utilized. To prevent error accumulation within the forecast window, four critical measures for training a robust LSTM model to predict fault response are highlighted: (i) setting an appropriate value of the LSTM lag, (ii) calibrating the LSTM cell dimension, (iii) reducing the learning rate during weight optimization, and (iv) not adopting an independent injection cycle as the validation set. Several numerical experiments were conducted, demonstrating that the ML model can capture the peaks in pressure and the associated fault displacement that accompany an increase in fluid injection. The model also captured the decay in pressure and displacement during the injection shut-in period. Further, the ability of the ML model to highlight key changes in fault hydromechanical activation processes was investigated, showing that ML can be used to monitor the risk of fault activation and leakage during high-pressure fluid injection.
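The "LSTM lag" in measure (i) is the length of the sliding input window. A minimal sketch of how a scalar pressure series would be windowed into (input, target) pairs follows; the series values are illustrative.

```python
import numpy as np

def make_windows(series, lag):
    """Build (X, y) pairs for an LSTM: each input row holds `lag`
    consecutive values and the target is the next value."""
    series = np.asarray(series, dtype=float)
    X = np.stack([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]
    return X, y

# Illustrative pressure samples, lag (window length) of 2
X, y = make_windows([1.0, 2.0, 3.0, 4.0, 5.0], lag=2)
print(X.shape, y)
```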
Funding: the Deanship of Research and Graduate Studies at King Khalid University funded this work through the Small Group Research Project under Grant Number RGP1/261/45.
Abstract: Breast cancer poses a significant threat to the global population, affecting not only women but the population as a whole. With recent advancements in digital pathology, hematoxylin and eosin images provide enhanced clarity for examining the microscopic features of breast tissues based on their staining properties. Early cancer detection accelerates the therapeutic process, thereby increasing survival rates. The analysis performed by medical professionals, especially pathologists, is time-consuming and challenging, so there is a need for automated breast cancer detection systems. Emerging artificial intelligence platforms, especially deep learning models, play an important role in image diagnosis and prediction. Initially, histopathology biopsy images are taken from standard data sources. The gathered images are then given as input to a Multi-Scale Dilated Vision Transformer, which acquires the essential features. Subsequently, the features are passed to a Bidirectional Long Short-Term Memory (Bi-LSTM) network to classify the breast cancer disorder. The efficacy of the model is evaluated using diverse metrics, and comparison with other methods reveals that the proposed work offers impressive detection results.
文摘Wind power generation is among the most promising and eco-friendly energy sources today. Wind Power Forecasting (WPF) is essential for boosting energy efficiency and maintaining the operational stability of power grids. However, predicting wind power comes with significant challenges, such as weather uncertainties, wind variability, complex terrain, limited data, insufficient measurement infrastructure, intricate interdependencies, and short lead times. These factors make it difficult to accurately forecast wind behavior and respond to sudden power output changes. This study aims to precisely forecast electricity generation from wind turbines, minimize grid operation uncertainties, and enhance grid reliability. It leverages historical wind farm data and Numerical Weather Prediction data, using k-Nearest Neighbors for pre-processing, K-means clustering for categorization, and Long Short-Term Memory (LSTM) networks for training and testing, with model performance evaluated across multiple metrics. The Grey Wolf Optimized (GWO) LSTM classification technique, a deep learning model suited to time series analysis, effectively handles temporal dependencies in input data through memory cells and gradient-based optimization. Inspired by grey wolves’ hunting strategies, GWO is a population-based metaheuristic optimization algorithm known for its strong performance across diverse optimization tasks. The proposed Grey Wolf Optimized Deep Learning model achieves an R-squared value of 0.97279, demonstrating that it explains 97.28% of the variance in wind power data. This model surpasses a reference study that achieved an R-squared value of 0.92 with a hybrid deep learning approach but did not account for outliers or anomalous data.
Funding: financially supported by the National Natural Science Foundation of China (Grant Nos. 52074258, 41941018, and U21A20153).
Abstract: Based on data from the Jilin Water Diversion Tunnels from the Songhua River (China), an improved real-time prediction method for tunnel boring machine (TBM) cutter-head torque, optimized by multiple algorithms, is presented. First, a function excluding invalid and abnormal data is established to distinguish the TBM operating state, and a feature selection method based on the SelectKBest algorithm is proposed. Accordingly, the ten features most closely related to cutter-head torque are selected as input variables, which, in descending order of influence, are: sum of motor torque, cutter-head power, sum of motor power, sum of motor current, advance rate, cutter-head pressure, total thrust force, penetration rate, cutter-head rotational velocity, and field penetration index. Second, the structure of a real-time cutter-head torque prediction model is developed, based on a bidirectional long short-term memory (BLSTM) network integrating the dropout algorithm to prevent overfitting. An algorithm to optimize the model's hyperparameters based on Bayesian optimization and cross-validation is then proposed, and early stopping and checkpoint algorithms are integrated to optimize the training process. Finally, a BLSTM-based real-time cutter-head torque prediction model is developed, which fully utilizes the preceding time-series tunneling information. The mean absolute percentage error (MAPE) of the model in the verification section is 7.3%, implying that the presented model is suitable for real-time cutter-head torque prediction. Furthermore, an incremental learning method based on the above base model is introduced to improve the adaptability of the model during TBM tunneling. Comparison of the prediction performance between the base and incremental learning models in the same tunneling section shows that: (1) the MAPE of the predictions of the BLSTM-based real-time cutter-head torque prediction model remains below 10%, and both the coefficient of determination (R^(2)) and the correlation coefficient (r) between measured and predicted values exceed 0.95; and (2) the incremental learning method is suitable for real-time cutter-head torque prediction and can effectively improve the prediction accuracy and generalization capacity of the model during excavation.
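The paper uses scikit-learn's SelectKBest for univariate feature scoring. As a library-free illustration of the same idea, the sketch below ranks candidate features by the absolute Pearson correlation with the torque target; the tiny data matrix and feature names are hypothetical, and this is a stand-in for (not a reproduction of) the paper's scoring function.

```python
import numpy as np

def rank_features(X, y, names):
    """Rank features by |Pearson correlation| with the target, a simple
    stand-in for SelectKBest-style univariate scoring."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    scores = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
    order = np.argsort(scores)[::-1]     # highest score first
    return [names[j] for j in order]

# Hypothetical samples: column "motor_torque" tracks the target exactly
X = [[1.0, 4.0], [2.0, 1.0], [3.0, 3.0], [4.0, 2.0]]
y = [1.0, 2.0, 3.0, 4.0]
print(rank_features(X, y, ["motor_torque", "advance_rate"]))
```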
Funding: funded by the Fujian Science and Technology Key Project (Nos. 2016H6022, 2018J01099, and 2017H0037).
Abstract: The fraction defective of semi-finished products is predicted in order to optimize the process of relay production lines, increasing production quality and productivity and decreasing costs. The process parameters of relay production lines are studied based on a long short-term memory (LSTM) network. The Keras deep learning framework is then used to build a short-term quality prediction algorithm for the semi-finished relay. A simulation model is used to study the prediction algorithm, and the simulation results show that the average absolute prediction error of the fraction defective is less than 5%. This work displays great application potential for relay production lines.
Funding: National Key R&D Program of China (No. 2020YFB1707700).
Abstract: Problems in equipment fault detection include data dimension explosion, computational complexity, and low detection accuracy. To solve these problems, a device anomaly detection algorithm based on an enhanced long short-term memory (LSTM) network is proposed. The algorithm first reduces the dimensionality of the device sensor data by principal component analysis (PCA), extracting the strongly correlated variables among the multidimensional sensor data with minimal information loss, and then uses an enhanced stacked LSTM to predict the extracted temporal data, thus improving the accuracy of anomaly detection. To improve efficiency, a genetic algorithm (GA) is used to adjust the magnitude of the enhancements made to the LSTM model. Validation on actual pump data shows that the algorithm significantly improves the recall rate and detection speed of device anomaly detection, achieving a recall rate of 97.07%, which indicates that the algorithm is effective and efficient for device anomaly detection in a real production environment.
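The PCA dimensionality-reduction step can be sketched via the SVD of the centered data matrix; the three-sample sensor matrix below is illustrative, and the LSTM and GA stages are omitted.

```python
import numpy as np

def pca_reduce(X, k):
    """Project sensor data onto its first k principal components."""
    X = np.asarray(X, dtype=float)
    Xc = X - X.mean(axis=0)                      # center each sensor channel
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                         # scores on the top-k components

# Illustrative data: the second channel is twice the first, so one
# component captures all the variance
Z = pca_reduce([[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]], k=1)
print(Z.shape)
```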
Funding: supported by the Ministry of Trade, Industry & Energy (MOTIE, Korea) under the Industrial Technology Innovation Program (No. 10063424, 'Development of distant speech recognition and multi-task dialog processing technologies for in-door conversational robots').
Abstract: Long Short-Term Memory (LSTM) Recurrent Neural Networks (RNNs) have driven tremendous improvements over acoustic models based on the Gaussian Mixture Model (GMM). However, such hybrid models require a force-aligned Hidden Markov Model (HMM) state sequence obtained from the GMM-based acoustic model, and therefore a long computation time for training both the GMM-based acoustic model and the deep learning-based acoustic model. To solve this problem, an acoustic model using the Connectionist Temporal Classification (CTC) algorithm is proposed. The CTC algorithm does not require a GMM-based acoustic model because it does not use a force-aligned HMM state sequence. However, previous work on LSTM RNN-based acoustic models using CTC used small-scale training corpora. In this paper, an LSTM RNN-based acoustic model using CTC is trained on a large-scale training corpus and its performance is evaluated. The implemented acoustic model achieves a Word Error Rate (WER) of 6.18% on clean speech and 15.01% on noisy speech, which is similar to the performance of the acoustic model based on the hybrid method.
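The WER figures quoted above are word-level edit distances normalized by the reference length. A minimal sketch of the standard dynamic-programming computation follows; the example sentences are made up.

```python
def wer(reference, hypothesis):
    """Word Error Rate: word-level Levenshtein distance divided by the
    number of reference words."""
    r, h = reference.split(), hypothesis.split()
    # d[i][j]: edit distance between first i ref words and first j hyp words
    d = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        d[i][0] = i
    for j in range(len(h) + 1):
        d[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution / match
    return d[len(r)][len(h)] / len(r)

print(wer("the cat sat", "the bat sat"))  # one substitution out of three words
```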
Abstract: Correct and timely fault diagnosis is important for improving the safety and reliability of chemical processes. With the advancement of big data technology, data-driven fault diagnosis methods are being used extensively and still have considerable potential. In recent years, methods based on deep neural networks have made significant breakthroughs, and fault diagnosis methods for industrial processes based on deep learning have attracted considerable research attention. We therefore propose a fusion deep learning algorithm based on a fully convolutional neural network (FCN) to extract features and build models that correctly diagnose all types of faults. We use long short-term memory (LSTM) units to extend the proposed FCN so that the deep learning model can better extract the time-domain features of chemical process data. We also introduce an attention mechanism into the model, aimed at highlighting the importance of features, which is significant for fault diagnosis of chemical processes with many features. When applied to the benchmark Tennessee Eastman process, the proposed model exhibits impressive performance, demonstrating the effectiveness of the attention-based LSTM FCN in chemical process fault diagnosis.
Funding: The National Key R&D Program of China under contract No. 2016YFC1402103.
Abstract: To explore new operational wave forecasting methods, a forecasting model for wave heights at three stations in the Bohai Sea has been developed. The model is based on a long short-term memory (LSTM) neural network, with sea surface wind and wave heights as training samples. The prediction performance of the model is evaluated, and the error analysis shows that, when using the same set of numerically predicted sea surface winds as input, the prediction error produced by the proposed LSTM model at Station N01 is 20%, 18%, and 23% lower than that of conventional numerical wave models in terms of the total root mean square error (RMSE), scatter index (SI), and mean absolute error (MAE), respectively. In particular, for significant wave heights in the range of 3-5 m, the prediction accuracy of the LSTM model improves the most, with RMSE, SI, and MAE all decreasing by 24%. It is also evident that the number of hidden neurons, the number of buoys used, and the time length of the training samples all affect the prediction accuracy; however, the prediction does not necessarily improve as the number of hidden neurons or buoys increases. The experiment trained on the data with the longest time span performed the best overall compared with experiments using shorter training periods. Overall, the long short-term memory neural network proves to be a very promising method for future development and application in wave forecasting.
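The three error measures used above are standard in wave verification; the scatter index is conventionally the RMSE normalized by the mean observed value. A minimal sketch follows, with made-up wave heights.

```python
import numpy as np

def wave_errors(observed, predicted):
    """RMSE, scatter index (RMSE / mean observed), and MAE for wave heights."""
    o = np.asarray(observed, dtype=float)
    p = np.asarray(predicted, dtype=float)
    rmse = np.sqrt(np.mean((p - o) ** 2))
    si = rmse / o.mean()
    mae = np.mean(np.abs(p - o))
    return rmse, si, mae

# Hypothetical significant wave heights (m) vs. forecasts
print(wave_errors([2.0, 2.0, 2.0, 2.0], [2.5, 1.5, 2.5, 1.5]))
```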
Funding: funded by the National Natural Science Foundation of China (41807285).
Abstract: Numerical simulation and slope stability prediction are the focus of slope disaster research. Recently, machine learning models have commonly been used for slope stability prediction. However, these models have some problems, such as poor nonlinear performance, convergence to local optima, and incomplete extraction of the features of the controlling factors, which can affect the accuracy of slope stability prediction. Therefore, a deep learning algorithm, long short-term memory (LSTM), is applied to predict slope stability. Taking Ganzhou City in China as the study area, the landslide inventory and the characteristics of the geotechnical parameters, slope heights, and slope angles are analyzed. Based on these characteristics, typical soil slopes are constructed using the Geo-Studio software. Five control factors affecting slope stability, namely slope height, slope angle, internal friction angle, cohesion, and volumetric weight, are selected to form different slopes and construct the model input variables. The limit equilibrium method is then used to calculate the stability coefficients of these typical soil slopes under the different control factors; each stability coefficient and its corresponding control factors constitute one slope sample. As a result, a total of 2160 training samples and 450 testing samples are constructed. These sample sets are imported into the LSTM for modelling and compared with a support vector machine (SVM), random forest (RF), and convolutional neural network (CNN). The results show that the LSTM overcomes the difficulty that the commonly used machine learning models have in extracting global features, and it achieves better slope stability prediction performance than the SVM, RF, and CNN models.
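The paper computes stability coefficients with Geo-Studio's limit equilibrium analysis, which cannot be reproduced here. As a simplified illustration of how the same five control factors enter a limit equilibrium calculation, the sketch below uses the closed-form infinite-slope factor of safety, with the slope height standing in for the slip depth; this is an assumed simplification, not the paper's procedure.

```python
import math

def infinite_slope_fs(c, phi_deg, gamma, h, beta_deg):
    """Factor of safety from the closed-form infinite-slope limit
    equilibrium: resisting shear strength over driving shear stress.
    c: cohesion (kPa), phi: internal friction angle (deg),
    gamma: volumetric weight (kN/m^3), h: slip depth (m, here the
    slope height as a simplification), beta: slope angle (deg)."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    tau_drive = gamma * h * math.sin(beta) * math.cos(beta)
    tau_resist = c + gamma * h * math.cos(beta) ** 2 * math.tan(phi)
    return tau_resist / tau_drive

# Cohesionless slope with phi == beta sits exactly at limit equilibrium
print(infinite_slope_fs(c=0.0, phi_deg=30.0, gamma=20.0, h=5.0, beta_deg=30.0))
```

Adding cohesion raises the factor of safety above 1, matching the role cohesion plays among the five control factors.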