Journal Articles
60 articles found
1. Prediction of Self-Care Behaviors in Patients Using High-Density Surface Electromyography Signals and an Improved Whale Optimization Algorithm-Based LSTM Model
Authors: Shuai Huang, Dan Liu, Youfa Fu, Jiadui Chen, Ling He, Jing Yan, Di Yang. Journal of Bionic Engineering, 2025, No. 4, pp. 1963-1984 (22 pages).
Stroke survivors often face significant challenges when performing daily self-care activities due to upper limb motor impairments. Traditional surface electromyography (sEMG) analysis typically focuses on isolated hand postures, overlooking the complexity of object-interactive behaviors that are crucial for promoting patient independence. This study introduces a novel framework that combines high-density sEMG (HD-sEMG) signals with an improved Whale Optimization Algorithm (IWOA)-optimized Long Short-Term Memory (LSTM) network to address this limitation. The key contributions of this work include: (1) the creation of a specialized HD-sEMG dataset that captures nine continuous self-care behaviors, along with time and posture markers, to better reflect real-world patient interactions; (2) the development of a multi-channel feature fusion module based on Pascal's theorem, which enables efficient signal segmentation and spatial-temporal feature extraction; and (3) the enhancement of the IWOA algorithm, which integrates optimal point set initialization, a diversity-driven pooling mechanism, and cosine-based differential evolution to optimize LSTM hyperparameters, thereby improving convergence and global search capabilities. Experimental results demonstrate superior performance, achieving 99.58% accuracy in self-care behavior recognition and 86.19% accuracy for 17 continuous gestures on the Ninapro DB2 benchmark. The framework operates with low latency, meeting the real-time requirements for assistive devices. By enabling precise, context-aware recognition of daily activities, this work advances personalized rehabilitation technologies, empowering stroke patients to regain autonomy in self-care tasks. The proposed methodology offers a robust, scalable solution for clinical applications, bridging the gap between laboratory-based gesture recognition and practical, patient-centered care.
Keywords: self-care behaviors; high-density surface electromyography (HD-sEMG); Long Short-Term Memory (LSTM) network; multi-channel feature fusion
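This entry, like most in the list, builds on the standard LSTM recurrence. As a reference point, here is a minimal NumPy sketch of a single-layer LSTM forward pass — an illustrative stand-in for the recurrence itself, not the IWOA-optimized model described in the paper; all function and variable names here are ours.

```python
import numpy as np

def lstm_forward(x_seq, W, U, b):
    """Run a standard LSTM over a sequence.

    x_seq: (T, input_dim) inputs (e.g., windowed sEMG features).
    W: (4*hidden, input_dim), U: (4*hidden, hidden), b: (4*hidden,).
    Gate order in the stacked matrices: input, forget, cell, output.
    """
    hidden = U.shape[1]
    h, c = np.zeros(hidden), np.zeros(hidden)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    for x in x_seq:
        z = W @ x + U @ h + b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)   # cell-state update
        h = o * np.tanh(c)           # hidden state / output
    return h, c

# toy usage: 10 timesteps of 8-dim features, 16 hidden units
rng = np.random.default_rng(0)
T, d, H = 10, 8, 16
W = rng.normal(scale=0.1, size=(4 * H, d))
U = rng.normal(scale=0.1, size=(4 * H, H))
b = np.zeros(4 * H)
h, c = lstm_forward(rng.normal(size=(T, d)), W, U, b)
print(h.shape)  # (16,)
```

The final hidden state `h` would typically feed a classification head; a real system would use a deep-learning framework rather than this hand-rolled loop.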
2. Load-measurement method for floating offshore wind turbines based on a long short-term memory (LSTM) neural network
Authors: Yonggang LIN, Xiangheng FENG, Hongwei LIU, Yong SUN. Journal of Zhejiang University-Science A (Applied Physics & Engineering), 2025, No. 5, pp. 456-470 (15 pages).
Complicated loads encountered by floating offshore wind turbines (FOWTs) in real sea conditions are crucial for future design optimization, but obtaining data on them directly poses a challenge. To address this issue, we applied machine learning techniques to obtain the hydrodynamic and aerodynamic loads of FOWTs by measuring platform motion responses and wave-elevation sequences. First, a computational fluid dynamics (CFD) simulation model of the floating platform was established based on the dynamic fluid body interaction technique and overset grid technology. Then, a long short-term memory (LSTM) neural network model was constructed and trained to learn the nonlinear relationship between the waves, platform-motion inputs, and hydrodynamic-load outputs. The optimal model was determined after analyzing the sensitivity of parameters such as sample characteristics, network layers, and neuron numbers. Subsequently, the effectiveness of the hydrodynamic load model was validated under different simulation conditions, and the aerodynamic load calculation was completed based on the D'Alembert principle. Finally, we built a hybrid-scale FOWT model, based on the software-in-the-loop strategy, in which the wind turbine was replaced by an actuation system. Model tests were carried out in a wave basin, and the results demonstrated that the root mean square errors of the hydrodynamic and aerodynamic load measurements were 4.20% and 10.68%, respectively.
Keywords: floating offshore wind turbine (FOWT); long short-term memory (LSTM) neural network; machine learning; load measurement; hybrid-scale model test
3. Anti-acceleration Saturation Terminal Guidance Law Based on LSTM
Authors: LI Guilin, ZHOU Wei, ZHANG Jiarui. Transactions of Nanjing University of Aeronautics and Astronautics, 2025, No. 4, pp. 541-553 (13 pages).
Missile acceleration saturation in a practical terminal guidance process may significantly reduce interception performance. To solve this problem, this paper presents an anti-saturation guidance law with finite-time convergence for three-dimensional maneuvering interception. The finite-time boundedness (FTB) theory and the input-output finite-time stability (IO-FTS) theory are used, as well as the long short-term memory (LSTM) network. A sufficient condition for the FTB and IO-FTS of a class of nonlinear systems is given. Then, an anti-acceleration-saturation missile terminal guidance law based on LSTM, named LSTM-ASGL, is designed. It can effectively suppress the effect of acceleration saturation so as to track a maneuvering target more accurately in a complex dynamic environment. The excellent performance of LSTM-ASGL in different maneuvering-target scenarios is verified by simulation. The simulation results show that the guidance law successfully prevents acceleration saturation and improves the tracking ability of the missile system against maneuvering targets. It is also shown that LSTM-ASGL has good generalization and anti-jamming performance, and consumes less energy than the anti-acceleration saturation terminal guidance law.
Keywords: long short-term memory network (LSTM); anti-acceleration saturation; terminal guidance law; finite-time boundedness (FTB); input-output finite-time stability (IO-FTS)
4. Nuclear atypia grading in breast cancer histopathological images based on CNN feature extraction and LSTM classification (Cited: 4)
Authors: Sanaz Karimi Jafarbigloo, Habibollah Danyali. CAAI Transactions on Intelligence Technology (EI), 2021, No. 4, pp. 426-439 (14 pages).
Early diagnosis of breast cancer, the most common disease among women around the world, increases the chance of treatment and is highly important. Nuclear atypia grading in histopathological images plays an important role in the final diagnosis and grading of breast cancer. Grading images by pathologists is a time-consuming and subjective task. Therefore, a computer-aided system for nuclear atypia grading is very useful and necessary. In this study, two automatic systems for grading nuclear atypia in breast cancer histopathological images based on deep learning methods are proposed. A patch-based approach is introduced due to the large size of the histopathological images and the restricted training data. In the proposed system I, the most important patches in the image are detected first, and then a three-hidden-layer convolutional neural network (CNN) is designed and trained for feature extraction and to classify the patches individually. The proposed system II is based on a combination of the CNN for feature extraction and a two-layer long short-term memory (LSTM) network for classification. The LSTM network is utilised to consider all patches of an image simultaneously for image grading. The simulation results show the efficiency of the proposed systems for automatic nuclear atypia grading; they outperform the current related studies in the literature.
Keywords: breast cancer; CNN; histopathological image; LSTM networks; nuclear atypia
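The patch-based approach in this entry starts by cutting each large histopathology image into fixed-size patches before CNN feature extraction. A generic sliding-window patch extractor (illustrative only; the function name and parameters are ours, not the authors') might look like:

```python
import numpy as np

def extract_patches(image, patch, stride):
    """Slide a patch x patch window over a 2-D image with the given
    stride and return the patches as (num_patches, patch, patch)."""
    h, w = image.shape
    patches = [image[r:r + patch, c:c + patch]
               for r in range(0, h - patch + 1, stride)
               for c in range(0, w - patch + 1, stride)]
    return np.stack(patches)

# stand-in for a 64x64 region of a histopathology slide
img = np.arange(64 * 64, dtype=float).reshape(64, 64)
p = extract_patches(img, patch=32, stride=16)
print(p.shape)  # (9, 32, 32)
```

Each patch would then be passed through the CNN individually (system I) or the per-patch features collected into a sequence for the LSTM (system II).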
5. Dynamic Hand Gesture Recognition Based on Short-Term Sampling Neural Networks (Cited: 14)
Authors: Wenjin Zhang, Jiacun Wang, Fangping Lan. IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2021, No. 1, pp. 110-120 (11 pages).
Hand gestures are a natural way for human-robot interaction. Vision-based dynamic hand gesture recognition has become a hot research topic due to its various applications. This paper presents a novel deep learning network for hand gesture recognition. The network integrates several well-proved modules to learn both short-term and long-term features from video inputs while avoiding intensive computation. To learn short-term features, each video input is segmented into a fixed number of frame groups. A frame is randomly selected from each group and represented as an RGB image as well as an optical-flow snapshot. These two entities are fused and fed into a convolutional neural network (ConvNet) for feature extraction. The ConvNets for all groups share parameters. To learn long-term features, the outputs from all ConvNets are fed into a long short-term memory (LSTM) network, by which a final classification result is predicted. The new model has been tested with two popular hand gesture datasets, namely the Jester dataset and the Nvidia dataset. Compared with other models, our model produced very competitive results. The robustness of the new model has also been proved with an augmented dataset with enhanced diversity of hand gestures.
Keywords: convolutional neural network (ConvNet); hand gesture recognition; long short-term memory (LSTM) network; short-term sampling; transfer learning
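The short-term sampling scheme described above segments each video into a fixed number of frame groups and draws one random frame per group, so videos of any length yield a fixed-size input. A minimal sketch of that sampling step (our naming, not the authors' code):

```python
import random

def short_term_sample(frames, num_groups, seed=None):
    """Split a frame list into num_groups contiguous groups and pick
    one random frame per group, giving a fixed-length sample
    regardless of video length."""
    rnd = random.Random(seed)
    n = len(frames)
    picked = []
    for g in range(num_groups):
        lo = g * n // num_groups        # group boundaries
        hi = (g + 1) * n // num_groups
        picked.append(frames[rnd.randrange(lo, hi)])
    return picked

frames = list(range(100))               # stand-in for 100 video frames
sampled = short_term_sample(frames, 8, seed=42)
print(len(sampled))  # 8
```

In the full pipeline each sampled frame (plus its optical-flow snapshot) would go through a shared-weight ConvNet before the LSTM.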
6. Self-feedback LSTM regression model for real-time particle source apportionment (Cited: 2)
Authors: Wei Wang, Weiman Xu, Shuai Deng, Yimeng Chai, Ruoyu Ma, Guoliang Shi, Bo Xu, Mei Li, Yue Li. Journal of Environmental Sciences (SCIE, EI, CAS, CSCD), 2022, No. 4, pp. 10-20 (11 pages).
Atmospheric particulate matter pollution has attracted much wider attention globally. In recent years, the development of atmospheric particle collection techniques has put forward new demands on real-time source apportionment techniques. These demands are summarized, in this paper, as how to set up new restraints in apportionment and how to develop a non-linear regression model to handle complicated circumstances, such as the existence of secondary sources and similar sources. In this study, we first analyze the possible and potential restraints in single-particle source apportionment, then propose a novel three-step self-feedback long short-term memory (SF-LSTM) network for approximating the source contribution. The proposed deep learning neural network includes three modules: generation; scoring and refining; and regeneration. Benefiting from the scoring module, SF-LSTM implants four loss functions representing four restraints to be followed in the apportionment; meanwhile, the regeneration module calculates the source contribution in a non-linear way. The results show that the model outperforms conventional regression methods in the overall performance of the four evaluation indicators (residual sum of squares, stability, sparsity, negativity) for the restraints. Additionally, in short time-resolution analysis, SF-LSTM provides better results under the restraint of stability.
Keywords: time series; regression; self-feedback LSTM network; particle source apportionment
7. Short-Term Prediction of Photovoltaic Power Based on DBSCAN-SVM Data Cleaning and PSO-LSTM Model (Cited: 3)
Authors: Yujin Liu, Zhenkai Zhang, Li Ma, Yan Jia, Weihua Yin, Zhifeng Liu. Energy Engineering (EI), 2024, No. 10, pp. 3019-3035 (17 pages).
Accurate short-term photovoltaic (PV) power prediction helps to improve the economic efficiency of power stations and is of great significance to the arrangement of grid scheduling plans. In order to further improve the accuracy of PV power prediction, this paper proposes a data cleaning method combining density clustering and support vector machines, and constructs a short-term PV power prediction model based on a particle swarm optimization (PSO)-optimized Long Short-Term Memory (LSTM) network. Firstly, the input features are determined using Pearson's correlation coefficient. The feature information is clustered using density-based spatial clustering of applications with noise (DBSCAN), and then the data in each cluster are cleaned using support vector machines (SVM). Secondly, PSO is used to optimize the hyperparameters of the LSTM network to obtain the optimal network structure. Finally, different power prediction models are established, and the PV power generation prediction results are obtained. The results show that the data cleaning methods used are effective, and that the PSO-LSTM power prediction model based on DBSCAN-SVM data cleaning outperforms existing typical methods, especially on non-sunny days; the model effectively improves the accuracy of short-term PV power prediction.
Keywords: photovoltaic power prediction; LSTM network; DBSCAN-SVM; PSO; deep learning
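The PSO step in this abstract searches LSTM hyperparameters for the lowest validation error. As a hedged illustration of that general idea (not the paper's code), a plain particle swarm optimizer can be sketched in NumPy, with a toy quadratic standing in for the LSTM's validation loss:

```python
import numpy as np

def pso_minimize(f, bounds, n_particles=20, iters=60,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain particle swarm optimization over box bounds.
    In the paper's setting f would be the LSTM validation error as a
    function of hyperparameters; here it is a toy objective."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    dim = lo.size
    x = rng.uniform(lo, hi, (n_particles, dim))   # positions
    v = np.zeros_like(x)                          # velocities
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()          # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([f(p) for p in x])
        better = val < pbest_val
        pbest[better], pbest_val[better] = x[better], val[better]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# toy stand-in: two "hyperparameters" with optimum at (3, 5)
best, best_val = pso_minimize(
    lambda p: (p[0] - 3) ** 2 + (p[1] - 5) ** 2,
    bounds=[(0, 10), (0, 10)])
print(best, best_val)
```

In practice `f` would train a small LSTM and return its validation loss, which is why PSO runs are costly and particle/iteration counts are kept modest.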
8. Massive Files Prefetching Model Based on LSTM Neural Network with Cache Transaction Strategy (Cited: 3)
Authors: Dongjie Zhu, Haiwen Du, Yundong Sun, Xiaofang Li, Rongning Qu, Hao Hu, Shuangshuang Dong, Helen Min Zhou, Ning Cao. Computers, Materials & Continua (SCIE, EI), 2020, No. 5, pp. 979-993 (15 pages).
In distributed storage systems, file access efficiency has an important impact on the real-time nature of information forensics. As a popular approach to improving file access efficiency, a prefetching model can fetch data before it is needed according to the file access pattern, which reduces I/O waiting time and increases system concurrency. However, a prefetching model needs to mine the degree of association between files to ensure prefetching accuracy. With massive numbers of small files, the sheer volume of files poses a challenge to the efficiency and accuracy of relevance mining. In this paper, we propose a massive-files prefetching model based on an LSTM neural network with a cache transaction strategy to improve file access efficiency. Firstly, we propose a file clustering algorithm based on temporal locality and spatial locality to reduce the computational complexity. Secondly, we define cache transactions according to file occurrences in the cache, instead of time-offset-distance-based methods, to extract file block features accurately. Lastly, we propose a file access prediction algorithm based on an LSTM neural network which predicts the files that have a high possibility of being accessed. Experiments show that, compared with the traditional LRU and plain grouping methods, the proposed model notably increases the cache hit rate and effectively reduces the I/O wait time.
Keywords: massive files prefetching model; cache transaction; distributed storage systems; LSTM neural network
9. Feedback LSTM Network Based on Attention for Image Description Generator (Cited: 2)
Authors: Zhaowei Qu, Bingyu Cao, Xiaoru Wang, Fu Li, Peirong Xu, Luhan Zhang. Computers, Materials & Continua (SCIE, EI), 2019, No. 5, pp. 575-589 (15 pages).
Images are complex multimedia data which contain rich semantic information. Most current image description generation algorithms only generate a plain description, lacking the distinction between primary and secondary objects, leading to insufficient high-level semantics and accuracy under public evaluation criteria. The major issue is the lack of an effective network for high-level semantic sentence generation that contains a detailed description of the motion and state of the principal object. To address this issue, this paper proposes the Attention-based Feedback Long Short-Term Memory Network (AFLN). Based on the existing codec framework, there are two independent subtasks in our method: an attention-based feedback LSTM network during decoding, and the Convolutional Block Attention Module (CBAM) in the coding phase. First, we propose an attention-based network to feed back the features corresponding to the word generated by the previous LSTM decoding unit. We implement feedback guidance through the related field mapping algorithm, which quantifies the correlation between the previous word and the latter word, so that the main object can be tracked with a highlighted detailed description. Second, we exploit the attention idea and apply a lightweight and general module called CBAM after the last layer of the VGG-16 pretrained network, which can enhance the expression of image coding features by combining channel- and spatial-dimension attention maps with negligible overhead. Extensive experiments on the COCO dataset validate the superiority of our network over state-of-the-art algorithms. Both scores and actual effects are improved: the BLEU-4 score increases from 0.291 to 0.301, while the CIDEr score rises from 0.912 to 0.952.
Keywords: image description generator; feedback LSTM network; attention; CBAM
10. Short-time prediction for traffic flow based on wavelet de-noising and LSTM model (Cited: 3)
Authors: WANG Qingrong, LI Tongwei, ZHU Changfeng. Journal of Measurement Science and Instrumentation (CAS, CSCD), 2021, No. 2, pp. 195-207 (13 pages).
Aiming at the problems that some existing traffic flow prediction models cover only a single road segment and that model input data are not pre-processed, a heuristic threshold algorithm is used to de-noise the original traffic flow data after wavelet decomposition. The correlation coefficients of road traffic flow data are calculated and a data compression matrix of road traffic flow is constructed. Data de-noising minimizes the interference of noise with the model, while the correlation analysis of road network data enables prediction at the road network level. Utilizing the advantages of the long short-term memory (LSTM) network in time-series data processing, the compression matrix is input into the constructed LSTM model for short-term traffic flow prediction. The LSTM-1 and LSTM-2 models were trained on de-noised data and original data, respectively. Through simulation experiments, different prediction times were set, and the prediction results of the proposed model were compared with those of other methods. It is found that the accuracy of the proposed LSTM-2 model increases by 10.278% on average compared with other prediction methods, and the prediction accuracy reaches 95.58%, which shows that the proposed short-term traffic flow prediction method is efficient.
Keywords: short-term traffic flow prediction; deep learning; wavelet de-noising; network matrix compression; long short-term memory (LSTM) network
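The heuristic-threshold wavelet de-noising this entry describes can be illustrated, under simplifying assumptions, with a one-level Haar transform and soft thresholding at the universal threshold — a generic sketch of the technique family, not the authors' pipeline (their decomposition depth and threshold rule may differ):

```python
import numpy as np

def haar_denoise(signal, k=1.0):
    """One-level Haar wavelet shrinkage: transform, soft-threshold the
    detail coefficients at the universal threshold, invert."""
    x = np.asarray(signal, float)
    if len(x) % 2:
        x = x[:-1]                               # even length for pairing
    a = (x[0::2] + x[1::2]) / np.sqrt(2)         # approximation coeffs
    d = (x[0::2] - x[1::2]) / np.sqrt(2)         # detail coeffs
    sigma = np.median(np.abs(d)) / 0.6745        # robust noise estimate
    t = k * sigma * np.sqrt(2 * np.log(len(x)))  # universal threshold
    d = np.sign(d) * np.maximum(np.abs(d) - t, 0)  # soft thresholding
    out = np.empty_like(x)                       # inverse Haar transform
    out[0::2] = (a + d) / np.sqrt(2)
    out[1::2] = (a - d) / np.sqrt(2)
    return out

rng = np.random.default_rng(1)
clean = np.sin(np.linspace(0, 6 * np.pi, 256))   # stand-in for traffic flow
noisy = clean + rng.normal(scale=0.3, size=256)
den = haar_denoise(noisy)
# shrinkage should move the series closer to the clean signal
print(np.mean((den - clean) ** 2) < np.mean((noisy - clean) ** 2))
```

A production version would use a wavelet library such as PyWavelets with a deeper decomposition; the structure (decompose, threshold details, reconstruct) is the same.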
11. Behavior recognition based on the fusion of 3D-BN-VGG and LSTM network (Cited: 4)
Authors: Wu Jin, Min Yu, Shi Qianwen, Zhang Weihua, Zhao Bo. High Technology Letters (EI, CAS), 2020, No. 4, pp. 372-382 (11 pages).
In order to effectively solve the problems of low accuracy, large amounts of computation, and complex logic of deep learning algorithms in behavior recognition, a behavior recognition method based on the fusion of a 3-dimensional batch-normalization visual geometry group (3D-BN-VGG) network and a long short-term memory (LSTM) network is designed. In this network, 3D convolutional layers extract the spatial-domain and time-domain features of the video sequence at the same time, and multiple small convolution kernels are stacked to replace large convolution kernels, deepening the neural network while reducing the number of network parameters. In addition, the latest batch normalization algorithm is added to the 3-dimensional convolutional network to improve training speed. Then the output of the full connection layer is sent to the LSTM network as feature vectors to extract sequence information. This method, which directly uses the output of the whole base level without passing through the full connection layer, reduces the parameters of the whole fusion network to 15,324,485, nearly twice those of 3D-BN-VGG. Finally, the proposed network achieves 96.5% and 74.9% accuracy on UCF-101 and HMDB-51 respectively, and the algorithm runs at 1066 fps with an acceleration ratio of 1, giving it a significant advantage in speed.
Keywords: behavior recognition; deep learning; 3-dimensional batch normalization visual geometry group (3D-BN-VGG); long short-term memory (LSTM) network
12. Sensitivity analysis of regional rainfall-induced landslide based on UAV photogrammetry and LSTM neural network (Cited: 1)
Authors: ZHAO Lian-heng, XU Xin, LYU Guo-shun, HUANG Dong-liang, LIU Min, CHEN Qi-min. Journal of Mountain Science (SCIE, CSCD), 2023, No. 11, pp. 3312-3326 (15 pages).
Rainfall stands out as a critical trigger for landslides, particularly given the intense summer rainfall experienced in Zheduotang, a transitional zone from the southwestern edge of the Sichuan Basin to the Qinghai-Tibet Plateau. This area is characterized by adverse geological conditions such as rock piles, debris slopes, and unstable slopes. Furthermore, due to the absence of historical rainfall records and landslide inventories, empirical methods are not applicable for the analysis of rainfall-induced landslides. Thus we employ a physically based landslide susceptibility analysis model, using high-precision unmanned aerial vehicle (UAV) photogrammetry, field boreholes, and a long short-term memory (LSTM) neural network to obtain regional topography, soil properties, and rainfall parameters. We applied the Transient Rainfall Infiltration and Grid-Based Regional Slope-Stability (TRIGRS) model to simulate the distribution of shallow landslides and variations in pore-water pressure across the region under different rainfall intensities and three rainfall patterns (advanced, uniform, and delayed). The landslides caused by the advanced rainfall pattern mostly occurred in the first 12 hours, while those caused by the delayed rainfall pattern mostly occurred in the last 12 hours. However, all three rainfall patterns yielded landslide susceptibility zones categorized as high (1.16%), medium (8.06%), and low (90.78%). Furthermore, the total precipitation at a rainfall intensity of 35 mm/h for 1 hour was less than that at 1.775 mm/h for 24 hours, but the areas of high and medium susceptibility increased by 3.1%. This study combines UAV photogrammetry and LSTM neural networks to obtain more accurate input data for the TRIGRS model, offering an effective approach for predicting rainfall-induced shallow landslides in regions lacking historical rainfall records and landslide inventories.
Keywords: regional landslide; TRIGRS; UAV photogrammetry; rainfall-induced landslide; LSTM neural network
13. A Regularized LSTM Method for Predicting Remaining Useful Life of Rolling Bearings (Cited: 6)
Authors: Zhao-Hua Liu, Xu-Dong Meng, Hua-Liang Wei, Liang Chen, Bi-Liang Lu, Zhen-Heng Wang, Lei Chen. International Journal of Automation and Computing (EI, CSCD), 2021, No. 4, pp. 581-593 (13 pages).
Rotating machinery is important to industrial production. Any failure of rotating machinery, especially the failure of rolling bearings, can lead to equipment shutdown and even more serious incidents. Therefore, accurate residual life prediction plays a crucial role in guaranteeing machine operation safety and reliability and reducing maintenance cost. In order to increase the forecasting precision of the remaining useful life (RUL) of rolling bearings, an advanced approach combining the elastic net with a long short-term memory (LSTM) network is proposed; the new approach is referred to as E-LSTM. The E-LSTM algorithm consists of an elastic net and an LSTM, taking temporal-spatial correlation into consideration to forecast the RUL through the LSTM. To solve the over-fitting problem of the LSTM neural network during the training process, an elastic-net-based regularization term is introduced into the LSTM structure. In this way, the change of the output can be well characterized to express the bearing degradation mode. Experimental results on real-world data demonstrate that the proposed E-LSTM method can obtain higher stability and relevant values that are useful for bearing RUL forecasting. Furthermore, these results also indicate that E-LSTM can achieve better performance.
Keywords: deep learning; fault diagnosis; fault prognosis; long short-term memory network (LSTM); rolling bearing; rotating machinery; regularization; remaining useful life (RUL) prediction; recurrent neural network (RNN)
14. Prediction of surface subsidence in Changchun City based on LSTM network (Cited: 1)
Authors: WANG He, WU Qiong. Global Geology, 2022, No. 2, pp. 109-115 (7 pages).
Monitoring and predicting urban surface subsidence are important for urban disaster prevention and mitigation. In this paper, the Long Short-Term Memory (LSTM) network was used to predict the surface subsidence process of Changchun City from 2018 to 2020 based on PS-InSAR monitoring data. The results show that the prediction error of 57.89% of the PS points in the LSTM network was less than 1 mm, with an average error of 1.8 mm and a standard deviation of 2.8 mm. The accuracy and reliability of the prediction were better than those of regression analysis, time series analysis, and the grey model.
Keywords: LSTM neural network; surface subsidence; PS-InSAR
15. Sensory Data Prediction Using Spatiotemporal Correlation and LSTM Recurrent Neural Network (Cited: 4)
Author: Tongxin SHU. Instrumentation, 2019, No. 3, pp. 10-17 (8 pages).
Wireless Sensor Networks (WSNs) are widely utilized in various industrial and environmental monitoring applications. The process of data gathering within a WSN is significant in terms of reporting the environmental data. However, certain sensor nodes may malfunction due to energy draining out or unexpected damage, so the collected data may become inaccurate or incomplete. Focusing on the spatiotemporal correlation among sensor nodes, this paper proposes a novel algorithm to predict the value of missing or inaccurate data and to predict future data in replacement of certain nonfunctional sensor nodes. The Long Short-Term Memory Recurrent Neural Network (LSTM RNN) helps to more accurately derive the time-series data corresponding to the sets of past collected data, making the prediction results more reliable. The simulation results show that the proposed algorithm provides outstanding data-gathering efficiency while ensuring data accuracy.
Keywords: spatiotemporal correlation; LSTM recurrent neural network; time-series prediction
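Before an LSTM can predict missing sensor readings, the time series must be cut into fixed-length input windows with a prediction target. A minimal, generic windowing helper (our own naming, not from the paper) shows the step:

```python
import numpy as np

def make_windows(series, window, horizon=1):
    """Turn a 1-D sensor series into (X, y) pairs: each sample is
    `window` past readings and the target is the reading `horizon`
    steps ahead."""
    X, y = [], []
    for i in range(len(series) - window - horizon + 1):
        X.append(series[i:i + window])
        y.append(series[i + window + horizon - 1])
    return np.array(X), np.array(y)

series = np.arange(20, dtype=float)   # stand-in for one node's readings
X, y = make_windows(series, window=5)
print(X.shape, y.shape)  # (15, 5) (15,)
```

Each row of `X` would be one LSTM input sequence; exploiting spatiotemporal correlation, readings from neighbouring nodes could be stacked as extra input features per timestep.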
16. Deep Learning-Based Stock Price Prediction Using LSTM Model (Cited: 1)
Authors: Jiayi Mao, Zhiyong Wang. Proceedings of Business and Economic Studies, 2024, No. 5, pp. 176-185 (10 pages).
The stock market is a vital component of the broader financial system, with its dynamics closely linked to economic growth. The challenges associated with analyzing and forecasting stock prices have persisted since the inception of financial markets. By examining historical transaction data, latent opportunities for profit can be uncovered, providing valuable insights for both institutional and individual investors to make more informed decisions. This study focuses on analyzing historical transaction data from four banks to predict closing-price trends. Various models, including decision trees, random forests, and Long Short-Term Memory (LSTM) networks, are employed to forecast stock price movements. Historical stock transaction data serve as the input for training these models, which are then used to predict upward or downward stock price trends. The study's empirical results indicate that these methods are effective to a degree in predicting stock price movements. The LSTM-based deep neural network model, in particular, demonstrates a commendable level of predictive accuracy. This conclusion is reached following a thorough evaluation of model performance, highlighting the potential of LSTM models in stock market forecasting. The findings offer significant implications for advancing financial forecasting approaches, thereby improving the decision-making capabilities of investors and financial institutions.
Keywords: autoregressive integrated moving average (ARIMA) model; Long Short-Term Memory (LSTM) network; forecasting; stock market
17. LSTM Based Neural Network Model for Anomaly Event Detection in Care-Independent Smart Homes
Authors: Brij B. Gupta, Akshat Gaurav, Razaz Waheeb Attar, Varsha Arya, Ahmed Alhomoud, Kwok Tai Chui. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, No. 9, pp. 2689-2706 (18 pages).
This study introduces a long short-term memory (LSTM)-based neural network model developed for detecting anomaly events in care-independent smart homes, focusing on the critical application of elderly fall detection. It balances the dataset using the Synthetic Minority Over-sampling Technique (SMOTE), effectively neutralizing bias to address the challenge of unbalanced datasets prevalent in time-series classification tasks. The proposed LSTM model is trained on the enriched dataset, capturing the temporal dependencies essential for anomaly recognition. The model demonstrated a significant improvement in anomaly detection, with an accuracy of 84%. The results, detailed in comprehensive classification and confusion matrices, showed the model's proficiency in distinguishing between normal activities and falls. This study contributes to the advancement of smart home safety, presenting a robust framework for real-time anomaly monitoring.
Keywords: LSTM neural networks; anomaly detection; smart home; healthcare; elderly fall prevention
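The SMOTE balancing step described above synthesizes minority-class samples by interpolating between a minority point and one of its nearest neighbours. A minimal numpy sketch of that idea (the paper presumably uses the standard SMOTE implementation; the toy "fall" samples and k=3 below are illustrative only):

```python
import numpy as np

def smote_oversample(X_min, n_new, k=3, seed=0):
    """SMOTE-style sketch: create n_new synthetic minority samples by linear
    interpolation between each chosen point and one of its k nearest
    neighbours within the minority class."""
    rng = np.random.default_rng(seed)
    synth = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        neighbours = np.argsort(d)[1:k + 1]   # skip the point itself
        j = rng.choice(neighbours)
        lam = rng.random()                    # interpolation factor in [0, 1)
        synth.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.vstack(synth)

# Imbalanced toy set: 4 "fall" samples vs. a majority class of 100,
# so 96 synthetic falls restore a 1:1 ratio.
falls = np.array([[1.0, 2.0], [1.2, 1.9], [0.9, 2.1], [1.1, 2.2]])
new_falls = smote_oversample(falls, n_new=96)
print(new_falls.shape)  # (96, 2)
```

Because every synthetic point lies on a segment between two real falls, the new samples stay inside the minority class's region instead of duplicating points verbatim.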
Track correlation algorithm based on CNN-LSTM for swarm targets
18
Authors: CHEN Jinyang, WANG Xuhua, CHEN Xian. Journal of Systems Engineering and Electronics (SCIE, CSCD), 2024, No. 2, pp. 417-429 (13 pages)
The rapid development of the unmanned aerial vehicle (UAV) swarm, a new type of aerial threat target, has brought great pressure to the air defense early warning system. At present, most track correlation algorithms use only part of the target's location, speed, and other information for correlation. In this paper, the artificial neural network method is used to establish a corresponding intelligent track correlation model and method according to the characteristics of swarm targets. Specifically, a track correlation method based on convolutional neural network (CNN) and long short-term memory (LSTM) neural networks is designed. In this model, the CNN is used to extract the formation characteristics of the UAV swarm and the spatial position characteristics of each single-UAV track in the formation, while the LSTM is used to extract the temporal characteristics of the UAV swarm. Experimental results show that, compared with traditional algorithms, the CNN-LSTM-based algorithm can make full use of multiple feature information of the target and has better robustness and accuracy for swarm targets.
Keywords: track correlation; correlation accuracy rate; swarm target; convolutional neural network (CNN); long short-term memory (LSTM) neural network
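The temporal half of such a CNN-LSTM is a recurrent cell stepped along the track. As a reference for what the LSTM contributes, here is a single LSTM cell update written out in numpy; the 6 input features per time step and hidden size 4 are arbitrary stand-ins, not the paper's architecture.

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell update. x: input at time t; h, c: previous hidden and
    cell states. W, U, b stack the input/forget/candidate/output gates."""
    z = W @ x + U @ h + b                  # shape (4 * hidden,)
    H = len(h)
    i = 1 / (1 + np.exp(-z[:H]))           # input gate
    f = 1 / (1 + np.exp(-z[H:2 * H]))      # forget gate
    g = np.tanh(z[2 * H:3 * H])            # candidate cell state
    o = 1 / (1 + np.exp(-z[3 * H:]))       # output gate
    c_new = f * c + i * g                  # forget old memory, admit new
    h_new = o * np.tanh(c_new)             # expose a gated view of the cell
    return h_new, c_new

rng = np.random.default_rng(0)
n_in, n_hid = 6, 4                         # e.g. 6 kinematic features per step
W = rng.normal(0, 0.1, (4 * n_hid, n_in))
U = rng.normal(0, 0.1, (4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)

h = c = np.zeros(n_hid)
for x in rng.normal(0, 1, (10, n_in)):     # a 10-step track segment
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)  # (4,)
```

The additive cell update `f * c + i * g` is what lets the network carry swarm-motion context across many time steps without the gradient vanishing.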
Deep Learning-Based Decision Support System for Predicting Pregnancy Risk Levels through Cardiotocograph(CTG)Imaging Analysis
19
Authors: Ali Hasan Dakheel, Mohammed Raheem Mohammed, Zainab Ali Abd Alhuseen, Wassan Adnan Hashim. Intelligent Automation & Soft Computing, 2025, No. 1, pp. 195-220 (26 pages)
The prediction of pregnancy-related hazards must be accurate and timely to safeguard maternal and fetal health. This study aims to enhance risk prediction in pregnancy with a novel deep learning model based on a Long Short-Term Memory (LSTM) generator, designed to capture temporal relationships in cardiotocography (CTG) data. The methodology integrates CTG signals with demographic characteristics and utilizes preprocessing techniques such as noise reduction, normalization, and segmentation to create high-quality input for the model. It uses convolutional layers to extract spatial information, followed by LSTM layers to model sequences for superior predictive performance. The overall results show that the model is robust, with an accuracy of 91.5%, precision of 89.8%, recall of 90.4%, and an F1-score of 90.1%, outperforming the corresponding baseline models, a CNN (Convolutional Neural Network) and a traditional RNN (Recurrent Neural Network), by 2.3% and 6.1%, respectively. Moreover, the ability to detect pregnancy-related abnormalities has considerable therapeutic potential, opening the possibility of focused treatments and individualized maternal healthcare approaches, the research team concluded.
Keywords: pregnancy risk prediction; cardiotocography data analysis; deep learning approach; LSTM network; maternal-fetal healthcare; predictive modeling
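The preprocessing pipeline the abstract names (noise reduction, normalization, segmentation) can be sketched on a synthetic fetal-heart-rate trace. The 4 Hz sampling rate, 5-sample smoothing kernel, and 60 s windows below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def preprocess_ctg(fhr, fs=4, win_s=60):
    """Sketch of CTG preprocessing: moving-average noise reduction,
    z-score normalization, and fixed-length segmentation into the
    windows a Conv+LSTM model would consume."""
    # 1) Noise reduction: 5-sample moving average.
    kernel = np.ones(5) / 5
    smooth = np.convolve(fhr, kernel, mode="same")
    # 2) Normalization: zero mean, unit variance.
    norm = (smooth - smooth.mean()) / (smooth.std() + 1e-8)
    # 3) Segmentation into non-overlapping fs * win_s sample windows.
    n = fs * win_s
    n_seg = len(norm) // n
    return norm[:n_seg * n].reshape(n_seg, n)

rng = np.random.default_rng(0)
# 10 minutes of a synthetic FHR trace around 140 bpm with noise.
fhr = 140 + 10 * np.sin(np.arange(2400) / 50) + rng.normal(0, 2, 2400)
segments = preprocess_ctg(fhr)
print(segments.shape)  # (10, 240)
```

Each row is then one sequence example, so the convolutional layers see local waveform shape while the LSTM layers see the ordering of windows over time.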
Anticipating Lag Synchronization Based on Machine Learning
20
Authors: WU Yongqing, BAO Xingxing. 数学理论与应用 (Mathematical Theory and Applications), 2025, No. 1, pp. 115-126 (12 pages)
This paper proposes a comprehensive data-driven prediction framework based on machine learning methods to investigate the lag synchronization phenomenon in coupled chaotic systems, particularly in cases where accurate mathematical models are challenging to establish or where the system equations remain unknown. A Long Short-Term Memory (LSTM) neural network is trained using time series acquired from the desynchronized system states, and subsequently predicts the lag synchronization transition. In the experiments, we focus on the Lorenz system with time-varying delayed coupling, studying the effects of the coupling coefficients and time delays on lag synchronization, respectively. The results indicate that, with appropriate training, the machine learning model can adeptly predict the occurrence of lag synchronization and its transition. This study not only enhances our comprehension of complex network synchronization behaviors but also underscores the potential and practical applications of machine learning in exploring nonlinear dynamic systems.
Keywords: coupled chaotic system; LSTM neural network; anticipating synchronization; lag synchronization
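Lag synchronization means one system's trajectory tracks the other's with a fixed delay, y(t) ≈ x(t − τ). A model-free way to check for it from time series alone, in the same data-driven spirit as the paper (which trains an LSTM rather than scanning correlations), is to find the delay that maximizes the cross-correlation:

```python
import numpy as np

def estimate_lag(x, y, max_lag):
    """Return the delay tau in [-max_lag, max_lag] maximizing the Pearson
    correlation between x(t - tau) and y(t): the estimated synchronization lag."""
    best_tau, best_r = 0, -np.inf
    for tau in range(-max_lag, max_lag + 1):
        if tau > 0:
            r = np.corrcoef(x[:-tau], y[tau:])[0, 1]
        elif tau < 0:
            r = np.corrcoef(x[-tau:], y[:tau])[0, 1]
        else:
            r = np.corrcoef(x, y)[0, 1]
        if r > best_r:
            best_tau, best_r = tau, r
    return best_tau

# Two signals where y lags x by exactly 7 samples (circular shift is a
# harmless approximation for this sketch).
t = np.linspace(0, 20, 1000)
x = np.sin(t) + 0.5 * np.sin(3 * t)
y = np.roll(x, 7)  # y[i] = x[i - 7]
print(estimate_lag(x, y, max_lag=20))  # 7
```

On real coupled Lorenz data the peak correlation stays below 1, and watching how the peak sharpens as the coupling coefficient grows is one way to visualize the synchronization transition the LSTM is trained to anticipate.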