Funding: supported by the National Key Research and Development Program for Young Scientists of China (No. 2022YFC3704000), the National Natural Science Foundation of China (No. 42275122), and the National Key Scientific and Technological Infrastructure project "Earth System Science Numerical Simulator Facility" (EarthLab).
Abstract: PM_(2.5) constitutes a complex and diverse mixture that significantly impacts the environment, human health, and climate change. However, existing observation and numerical simulation techniques have limitations, such as a lack of data, high acquisition costs, and multiple uncertainties. These limitations hinder the acquisition of comprehensive information on PM_(2.5) chemical composition and the effective implementation of refined air pollution prevention and control strategies. In this study, we developed an optimal deep learning model to acquire hourly mass concentrations of key PM_(2.5) chemical components without complex chemical analysis. The model was trained using a randomly partitioned multivariate dataset arranged in chronological order, including atmospheric state indicators, which previous studies did not consider. Our results showed that the correlation coefficients of key chemical components were no less than 0.96, and the root mean square errors ranged from 0.20 to 2.11 μg/m^(3) for the entire process (training and testing combined). The model accurately captured the temporal characteristics of key chemical components, outperforming typical machine-learning models, previous studies, and global reanalysis datasets such as the Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2) and the Copernicus Atmosphere Monitoring Service ReAnalysis (CAMSRA). We also quantified feature importance using a random forest model, which showed that PM_(2.5), PM_(1), visibility, and temperature were the most influential variables for key chemical components. In conclusion, this study presents a practical approach to accurately obtain chemical composition information that can contribute to filling data gaps and improving air pollution monitoring and source identification. This approach has the potential to enhance air pollution control strategies and promote public health and environmental sustainability.
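As an illustration of the feature-importance step mentioned above, the minimal sketch below ranks candidate predictors for one PM_(2.5) component with scikit-learn's random forest. The file name, column names, and the choice of sulfate as the target are hypothetical placeholders, not the study's actual dataset schema.

```python
# Sketch: rank predictor importance for one PM2.5 component with a random forest.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

df = pd.read_csv("hourly_observations.csv")          # hypothetical file name
features = ["PM2.5", "PM1", "visibility", "temperature",
            "relative_humidity", "wind_speed"]        # assumed predictor columns
target = "sulfate"                                    # one assumed key component

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df[target], test_size=0.2, shuffle=True, random_state=0)

rf = RandomForestRegressor(n_estimators=500, random_state=0)
rf.fit(X_train, y_train)

# Impurity-based importances, printed from most to least influential.
for name, score in sorted(zip(features, rf.feature_importances_),
                          key=lambda t: t[1], reverse=True):
    print(f"{name}: {score:.3f}")
```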
Abstract: This study explores the application of Bayesian analysis based on neural networks and deep learning to data visualization. The background is that, as data volume and complexity increase, traditional data analysis methods can no longer meet practical needs. The research methods include building neural network and deep learning models, optimizing and improving them through Bayesian analysis, and applying them to the visualization of large-scale datasets. The results show that combining neural networks with Bayesian analysis and deep learning can effectively improve the accuracy and efficiency of data visualization and enhance the intuitiveness and depth of data interpretation. The significance of the research is that it provides a new solution for data visualization in the big data environment and helps further promote the development and application of data science.
Funding: supported by the National Natural Science Foundation of China (Nos. 61503014 and 61573043).
Abstract: The reliability of the Engine Electronic Controller (EEC), which has a critical impact on aircraft engine safety, attracts increasing attention. Reliability assessment is an important part of the design phase. However, the complex composition of the EEC and its character as a Phased-Mission System (PMS) make the assessment difficult. This paper puts forward an advanced approach that considers complex products and uncertain mission profiles to evaluate the Mean Time Between Failures (MTBF) in the design phase. The failure mechanisms of complex components are deduced by a Bayesian Deep Learning (BDL) intelligent algorithm, and copious reliability simulation samples are computed with cloud computing technology. Based on the results of BDL and cloud computing, simulations are conducted with Physics of Failure (PoF) theory and the Failure Behavior Model (FBM). This reliability assessment approach can evaluate the MTBF of electronic products without reference to physical tests. Finally, an EEC is used to verify the effectiveness and accuracy of the method.
Abstract: In recent times, web intelligence (WI), which applies Artificial Intelligence (AI) and advanced information technologies to the Web and the Internet, has become a hot research topic. Users post reviews on social media, and these reviews are employed for sentiment analysis (SA), which acts as feedback to businesses and government. Proper SA of the reviews helps to enhance the quality of services and products; however, web intelligence techniques are needed to raise company profit and user fulfillment. With this motivation, this article introduces a new modified pigeon-inspired optimization based feature selection (MPIO-FS) with Bayesian deep learning (BDL), named the MPIO-BDL model, for SA in WI applications. The presented MPIO-BDL model first performs preprocessing and feature extraction using the Term Frequency-Inverse Document Frequency (TF-IDF) technique to derive a useful set of information from the user reviews. Besides, the MPIO-FS model is applied to select optimal feature subsets, which helps to enhance classification accuracy and reduce computational complexity. Moreover, the BDL model is employed to assign the proper class labels to the applied user review data. A comprehensive analysis of the experimental results highlighted the improved classification efficiency of the presented model.
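A minimal sketch of the TF-IDF feature-extraction step is shown below, assuming scikit-learn's TfidfVectorizer as a stand-in; the review texts are toy placeholders rather than the paper's corpus.

```python
# Sketch: turn raw review texts into TF-IDF feature vectors.
from sklearn.feature_extraction.text import TfidfVectorizer

reviews = [
    "The delivery was fast and the product quality is excellent",
    "Terrible customer service, I will not order again",
    "Average experience, nothing special but nothing wrong either",
]

vectorizer = TfidfVectorizer(lowercase=True, stop_words="english",
                             ngram_range=(1, 2), max_features=5000)
X = vectorizer.fit_transform(reviews)            # sparse document-term matrix

print(X.shape)                                   # (n_reviews, n_terms)
print(vectorizer.get_feature_names_out()[:10])   # first few extracted terms
```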
Funding: supported by the MSIT (Ministry of Science and ICT), Republic of Korea, under the ITRC (Information Technology Research Center) Support Program (IITP-2024-RS-2022-00156354) supervised by the IITP (Institute for Information & Communications Technology Planning & Evaluation), and by the Technology Development Program (RS-2023-00264489) funded by the Ministry of SMEs and Startups (MSS, Republic of Korea).
Abstract: Fire can cause significant damage to the environment, economy, and human lives. If fire can be detected early, the damage can be minimized. Advances in technology, particularly in computer vision powered by deep learning, have enabled automated fire detection in images and videos. Several deep learning models have been developed for object detection, including applications in fire and smoke detection. This study focuses on optimizing the training hyperparameters of YOLOv8 and YOLOv10 models using Bayesian Tuning (BT). Experimental results on the large-scale D-Fire dataset demonstrate that this approach enhances detection performance. Specifically, the proposed approach improves the mean average precision at an Intersection over Union (IoU) threshold of 0.5 (mAP50) of the YOLOv8s, YOLOv10s, YOLOv8l, and YOLOv10l models by 0.26, 0.21, 0.84, and 0.63, respectively, compared to models trained with the default hyperparameters. The performance gains are more pronounced in the larger models, YOLOv8l and YOLOv10l, than in their smaller counterparts, YOLOv8s and YOLOv10s. Furthermore, YOLOv8 models consistently outperform YOLOv10, with mAP50 improvements of 0.26 for YOLOv8s over YOLOv10s and 0.65 for YOLOv8l over YOLOv10l when trained with BT. These results establish YOLOv8 as the preferred model for fire detection applications where detection performance is prioritized.
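The sketch below shows one plausible shape of such a hyperparameter tuning loop, using Optuna's default TPE sampler as a stand-in for the paper's Bayesian Tuning procedure. The train_and_eval helper is hypothetical: in practice it would train a YOLO model with the sampled hyperparameters and return its validation mAP50; a toy surrogate is returned here so the sketch runs end to end.

```python
# Sketch: Bayesian-style search over detector training hyperparameters.
import optuna

def train_and_eval(lr0: float, momentum: float, weight_decay: float) -> float:
    """Hypothetical stand-in for training a detector and returning val mAP50.
    A smooth surrogate is used here only so the example is runnable."""
    return -((lr0 - 0.01) ** 2) - (momentum - 0.95) ** 2 - weight_decay

def objective(trial: optuna.Trial) -> float:
    lr0 = trial.suggest_float("lr0", 1e-4, 1e-1, log=True)
    momentum = trial.suggest_float("momentum", 0.8, 0.99)
    weight_decay = trial.suggest_float("weight_decay", 1e-5, 1e-2, log=True)
    return train_and_eval(lr0, momentum, weight_decay)

study = optuna.create_study(direction="maximize")   # maximize mAP50
study.optimize(objective, n_trials=30)
print(study.best_params, study.best_value)
```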
Funding: This research was supported by the Universiti Malaya Impact-oriented Interdisciplinary Research Grant Programme (IIRG)-IIRG002C-19HWB, the Universiti Malaya Covid-19 Related Special Research Grant (UMCSRG) CSRG008-2020ST, and the Partnership Grant (RK012-2019) from University of Malaya.
Abstract: Abnormalities of the gastrointestinal tract are widespread worldwide today. Generally, an effective way to diagnose these life-threatening diseases is endoscopy, which produces a vast number of images. However, the main challenge in this area is that the process is time-consuming and fatiguing for a gastroenterologist who must examine every image in the set. This has led to a rise in studies on designing AI-based systems to assist physicians in the diagnosis. In several medical imaging tasks, deep learning methods, especially convolutional neural networks (CNNs), have contributed to state-of-the-art outcomes, because the complicated nonlinear relation between target classes and data can be learned rather than limited to hand-crafted features. On the other hand, hyperparameters are commonly set manually, which may take a long time and leaves the risk of non-optimal hyperparameters for classification. An effective tool for tuning the optimal hyperparameters of a deep CNN is Bayesian optimization. However, due to the complexity of the CNN, the network can be regarded as a black-box model whose internal information is hard to interpret. Hence, Explainable Artificial Intelligence (XAI) techniques are applied to overcome this issue by interpreting the decisions of the CNNs in a way physicians can trust. To play an essential role in real-time medical diagnosis, CNN-based models need to be accurate and interpretable, while the uncertainty must be handled. Therefore, a novel method comprising three phases is proposed to classify these life-threatening diseases. First, hyperparameter tuning is performed using Bayesian optimization for two state-of-the-art deep CNNs, and then Darknet53 and InceptionV3 features are extracted from these fine-tuned models. Second, XAI techniques are used to interpret which parts of the images the CNNs use for feature extraction. Finally, the features are fused, and uncertainties are handled by selecting entropy-based features. The experimental results show that the proposed method outperforms existing methods by achieving an accuracy of 97% with a Bayesian-optimized Support Vector Machine classifier.
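A hedged sketch of the final fusion and entropy-based selection phase follows: feature matrices from the two backbones are concatenated, and the lowest-entropy columns are kept. The feature matrices here are random placeholders, and the "keep the lowest-entropy features" rule is an assumption about the paper's selection criterion rather than its exact procedure.

```python
# Sketch: fuse two deep feature sets and keep features by Shannon entropy.
import numpy as np
from scipy.stats import entropy

rng = np.random.default_rng(0)
darknet_feats = rng.normal(size=(200, 1024))     # placeholder Darknet53 features
inception_feats = rng.normal(size=(200, 2048))   # placeholder InceptionV3 features

fused = np.concatenate([darknet_feats, inception_feats], axis=1)

def feature_entropy(column: np.ndarray, bins: int = 32) -> float:
    counts, _ = np.histogram(column, bins=bins)
    return entropy(counts + 1e-12)               # Shannon entropy of the histogram

scores = np.array([feature_entropy(fused[:, j]) for j in range(fused.shape[1])])
keep = np.argsort(scores)[:512]                  # keep the 512 lowest-entropy features
selected = fused[:, keep]
print(selected.shape)                            # (200, 512)
```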
Abstract: Power transformers, as essential equipment for electricity transmission, may fail due to insulation degradation. Predicting the failure rate of power transformers precisely is beneficial to decision-making. Currently, the uncertainties of such predictions have not been deeply discussed, and prediction accuracy is not high enough. This paper proposes a decomposition-based Bayesian deep learning (BDL) method to predict the failure rate of power transformers. Both the model uncertainty related to the distribution of the model's weights and the inherent uncertainty associated with random noise can be captured by BDL. The uncertainties of the prediction results are depicted with confidence intervals. Moreover, prediction accuracy is improved using variational mode decomposition (VMD). Numerical experiments based on oil chromatographic data of power transformers from the Chongqing grid validate the effectiveness of the proposed method.
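As a simplified illustration of how a Bayesian-style deep model yields confidence intervals, the sketch below uses Monte Carlo dropout in PyTorch. The paper's BDL instead places distributions over the weights and couples the model with VMD, so this is only a proxy for the interval-estimation idea; the input dimensions and data are made up.

```python
# Sketch: prediction intervals from repeated stochastic forward passes (MC dropout).
import torch
import torch.nn as nn

class DropoutRegressor(nn.Module):
    def __init__(self, n_features: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 64), nn.ReLU(), nn.Dropout(0.2),
            nn.Linear(64, 64), nn.ReLU(), nn.Dropout(0.2),
            nn.Linear(64, 1),
        )

    def forward(self, x):
        return self.net(x)

model = DropoutRegressor(n_features=8)            # assumed number of input indicators
x = torch.randn(16, 8)                            # placeholder batch of inputs

model.train()                                     # keep dropout active at inference
with torch.no_grad():
    samples = torch.stack([model(x) for _ in range(100)])  # (100, 16, 1)

mean = samples.mean(dim=0)
lower = samples.quantile(0.025, dim=0)
upper = samples.quantile(0.975, dim=0)            # ~95% confidence band
print(mean.shape, lower.shape, upper.shape)
```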
Funding: supported by Researchers Supporting Project number (RSP2020/87), King Saud University, Riyadh, Saudi Arabia.
Abstract: Smart healthcare integrates an advanced wave of information technology, using smart devices to collect health-related medical science data. Such data usually exist in unstructured, noisy, incomplete, and heterogeneous forms. Handling these limitations remains an open challenge for deep learning when classifying health conditions. In this paper, a long short-term memory (LSTM) based health condition prediction framework is proposed to rectify imbalanced and noisy data and transform it into a useful form for predicting health conditions accurately. The imbalanced and scarce data is normalized through coding to gain consistency, using the synthetic minority oversampling technique. The proposed model is optimized and fine-tuned in an end-to-end manner, selecting ideal parameters with a tree Parzen estimator to build a probabilistic model. The patient's medication is pigeonholed to plot the diabetic condition's risk factor through an algorithm that classifies blood glucose metrics using a modern surveillance error grid method. The proposed model can efficiently train, validate, and test noisy data, obtaining consistent results of around 90%, exceeding state-of-the-art machine and deep learning techniques, and overcoming the insufficiency of training data through transfer learning. The overall results of the proposed model are further tested with secondary datasets to verify model sustainability.
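The rebalancing step can be illustrated with imbalanced-learn's SMOTE on a synthetic dataset, as sketched below; the real framework applies it to coded health-condition records before LSTM training.

```python
# Sketch: rebalance a skewed class distribution with SMOTE.
from collections import Counter
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=1000, n_classes=2, weights=[0.9, 0.1],
                           n_informative=5, random_state=0)
print("before:", Counter(y))                      # roughly 900 vs 100 samples

X_res, y_res = SMOTE(random_state=0).fit_resample(X, y)
print("after:", Counter(y_res))                   # classes now balanced
```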
Abstract: Considering the recent developments in deep learning, it has become increasingly important to verify which methods are valid for the prediction of multivariate time-series data. In this study, we propose a novel method of time-series prediction employing multiple deep learners combined with a Bayesian network, where the training data is divided into clusters using K-means clustering. The number of clusters for K-means is chosen with the Bayesian information criterion, and a separate deep learner is trained for each cluster. We used three types of deep learners: deep neural network (DNN), recurrent neural network (RNN), and long short-term memory (LSTM). A naive Bayes classifier is used to determine which deep learner is in charge of predicting a particular time-series. Our proposed method is applied to a set of financial time-series data, the Nikkei Average Stock price, to assess the accuracy of the predictions made. Compared with the conventional method of employing a single deep learner on all the data, our proposed method demonstrates improved F-value and accuracy.
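A minimal sketch of the routing idea follows: K-means groups training windows, and a naive Bayes classifier later decides which cluster-specific learner should handle a new window. The per-cluster deep learners and the BIC-based choice of the cluster count are omitted, and the data here is synthetic.

```python
# Sketch: cluster training windows, then route new windows to a per-cluster learner.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
windows = rng.normal(size=(500, 20))              # 500 windows of 20 past values

# Cluster count fixed at 3 for brevity; the paper selects it via BIC.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(windows)
labels = kmeans.labels_                           # cluster id per training window

router = GaussianNB().fit(windows, labels)        # learns which learner to use

new_window = rng.normal(size=(1, 20))
cluster_id = int(router.predict(new_window)[0])
print(f"route this window to deep learner #{cluster_id}")
```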
Abstract: One example of an artificial intelligence ethical dilemma is the autonomous vehicle situation presented by Massachusetts Institute of Technology researchers in the Moral Machine Experiment. To solve such dilemmas, the MIT researchers used a classic statistical method known as the hierarchical Bayesian (HB) model. This paper builds upon previous work on modeling moral decision making, applies a deep learning method to learn human ethics in this context, and compares it to the HB approach. These methods were tested to predict the moral decisions of simulated populations of Moral Machine participants. Overall, the test results indicate that deep neural networks can be effective in learning the group morality of a population through observation, and outperform the Bayesian model in cases of model mismatch.
Funding: The authors extend their appreciation to the King Salman Centre for Disability Research for funding this work through Research Group no. KSRG-2022-017.
Abstract: Sign language recognition can be treated as one of the efficient solutions for disabled people to communicate with others. It helps them to convey the required information by the use of sign language with no issues. The latest developments in computer vision and image processing techniques can be accurately utilized for the sign recognition process by disabled people. American Sign Language (ASL) detection has been challenging because of high intraclass similarity and higher complexity. This article develops a new Bayesian Optimization with Deep Learning-Driven Hand Gesture Recognition Based Sign Language Communication (BODL-HGRSLC) model for disabled people. The BODL-HGRSLC technique aims to recognize the hand gestures for disabled people's communication. The presented BODL-HGRSLC technique integrates the concepts of computer vision (CV) and DL models. In the presented BODL-HGRSLC technique, a deep convolutional neural network-based residual network (ResNet) model is applied for feature extraction. Besides, the presented BODL-HGRSLC model uses Bayesian optimization for the hyperparameter tuning process. At last, a bidirectional gated recurrent unit (BiGRU) model is exploited for the HGR procedure. A wide range of experiments was conducted to demonstrate the enhanced performance of the presented BODL-HGRSLC model. The comprehensive comparison study reported the improvements of the BODL-HGRSLC model over other DL models, with a maximum accuracy of 99.75%.
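A hedged sketch of the recognition head described above: a bidirectional GRU in PyTorch over a sequence of per-frame ResNet feature vectors, ending in a gesture classifier. The feature dimension, frame count, and number of gesture classes are assumptions, not values from the paper.

```python
# Sketch: BiGRU classifier over a sequence of per-frame ResNet features.
import torch
import torch.nn as nn

class BiGRUClassifier(nn.Module):
    def __init__(self, feat_dim: int = 2048, hidden: int = 256, n_classes: int = 26):
        super().__init__()
        self.gru = nn.GRU(feat_dim, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                          # x: (batch, frames, feat_dim)
        out, _ = self.gru(x)
        return self.fc(out[:, -1, :])              # logits from the last time step

features = torch.randn(4, 16, 2048)                # 4 clips, 16 frames of ResNet features
logits = BiGRUClassifier()(features)
print(logits.shape)                                # torch.Size([4, 26])
```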