The adoption of sustainable electronic healthcare infrastructure has revolutionized healthcare services and ensured that e-health technology caters efficiently and promptly to the needs of the stakeholders associated with healthcare. Despite the phenomenal advancement in present healthcare services, the major obstacle that mars the success of e-health is the issue of ensuring the confidentiality and privacy of patients' data. A thorough scan of several research studies reveals that healthcare data continue to be the entity most sought after by cyber invaders. Researchers have practiced various approaches and methods to secure digital healthcare services. However, very few come from the machine learning (ML) domain, even though the technique has the proactive ability to detect suspicious accesses against Electronic Health Records (EHRs). The main aim of this work is to conduct a systematic analysis of the existing research studies that address healthcare data confidentiality issues through ML approaches. B. A. Kitchenham's guidelines were followed as a manual for conducting this work. Seven well-known digital libraries, namely IEEE Xplore, ScienceDirect, SpringerLink, ACM Digital Library, Wiley Online Library, PubMed (Medical and Bio-Science), and MDPI, were included to perform an exhaustive search for the existing pertinent studies. The results of this study show that machine learning provides a more robust security mechanism for sustainable, proactive management of EHR systems, yet researchers have not fully explored this area. The k-nearest neighbor algorithm and the KNIME implementation tool are most frequently used to conduct experiments on EHR systems' log data. The accuracy and performance measures of the practiced techniques are not sufficiently outlined in the primary studies. This research endeavour shows that the dynamic digital healthcare environment needs to be analyzed more comprehensively. Greater accuracy and effective implementation of ML-based models are needed to ensure the confidentiality of EHRs in a proactive fashion.
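The k-nearest-neighbor approach that the review highlights can be pictured with a toy sketch. This is not any surveyed study's code; the access-log features, values, and the anomaly threshold are entirely hypothetical. It only illustrates how distance to known-normal access events can score a suspicious EHR access:

```python
import math

# Hypothetical access-log features: (hour_of_access, records_viewed)
normal_logs = [(9, 3), (10, 2), (14, 4), (11, 1), (15, 3)]

def knn_anomaly_score(event, logs, k=2):
    """Mean Euclidean distance to the k nearest normal accesses.

    A large score means the event sits far from routine behaviour
    and may warrant review.
    """
    dists = sorted(math.dist(event, x) for x in logs)
    return sum(dists[:k]) / k

print(knn_anomaly_score((10, 3), normal_logs))  # small: routine daytime read
print(knn_anomaly_score((3, 40), normal_logs))  # large: night-time bulk read
```

In a real deployment the features would be engineered from EHR audit logs, and the score threshold tuned against labelled incidents.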
Wireless sensor networks are collections of intelligent sensor devices that are connected to one another and can exchange information packets amongst themselves. In recent years, this field of research has become increasingly popular due to the host of useful applications it can potentially serve. A deep analysis of the concepts associated with this domain reveals that the two main problems to be tackled are throughput enhancement and network security improvement. The present article takes on the first of these two issues, namely throughput enhancement. To improve network productivity, a hybrid clustering-based packet propagation protocol is proposed. The protocol makes use not only of clustering mechanisms from machine learning but also of the traditional forwarding-function approach to arrive at an optimum model. The result of the simulation is a novel transmission protocol that significantly enhances network productivity and increases the throughput value.
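The clustering step underlying such protocols can be sketched minimally. This is not the proposed protocol itself; the node coordinates, cluster-head positions, and cluster count are hypothetical. It only shows how nodes can be grouped around cluster heads so that packets hop to the nearest head instead of travelling to the sink individually:

```python
import math

# Hypothetical 2-D sensor positions and two pre-selected cluster heads.
nodes = [(1, 1), (2, 1), (1, 2), (8, 8), (9, 8), (8, 9)]
heads = [(1, 1), (8, 8)]

def assign_clusters(nodes, heads):
    """Map each node to the index of its nearest cluster head."""
    return {n: min(range(len(heads)), key=lambda i: math.dist(n, heads[i]))
            for n in nodes}

clusters = assign_clusters(nodes, heads)
# Nodes near (1, 1) join cluster 0; nodes near (8, 8) join cluster 1.
```

A full protocol would additionally rotate head roles by residual energy and fall back to a forwarding function for inter-cluster relay.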
Vehicle detection remains a challenge for intelligent transportation systems (ITS) seeking satisfactory performance. Existing one-stage and two-stage methods have intrinsic weaknesses in achieving high vehicle detection performance. Owing to advances in detection technology, deep learning-based methods for vehicle detection have become more popular because they offer higher detection accuracy and speed than existing algorithms. This paper presents a robust vehicle detection technique based on an improved You Only Look Once model (RVD-YOLOv5) to enhance vehicle detection accuracy. The proposed method works in three phases. In the first phase, the K-means algorithm clusters the datasets to generate the classes of the objects. In the second phase, YOLOv5 creates the bounding boxes, and the Non-Maximum Suppression (NMS) technique eliminates overlapping vehicle bounding boxes. In the third phase, the CIoU loss function is employed to obtain an accurate regression bounding box for the vehicle. Simulation results show that the proposed method outperforms other state-of-the-art techniques, namely the Lightweight Dilated Convolutional Neural Network (LD-CNN), Single Shot Detector (SSD), YOLOv3, and YOLOv4, on performance metrics such as precision, recall, mAP, and F1-score. The simulations and analysis are carried out on the PASCAL VOC 2007 and 2012 and MS COCO 2017 datasets. RVD-YOLOv5 achieves an mAP of 98.6%, with precision, recall, and F1-score of 98%, 96.2%, and 97.09%, respectively.
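The NMS step in the second phase can be illustrated with a standalone sketch (not the paper's implementation; the box coordinates, scores, and IoU threshold are hypothetical):

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)

    def area(r):
        return (r[2] - r[0]) * (r[3] - r[1])

    return inter / (area(a) + area(b) - inter)

def nms(boxes, scores, thresh=0.5):
    """Keep the highest-scoring boxes, dropping overlaps above `thresh` IoU."""
    order = sorted(range(len(boxes)), key=lambda i: -scores[i])
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) <= thresh for j in keep):
            keep.append(i)
    return keep

boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (50, 50, 60, 60)]
scores = [0.9, 0.8, 0.7]
print(nms(boxes, scores))  # the second box overlaps the first and is dropped
```

YOLOv5 in practice performs this per class on confidence-filtered predictions; the idea is the same.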
The coronavirus disease known as COVID-19 has caused massive global disasters. As a precaution, most governments imposed quarantine periods ranging from months to years and postponed significant financial obligations. Furthermore, governments around the world have used cutting-edge technologies to track citizens' activity, connecting thousands of sensors to IoT (Internet of Things) devices to monitor the catastrophic outbreak. With billions of connected devices using these novel tools and apps, privacy and security issues regarding data transmission and storage abound. In this study, we suggest a blockchain-based methodology for safeguarding the data of the billions of devices and sensors connected over the Internet. The model's secrecy and safety properties are based on cutting-edge cryptography. To evaluate the proposed model, we recommend an application of the system using a Raspberry Pi single-board computer in an IoT system, a laptop, a desktop computer, cell phones, and the Ethereum smart contract platform. The model's ability to ensure safety, effectiveness, and a suitable budget is demonstrated by the results on the Gowalla dataset.
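The core property a blockchain contributes here, tamper-evidence for stored readings, can be sketched without any smart-contract machinery. This is an illustrative hash chain, not the paper's Ethereum implementation; the field names and sensor payloads are hypothetical:

```python
import hashlib
import json

def make_block(prev_hash, payload):
    """Build a block whose hash covers its payload and the previous hash."""
    body = json.dumps({"prev": prev_hash, "data": payload}, sort_keys=True)
    return {"prev": prev_hash, "data": payload,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify(chain):
    """True iff every block's hash matches its contents and its back-link."""
    for i, blk in enumerate(chain):
        body = json.dumps({"prev": blk["prev"], "data": blk["data"]},
                          sort_keys=True)
        if blk["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        if i > 0 and blk["prev"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block("0" * 64, {"sensor": "temp-01", "reading": 21.5})
chain = [genesis,
         make_block(genesis["hash"], {"sensor": "temp-01", "reading": 22.0})]
assert verify(chain)
chain[0]["data"]["reading"] = 99.9  # tamper with a stored IoT reading
assert not verify(chain)            # the altered block no longer verifies
```

A production system would add consensus, signatures, and on-chain access control on top of this linking scheme.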
Deep learning is the process of determining parameters that reduce the cost function derived from the dataset; the parameters obtained at the end of this optimization are known as the optimal parameters. To perform the optimization, the parameters are initialized and then updated during the optimization process, and at the global minimum the parameters should no longer vary. The momentum technique is a parameter-optimization approach; however, it has difficulty stopping the parameter updates once the cost function value reaches the global minimum (the non-stop problem). Moreover, existing approaches use schedules in which the learning rate is reduced over the iterations, and these schedules decrease monotonically at a steady rate over time; our goal is instead to make the learning rate responsive to the parameters. We present a method for determining the best parameters that adjusts the learning rate in response to the cost function value. As a result, once the cost function has been optimized, the learning-rate schedule is complete. This approach is shown to ensure convergence to the optimal parameters, which indicates that our strategy minimizes the cost function (i.e., learns effectively). The proposed method builds on the momentum approach; to solve its non-stop problem, we use the cost function value of the parameters, so that the learning rule shrinks the parameter updates as the cost function decreases. To verify that the learning works, we provide a proof of convergence and empirical tests against current methods; the results are obtained using Python.
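The non-stop problem and a cost-driven remedy can be sketched on a toy objective. This is not the authors' exact algorithm: the objective f(w) = (w - 3)^2, the constants, and the stopping tolerance are all hypothetical. It shows classical momentum halted by a rule that watches the cost function value:

```python
def f(w):
    return (w - 3) ** 2          # cost; global minimum 0 at w = 3

def grad(w):
    return 2 * (w - 3)

w, v = 0.0, 0.0                  # parameter and velocity (momentum buffer)
beta, lr = 0.9, 0.05
for step in range(10_000):
    if f(w) < 1e-8:              # cost-based rule: stop at the minimum
        break                    # (plain momentum would keep coasting)
    v = beta * v - lr * grad(w)  # classical momentum update
    w += v

print(round(w, 4), step)         # w has converged to approximately 3
```

Without the `if f(w) < 1e-8` check, the accumulated velocity keeps nudging `w` after the minimum is reached, which is the coasting behaviour the abstract calls the non-stop problem.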
Suspicious mass traffic constantly evolves, making network behaviour tracing and structure more complex. Neural networks yield promising results by employing a sufficient number of processing elements with strong interconnections between them. Hopfield neural network models offer efficient computation under optimization constraints by exploiting a good amount of parallelism to yield optimal results. Artificial neural networks (ANNs) offer optimal solutions for classifying and clustering various streams of data, and the results obtained depend purely on how the problem is identified. In this research work, the design of optimized applications is presented in an organized manner. In addition, this work examines theoretical approaches to achieving optimized results using ANNs, focusing mainly on design rules. The optimizing design approach analyzes the internal processes of the neural network; practices in developing the network are based on the interconnections among the hidden nodes and their learning parameters. The methodology proves best for nonlinear resource-allocation problems with a suitable design and for complex issues. The ANN proposed here comprises roughly 46k hidden nodes within 49 million connections, deployed on full-fledged parallel processors. The proposed ANN delivered optimal results on real-world application problems; the results were obtained using MATLAB.
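The Hopfield mechanism the abstract refers to can be shown at miniature scale. This sketch is purely illustrative (the weight matrix and network size are hypothetical, far from the 46k-node system described): binary units are flipped asynchronously so the network's energy never increases, settling into a locally optimal state that encodes a solution to the constraints baked into the weights:

```python
# Symmetric weights with zero diagonal encode the optimization constraints.
W = [[0, 1, -2],
     [1, 0, 1],
     [-2, 1, 0]]

def energy(s):
    """Hopfield energy E = -1/2 * sum_ij W[i][j] * s[i] * s[j]."""
    return -0.5 * sum(W[i][j] * s[i] * s[j]
                      for i in range(3) for j in range(3))

def settle(s):
    """Asynchronously update units until no update lowers the energy."""
    s = list(s)
    changed = True
    while changed:
        changed = False
        for i in range(3):
            field = sum(W[i][j] * s[j] for j in range(3))
            new = 1 if field >= 0 else -1
            if new != s[i]:
                s[i], changed = new, True
    return s

stable = settle([1, 1, 1])
assert energy(stable) <= energy([1, 1, 1])  # updates never move uphill
```

With symmetric weights and a zero diagonal, each asynchronous flip is guaranteed not to raise the energy, which is why the dynamics terminate in a stable state.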
Funding: This research was supported by Taif University Researchers Supporting Project under Grant No. TURSP-2020/211, Taif University, Taif, Saudi Arabia.
Funding: This research was funded by Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2022TR140), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Funding: This research was funded by Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2022R79), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Funding: This research is funded by Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2022R151), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.