Journal Articles
130 articles found
1. IoT-Based Real-Time Medical-Related Human Activity Recognition Using Skeletons and Multi-Stage Deep Learning for Healthcare (Cited by: 1)
Authors: Subrata Kumer Paul, Abu Saleh Musa Miah, Rakhi Rani Paul, Md. Ekramul Hamid, Jungpil Shin, Md Abdur Rahim. Computers, Materials & Continua, 2025, Issue 8, pp. 2513–2530 (18 pages)
The Internet of Things (IoT) and mobile technology have significantly transformed healthcare by enabling real-time monitoring and diagnosis of patients. Recognizing Medical-Related Human Activities (MRHA) is pivotal for healthcare systems, particularly for identifying actions critical to patient well-being. However, challenges such as high computational demands, low accuracy, and limited adaptability persist in Human Motion Recognition (HMR). While some studies have integrated HMR with IoT for real-time healthcare applications, limited research has focused on recognizing MRHA as essential for effective patient monitoring. This study proposes a novel HMR method tailored for MRHA detection, leveraging multi-stage deep learning techniques integrated with IoT. The approach employs EfficientNet to extract optimized spatial features from skeleton frame sequences using seven Mobile Inverted Bottleneck Convolution (MBConv) blocks, followed by Convolutional Long Short-Term Memory (ConvLSTM) to capture spatio-temporal patterns. A classification module with global average pooling, a fully connected layer, and a dropout layer generates the final predictions. The model is evaluated on the NTU RGB+D 120 and HMDB51 datasets, focusing on MRHA such as sneezing, falling, walking, and sitting. It achieves 94.85% accuracy for cross-subject evaluations and 96.45% for cross-view evaluations on NTU RGB+D 120, along with 89.22% accuracy on HMDB51. Additionally, the system integrates IoT capabilities using a Raspberry Pi and a GSM module, delivering real-time alerts via Twilio's SMS service to caregivers and patients. This scalable and efficient solution bridges the gap between HMR and IoT, advancing patient monitoring, improving healthcare outcomes, and reducing costs.
Keywords: Real-time human motion recognition (HMR); ENConvLSTM; EfficientNet; ConvLSTM; skeleton data; NTU RGB+D 120 dataset; MRHA
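For illustration, the multi-stage pipeline this abstract describes (per-frame spatial features, then ConvLSTM, then a pooling/dense/dropout head) can be sketched in Keras. The frame count, image size, class count, and use of the stock EfficientNetB0 are assumptions, not the paper's exact configuration:

```python
# Minimal sketch of an EfficientNet + ConvLSTM activity classifier (assumed shapes).
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 12                 # assumption: number of MRHA classes
FRAMES, H, W, C = 16, 64, 64, 3  # assumption: skeleton frames rendered as images

# Stage 1: per-frame spatial features from an EfficientNet-style backbone.
backbone = tf.keras.applications.EfficientNetB0(
    include_top=False, weights=None, input_shape=(H, W, C))

inputs = layers.Input(shape=(FRAMES, H, W, C))
x = layers.TimeDistributed(backbone)(inputs)          # (frames, h', w', feat)

# Stage 2: ConvLSTM captures spatio-temporal patterns across frames.
x = layers.ConvLSTM2D(64, kernel_size=3, padding="same")(x)

# Stage 3: classification head (global average pooling + dense + dropout).
x = layers.GlobalAveragePooling2D()(x)
x = layers.Dense(128, activation="relu")(x)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```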
2. A Hybrid Machine Learning and Blockchain Framework for IoT DDoS Mitigation
Authors: Singamaneni Krishnapriya, Sukhvinder Singh. Computer Modeling in Engineering & Sciences, 2025, Issue 8, pp. 1849–1881 (33 pages)
The explosive expansion of Internet of Things (IoT) systems has increased the need for strong and robust cybersecurity solutions, especially to curtail Distributed Denial of Service (DDoS) attacks, which can cripple critical infrastructure. The framework presented in this paper is a new hybrid scheme that combines deep learning-based traffic classification with blockchain-enabled mitigation to provide intelligent, decentralized, and real-time DDoS countermeasures in IoT networks. The proposed model fuses deep features extracted by a Convolutional Neural Network (CNN) architecture with statistical features and trains traditional machine-learning algorithms on them, which makes detection more accurate than using statistical features alone. A permissioned blockchain is included to record threat cases immutably and automatically execute mitigation measures through smart contracts, providing transparency and resilience. When tested on two test sets, BoT-IoT and IoT-23, the framework obtains a maximum F1-score of 97.5% and only a 1.8% false positive rate, which compares favorably to other solutions in terms of effectiveness and response time. Our findings support the feasibility of the method as an extensible and secure paradigm for next-generation IoT security, with practical utility in mission-critical or resource-constrained settings. The work is a substantial milestone toward autonomous and trustworthy DDoS mitigation through intelligent learning and decentralized enforcement.
Keywords: IoT security; DDoS mitigation; machine learning; CNN; random forest; blockchain; smart contracts; cyberattack detection
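A minimal sketch of the feature-fusion idea described here: CNN-derived deep features are concatenated with statistical features and passed to a classical classifier (a random forest, per the keywords). Feature counts and the toy data are assumptions, and in practice the CNN would be trained first rather than used with random weights:

```python
# Sketch: fuse CNN deep features with statistical features, classify with a random forest.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models
from sklearn.ensemble import RandomForestClassifier

N_PACKET_FEATURES = 40   # assumption: per-flow numeric features (e.g., from BoT-IoT)
N_STAT_FEATURES = 10     # assumption: separate statistical summaries per flow

# Small 1D CNN used as a feature extractor over the raw flow features.
cnn = models.Sequential([
    layers.Input(shape=(N_PACKET_FEATURES, 1)),
    layers.Conv1D(32, 3, activation="relu"),
    layers.MaxPooling1D(2),
    layers.Conv1D(64, 3, activation="relu"),
    layers.GlobalAveragePooling1D(),   # deep feature vector, length 64
])

def fused_features(flow_matrix, stat_matrix):
    """Concatenate CNN deep features with statistical features."""
    deep = cnn.predict(flow_matrix[..., np.newaxis], verbose=0)
    return np.concatenate([deep, stat_matrix], axis=1)

# Toy data standing in for labelled benign/DDoS flows.
X_flows = np.random.rand(256, N_PACKET_FEATURES)
X_stats = np.random.rand(256, N_STAT_FEATURES)
y = np.random.randint(0, 2, size=256)   # 0 = benign, 1 = DDoS

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(fused_features(X_flows, X_stats), y)
print("training accuracy:", clf.score(fused_features(X_flows, X_stats), y))
```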
3. Efficient Resource Management in IoT Network through ACOGA Algorithm
Authors: Pravinkumar Bhujangrao Landge, Yashpal Singh, Hitesh Mohapatra, Seyyed Ahmad Edalatpanah. Computer Modeling in Engineering & Sciences, 2025, Issue 5, pp. 1661–1688 (28 pages)
Internet of Things networks often suffer from early node failures and short lifespans due to energy limits, and traditional routing methods are not enough. This work proposes a new hybrid algorithm called ACOGA, which combines Ant Colony Optimization (ACO) and the Greedy Algorithm (GA): ACO finds smart paths while the greedy component makes quick decisions, improving energy use and performance. ACOGA outperforms the Hybrid Energy-Efficient (HEE) and Adaptive Lossless Data Compression (ALDC) algorithms. After 500 rounds, only 5% of ACOGA's nodes are dead, compared to 15% for HEE and 20% for ALDC. The network using ACOGA runs for 1200 rounds before the first nodes fail; HEE lasts 900 rounds and ALDC only 850. ACOGA saves at least 15% more energy by better distributing the load, and it achieves a 98% packet delivery rate. The method works well in mixed IoT networks such as Smart Water Management Systems (SWMS), where nodes have different power levels and communication ranges. The proposed model was simulated in MATLAB, and the results show that it outperforms the existing models.
Keywords: energy management; IoT networks; ant colony optimization (ACO); greedy algorithm; hybrid optimization; routing algorithms; energy efficiency; network lifetime
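As a rough illustration of how an ACO-plus-greedy next-hop choice of this kind can look, here is a minimal sketch. The greedy probability, the pheromone/energy weighting, and the update constants are assumptions; ACOGA's actual rules are not specified in the abstract:

```python
# Sketch: next-hop selection mixing a greedy energy rule with ACO-style pheromone sampling.
import random

def choose_next_hop(current, neighbors, pheromone, energy,
                    alpha=1.0, beta=2.0, greedy_prob=0.3):
    """With probability `greedy_prob` take the greedy choice (highest residual
    energy); otherwise sample proportionally to pheromone^alpha * energy^beta."""
    if random.random() < greedy_prob:
        return max(neighbors, key=lambda n: energy[n])        # greedy decision
    weights = [(pheromone[(current, n)] ** alpha) * (energy[n] ** beta)
               for n in neighbors]
    r, acc = random.uniform(0, sum(weights)), 0.0
    for n, w in zip(neighbors, weights):
        acc += w
        if r <= acc:
            return n
    return neighbors[-1]

def deposit_pheromone(path, pheromone, evaporation=0.1, deposit=1.0):
    """Standard ACO update: evaporate everywhere, then reinforce the used path."""
    for edge in pheromone:
        pheromone[edge] *= (1.0 - evaporation)
    for u, v in zip(path, path[1:]):
        pheromone[(u, v)] += deposit

# Toy usage: node 0 forwards toward the sink through neighbours 1-3.
pheromone = {(0, 1): 1.0, (0, 2): 1.0, (0, 3): 1.0}
energy = {1: 0.9, 2: 0.4, 3: 0.7}
print(choose_next_hop(0, [1, 2, 3], pheromone, energy))
```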
4. Energy Efficient VM Selection Using CSOA-VM Model in Cloud Data Centers
Authors: Mandeep Singh Devgan, Tajinder Kumar, Purushottam Sharma, Xiaochun Cheng, Shashi Bhushan, Vishal Garg. CAAI Transactions on Intelligence Technology, 2025, Issue 4, pp. 1217–1234 (18 pages)
Cloud data centres face an energy-management problem due to their constant increase in size and complexity and their enormous consumption of energy. Energy management is a challenging and critical issue in cloud data centres and an important research concern for many researchers. In this paper, we propose a cuckoo search (CS)-based optimisation technique for virtual machine (VM) selection and a novel placement algorithm that considers several constraints. An energy consumption model and a simulation model have been implemented for efficient VM selection. The proposed CSOA-VM model not only lessens violations at the service level agreement (SLA) level but also minimises VM migrations. The proposed model also saves energy: the performance analysis shows that the energy consumption obtained is 1.35 kWh, the SLA violation is 9.2, and the number of VM migrations is about 268. Thus, there is an improvement in energy consumption of about 1.8% and a 2.1% improvement (reduction) in SLA violations compared with existing techniques.
Keywords: cloud computing; cloud data centre; energy consumption; VM selection
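A minimal sketch of a cuckoo-search loop for VM-to-host placement in the spirit of CSOA-VM is shown below. The energy model, Lévy-flight step, and encoding (one continuous value per VM, rounded to a host index) are assumptions, not the paper's model:

```python
# Sketch: cuckoo search with Lévy flights minimising a toy VM-placement energy cost.
import math
import numpy as np

N_VMS, N_HOSTS = 20, 5
VM_LOAD = np.random.uniform(0.05, 0.3, size=N_VMS)     # toy CPU demand per VM

def energy_cost(solution):
    """Toy energy model: idle cost per active host, load cost, overload penalty."""
    hosts = np.clip(solution.round().astype(int), 0, N_HOSTS - 1)
    load = np.array([VM_LOAD[hosts == h].sum() for h in range(N_HOSTS)])
    active = load > 0
    return (active * 0.6 + load).sum() + 10.0 * np.maximum(load - 1.0, 0).sum()

def levy_step(size, beta=1.5):
    """Mantegna's approximation of a Lévy-distributed step."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    return np.random.normal(0, sigma, size) / np.abs(np.random.normal(0, 1, size)) ** (1 / beta)

def cuckoo_search(n_nests=15, iters=200, pa=0.25):
    nests = np.random.uniform(0, N_HOSTS - 1, size=(n_nests, N_VMS))
    fitness = np.array([energy_cost(n) for n in nests])
    for _ in range(iters):
        best = nests[fitness.argmin()]
        for i in range(n_nests):
            candidate = np.clip(nests[i] + 0.1 * levy_step(N_VMS) * (nests[i] - best),
                                0, N_HOSTS - 1)
            if energy_cost(candidate) < fitness[i]:
                nests[i], fitness[i] = candidate, energy_cost(candidate)
        # Abandon a fraction pa of the worst nests and rebuild them randomly.
        worst = fitness.argsort()[-int(pa * n_nests):]
        nests[worst] = np.random.uniform(0, N_HOSTS - 1, size=(len(worst), N_VMS))
        fitness[worst] = [energy_cost(n) for n in nests[worst]]
    return nests[fitness.argmin()], fitness.min()

placement, cost = cuckoo_search()
print("host index per VM:", placement.round().astype(int), "energy cost:", round(cost, 3))
```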
5. Energy Efficient Clustering and Sink Mobility Protocol Using Hybrid Golden Jackal and Improved Whale Optimization Algorithm for Improving Network Longevity in WSNs
Authors: S B Lenin, R Sugumar, J S Adeline Johnsana, N Tamilarasan, R Nathiya. China Communications, 2025, Issue 3, pp. 16–35 (20 pages)
Reliable Cluster Head (CH) selection-based routing protocols are necessary for increasing packet transmission efficiency with optimal path discovery that never degrades transmission reliability. In this paper, the Hybrid Golden Jackal and Improved Whale Optimization Algorithm (HGJIWOA) is proposed as an effective and optimal routing protocol that guarantees efficient routing of data packets over the routes established between the CHs and the movable sink. HGJIWOA includes the phases of a Dynamic Lens-Imaging Learning Strategy and novel update rules for determining the reliable route essential for broadcasting data packets, attained through fitness-measure-estimation-based CH selection. CH selection is achieved using the Golden Jackal Optimization Algorithm (GJOA) and depends entirely on the factors of maintainability, consistency, trust, delay, and energy. The adopted GJOA algorithm plays a dominant role in determining the optimal routing path based on reduced delay and minimal distance. The protocol further utilises the Improved Whale Optimization Algorithm (IWOA) for forwarding data from the chosen CHs to the BS via an optimised route depending on energy and distance. It also includes a reliable route maintenance process that aids in deciding the selected route through which data need to be transmitted or re-routed. The simulation outcomes of the proposed HGJIWOA mechanism with different numbers of sensor nodes confirmed an improved mean throughput of 18.21%, sustained residual energy of 19.64%, and a minimised end-to-end delay of 21.82%, better than the competing CH selection approaches.
Keywords: Cluster Heads (CHs); Golden Jackal Optimization Algorithm (GJOA); Improved Whale Optimization Algorithm (IWOA); unequal clustering
6. Plant Disease Detection and Classification Using Hybrid Model Based on Convolutional Auto Encoder and Convolutional Neural Network
Authors: Tajinder Kumar, Sarbjit Kaur, Purushottam Sharma, Ankita Chhikara, Xiaochun Cheng, Sachin Lalar, Vikram Verma. Computers, Materials & Continua, 2025, Issue 6, pp. 5219–5234 (16 pages)
During its growth stages, a plant is exposed to various diseases. Early detection of crop diseases is a major challenge in the horticulture industry. Crop infections can harm total crop yield and reduce farmers' income if not identified early. Today's approved method involves a professional plant pathologist diagnosing the disease by visual inspection of the afflicted plant leaves. This is an excellent use case for Community Assessment and Treatment Services (CATS), because the manual disease-diagnosis process is lengthy and the accuracy of identification is directly proportional to the skills of the pathologist. An alternative to conventional Machine Learning (ML) methods, which require manual identification of parameters for exact results, is to develop a prototype that can perform classification without pre-processing. To automatically diagnose tomato leaf disease, this research proposes a hybrid model using a Convolutional Auto-Encoder (CAE) network and the CNN-based deep learning architecture of DenseNet. To date, none of the modern systems described in this paper has a combined model based on DenseNet, CAE, and a Convolutional Neural Network (CNN) to diagnose the ailments of tomato leaves automatically. The models were trained on a dataset obtained from the Plant Village repository. The dataset consisted of 9920 tomato leaves, and the model-to-model accuracy ratio was 98.35%. Unlike other approaches discussed in this paper, this hybrid strategy requires fewer training components. Therefore, the training time to classify plant diseases with the trained algorithm, as well as the time to automatically detect the ailments of tomato leaves, is significantly reduced.
Keywords: tomato leaf disease; deep learning; DenseNet-121; convolutional autoencoder; convolutional neural network
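A minimal sketch of how a CAE encoder can be combined with a DenseNet classifier, in the spirit of the hybrid described above. The image resolution, layer sizes, class count, and the way the two parts are joined are assumptions:

```python
# Sketch: convolutional auto-encoder feeding a DenseNet-121 classification head.
import tensorflow as tf
from tensorflow.keras import layers, models

IMG = 128          # assumption: input resolution
NUM_CLASSES = 10   # assumption: tomato leaf classes in PlantVillage

# Convolutional auto-encoder: trained to reconstruct leaf images so that its
# encoder learns compact, noise-robust features.
inp = layers.Input((IMG, IMG, 3))
e = layers.Conv2D(32, 3, activation="relu", padding="same")(inp)
e = layers.MaxPooling2D()(e)
e = layers.Conv2D(64, 3, activation="relu", padding="same")(e)
encoded = layers.MaxPooling2D()(e)                       # (32, 32, 64)
d = layers.Conv2DTranspose(64, 3, strides=2, activation="relu", padding="same")(encoded)
d = layers.Conv2DTranspose(32, 3, strides=2, activation="relu", padding="same")(d)
recon = layers.Conv2D(3, 3, activation="sigmoid", padding="same")(d)
autoencoder = models.Model(inp, recon)
autoencoder.compile(optimizer="adam", loss="mse")        # step 1: fit on leaf images

# Hybrid classifier: DenseNet-121 consumes the encoded representation,
# projected back to 3 channels so the stock DenseNet input can be reused.
proj = layers.Conv2D(3, 1, activation="relu")(encoded)
densenet = tf.keras.applications.DenseNet121(include_top=False, weights=None,
                                              input_shape=(IMG // 4, IMG // 4, 3))
f = densenet(proj)
f = layers.GlobalAveragePooling2D()(f)
out = layers.Dense(NUM_CLASSES, activation="softmax")(f)
classifier = models.Model(inp, out)
classifier.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                   metrics=["accuracy"])                 # step 2: fit on labelled leaves
```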
7. Advanced ECG Signal Analysis for Cardiovascular Disease Diagnosis Using AVOA Optimized Ensembled Deep Transfer Learning Approaches
Authors: Amrutanshu Panigrahi, Abhilash Pati, Bibhuprasad Sahu, Ashis Kumar Pati, Subrata Chowdhury, Khursheed Aurangzeb, Nadeem Javaid, Sheraz Aslam. Computers, Materials & Continua, 2025, Issue 7, pp. 1633–1657 (25 pages)
The integration of IoT and Deep Learning (DL) has significantly advanced real-time health monitoring and predictive maintenance in prognostics and health management (PHM). Electrocardiograms (ECGs) are widely used for cardiovascular disease (CVD) diagnosis, but fluctuating signal patterns make classification challenging. Computer-assisted automated diagnostic tools that enhance ECG signal categorization using sophisticated algorithms and machine learning are helping healthcare practitioners manage larger patient populations. With this motivation, the study proposes a DL framework leveraging the PTB-XL ECG dataset to improve CVD diagnosis. Deep Transfer Learning (DTL) techniques extract features, followed by feature fusion to eliminate redundancy and retain the most informative features. Utilizing the African Vulture Optimization Algorithm (AVOA) for feature selection is more effective than standard methods, as it offers an ideal balance between exploration and exploitation that results in an optimal feature set, improving classification performance while reducing redundancy. Various machine learning classifiers, including Support Vector Machine (SVM), eXtreme Gradient Boosting (XGBoost), Adaptive Boosting (AdaBoost), and Extreme Learning Machine (ELM), are used for the subsequent classification. Additionally, an ensemble model is developed to further improve accuracy. Experimental results demonstrate that the proposed model achieves the highest accuracy of 96.31%, highlighting its effectiveness in enhancing CVD diagnosis.
Keywords: prognostics and health management (PHM); cardiovascular disease (CVD); electrocardiograms (ECGs); deep transfer learning (DTL); African vulture optimization algorithm (AVOA)
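A minimal sketch of the classification stage outlined above: a binary feature mask (standing in for the AVOA-selected subset) filters fused deep features, and a soft-voting ensemble of SVM, XGBoost, and AdaBoost makes the prediction. The AVOA search itself is not reproduced; the mask, data shapes, and labels are assumptions:

```python
# Sketch: feature mask + soft-voting ensemble over fused deep-transfer-learning features.
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import AdaBoostClassifier, VotingClassifier
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Toy stand-in for fused DTL features extracted from ECG records.
X = np.random.rand(600, 256)
y = np.random.randint(0, 2, size=600)          # 0 = normal, 1 = CVD

# Feature mask that an optimizer such as AVOA would produce (here: random).
mask = np.random.rand(256) > 0.5
X_sel = X[:, mask]

X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.2, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("svm", SVC(probability=True)),        # probability=True enables soft voting
        ("xgb", XGBClassifier(eval_metric="logloss")),
        ("ada", AdaBoostClassifier()),
    ],
    voting="soft",
)
ensemble.fit(X_tr, y_tr)
print("held-out accuracy:", ensemble.score(X_te, y_te))
```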
8. A Sine and Wormhole Energy Whale Optimization Algorithm for Optimal FACTS Placement in Uncertain Wind Integrated Scenario Based Power Systems
Authors: Sunilkumar P. Agrawal, Pradeep Jangir, Arpita, Sundaram B. Pandya, Anil Parmar, Ahmad O. Hourani, Bhargavi Indrajit Trivedi. Journal of Bionic Engineering, 2025, Issue 4, pp. 2115–2134 (20 pages)
The Sine and Wormhole Energy Whale Optimization Algorithm (SWEWOA) is an advanced method for resolving Optimal Power Flow (OPF) problems in power systems equipped with Flexible AC Transmission System (FACTS) devices, which include the Thyristor-Controlled Series Compensator (TCSC), Thyristor-Controlled Phase Shifter (TCPS), and Static Var Compensator (SVC). SWEWOA extends the Whale Optimization Algorithm (WOA) through the integration of sine and wormhole energy features, thus improving exploration and exploitation capabilities for efficient convergence in complex non-linear OPF problems. A performance evaluation of SWEWOA takes place on the IEEE-30 bus test system under static and dynamic loading scenarios, where it demonstrates better results than five contemporary algorithms: Adaptive Chaotic WOA (ACWOA), WOA, Chaotic WOA (CWOA), Sine Cosine Algorithm Differential Evolution (SCADE), and Hybrid Grey Wolf Optimization (HGWO). The research shows that SWEWOA delivers superior generation cost reduction, reaching at least 0.9% better performance than the other algorithms. SWEWOA also achieves the lowest minimum power loss (P_loss,min) compared to all other tested algorithms, which leads to better system energy efficiency. The dynamic loading performance of SWEWOA leads to a 4.38% reduction in gross costs, which proves its capability to handle different operating conditions. The algorithm achieves top performance in Friedman Rank Test (FRT) assessments across multiple performance metrics, which verifies its consistent reliability and strong stability under changing power demands. Repeated simulations show that SWEWOA generates mean costs (C_gen,min) and mean power loss values (P_loss,min) with small deviations, indicating its capability to maintain cost-effective solutions in each simulation run. Through the results presented in this study, SWEWOA demonstrates great potential as an advanced optimization solution for power system operations.
Keywords: Sine and Wormhole Energy Whale Optimization Algorithm (SWEWOA); optimal power flow (OPF); wind integration; FACTS devices; power system optimization
9. Quantum-Resistant Cryptographic Primitives Using Modular Hash Learning Algorithms for Enhanced SCADA System Security
Authors: Sunil K. Singh, Sudhakar Kumar, Manraj Singh, Savita Gupta, Razaz Waheeb Attar, Varsha Arya, Ahmed Alhomoud, Brij B. Gupta. Computers, Materials & Continua, 2025, Issue 8, pp. 3927–3941 (15 pages)
As quantum computing continues to advance, traditional cryptographic methods are increasingly challenged, particularly when it comes to securing critical systems such as Supervisory Control and Data Acquisition (SCADA) systems. These systems are essential for monitoring and controlling industrial operations, making their security paramount. A key threat arises from Shor's algorithm, a powerful quantum computing tool that can compromise current hash functions, leading to significant concerns about data integrity and confidentiality. To tackle these issues, this article introduces a novel Quantum-Resistant Hash Algorithm (QRHA) known as the Modular Hash Learning Algorithm (MHLA). This algorithm is meticulously crafted to withstand potential quantum attacks by incorporating advanced mathematical and algorithmic techniques, enhancing its overall security framework. Our research delves into the effectiveness of MHLA in defending against both traditional and quantum-based threats, with a particular emphasis on its resilience to Shor's algorithm. The findings demonstrate that MHLA significantly enhances the security of SCADA systems in the context of quantum technology. By ensuring that sensitive data remain protected and confidential, MHLA not only fortifies individual systems but also contributes to the broader effort of safeguarding industrial and infrastructure control systems against future quantum threats. Our evaluation demonstrates that MHLA improves security by 38% against quantum-attack simulations compared to traditional hash functions while maintaining a computational efficiency of O(m·n·k + v + n). The algorithm achieved a 98% success rate in detecting data tampering during integrity testing. These findings underline MHLA's effectiveness in enhancing SCADA system security amidst evolving quantum technologies. This research represents a crucial step toward developing more secure cryptographic systems that can adapt to a rapidly changing technological landscape, ultimately ensuring the reliability and integrity of critical infrastructure in an era where quantum computing poses a growing risk.
Keywords: hash functions; post-quantum cryptography; quantum-resistant hash functions; network security; supervisory control and data acquisition (SCADA)
10. Enhancing Healthcare Data Privacy in Cloud IoT Networks Using Anomaly Detection and Optimization with Explainable AI (ExAI)
Authors: Jitendra Kumar Samriya, Virendra Singh, Gourav Bathla, Meena Malik, Varsha Arya, Wadee Alhalabi, Brij B. Gupta. Computers, Materials & Continua, 2025, Issue 8, pp. 3893–3910 (18 pages)
The integration of the Internet of Things (IoT) into healthcare systems improves patient care, boosts operational efficiency, and contributes to cost-effective healthcare delivery. However, overcoming several associated challenges, such as data security, interoperability, and ethical concerns, is crucial to realizing the full potential of IoT in healthcare. Real-time anomaly detection plays a key role in protecting patient data and maintaining device integrity amidst the additional security risks posed by interconnected systems. In this context, this paper presents a novel method for healthcare data privacy analysis. The technique is based on the identification of anomalies in cloud-based IoT networks, and it is optimized using explainable artificial intelligence. For anomaly detection, the Radial Boltzmann Gaussian Temporal Fuzzy Network (RBGTFN) is used in the privacy analysis of healthcare data. Remora Colony Swarm Optimization is then used to optimize the network. The performance of the model in identifying anomalies across a variety of healthcare data is evaluated in an experimental study that measures the accuracy, precision, latency, Quality of Service (QoS), and scalability of the model. A remarkable 95% precision, 93% latency, 89% quality of service, 98% detection accuracy, and 96% scalability were obtained by the suggested model, as shown by the findings.
Keywords: healthcare data privacy analysis; anomaly detection; cloud IoT network; explainable artificial intelligence; temporal fuzzy network
11. Cardiovascular Sound Classification Using Neural Architectures and Deep Learning for Advancing Cardiac Wellness
Authors: Deepak Mahto, Sudhakar Kumar, Sunil K. Singh, Amit Chhabra, Irfan Ahmad Khan, Varsha Arya, Wadee Alhalabi, Brij B. Gupta, Bassma Saleh Alsulami. Computer Modeling in Engineering & Sciences, 2025, Issue 6, pp. 3743–3767 (25 pages)
Cardiovascular diseases (CVDs) remain one of the foremost causes of death globally; hence the need for advanced, automated diagnostic solutions for early detection and intervention. Traditional auscultation of cardiovascular sounds is heavily reliant on clinical expertise and subject to high variability. To counter this limitation, this study proposes an AI-driven classification system for cardiovascular sounds in which deep learning techniques are engaged to automate the detection of abnormal heartbeats. We employ FastAI vision-learner-based convolutional neural networks (CNNs), including ResNet, DenseNet, VGG, ConvNeXt, SqueezeNet, and AlexNet, to classify heart sound recordings. Instead of raw waveform analysis, the proposed approach transforms preprocessed cardiovascular audio signals into spectrograms, which are suited to capturing temporal and frequency-wise patterns. The models are trained on the PASCAL Cardiovascular Challenge dataset while taking into consideration recording variations, noise levels, and acoustic distortions. To demonstrate generalization, external validation was performed on Google's AudioSet Heartbeat Sound dataset, a collection rich in cardiovascular sounds. Comparative analysis revealed that DenseNet-201, ConvNeXt Large, and ResNet-152 deliver superior performance to the other architectures, achieving an accuracy of 81.50%, a precision of 85.50%, and an F1-score of 84.50%. We also performed statistical significance testing, such as the Wilcoxon signed-rank test, to validate the performance improvements over traditional classification methods. Beyond the technical contributions, the research underscores clinical integration, outlining a pathway by which the proposed system can augment conventional electronic stethoscopes and telemedicine platforms in AI-assisted diagnostic workflows. We also discuss in detail issues of computational efficiency, model interpretability, and ethical considerations, particularly algorithmic bias stemming from imbalanced datasets and the need for real-time processing in clinical settings. The study describes a scalable, automated system combining deep learning, spectrogram-based feature extraction, and external validation that can assist healthcare providers in the early and accurate detection of cardiovascular disease. AI-driven solutions can be viable in improving access, reducing delays in diagnosis, and ultimately easing the continued global burden of heart disease.
Keywords: healthy society; cardiovascular system; spectrogram; FastAI; audio signals; computer vision; neural network
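A minimal sketch of the spectrogram-plus-FastAI workflow described above. The file paths, class-folder layout, and the choice of a small ResNet-34 backbone (rather than the larger architectures compared in the paper) are assumptions:

```python
# Sketch: heart-sound WAV -> mel-spectrogram image -> FastAI vision learner.
from pathlib import Path
import librosa
import librosa.display
import matplotlib.pyplot as plt
from fastai.vision.all import ImageDataLoaders, vision_learner, resnet34, accuracy

def wav_to_spectrogram_png(wav_path, out_path):
    """Convert one heart-sound recording to a mel-spectrogram image."""
    y, sr = librosa.load(wav_path, sr=None)
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=128)
    fig, ax = plt.subplots(figsize=(3, 3))
    librosa.display.specshow(librosa.power_to_db(mel, ref=1.0), sr=sr, ax=ax)
    ax.axis("off")
    fig.savefig(out_path, bbox_inches="tight", pad_inches=0)
    plt.close(fig)

# Assumed layout: spectrograms/<class_name>/<file>.png (e.g. normal/, murmur/).
data_dir = Path("spectrograms")
dls = ImageDataLoaders.from_folder(data_dir, valid_pct=0.2, seed=0)

learn = vision_learner(dls, resnet34, metrics=accuracy)   # FastAI vision learner
learn.fine_tune(5)                                        # transfer learning
```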
12. Unveiling CyberFortis: A Unified Security Framework for IIoT-SCADA Systems with SiamDQN-AE FusionNet and PopHydra Optimizer
Authors: Kuncham Sreenivasa Rao, Rajitha Kotoju, B. Ramana Reddy, Taher Al-Shehari, Nasser A. Alsadhan, Subhav Singh, Shitharth Selvarajan. Computers, Materials & Continua, 2025, Issue 10, pp. 1899–1916 (18 pages)
Protecting Supervisory Control and Data Acquisition-Industrial Internet of Things (SCADA-IIoT) systems against intruders has become essential, since industrial control systems now oversee critical infrastructure and cyber attackers increasingly target these systems. Because they connect physical assets with digital networks, SCADA-IIoT systems face substantial risks from multiple attack types, including Distributed Denial of Service (DDoS), spoofing, and more advanced intrusion methods. Previous research in this field faces challenges due to insufficient solutions, as current intrusion detection systems lack the accuracy, scalability, and adaptability needed for IIoT environments. This paper introduces CyberFortis, a novel cybersecurity framework aimed at detecting and preventing cyber threats in SCADA-IIoT systems. CyberFortis presents two key innovations: first, the Siamese Double Deep Q-Network with Autoencoders (SiamDQN-AE) FusionNet, which enhances intrusion detection by combining deep Q-networks with autoencoders for improved attack detection and feature extraction; and second, the PopHydra Optimizer, an innovative method for computing reinforcement-learning discount factors for better model performance and convergence. This combination of Siamese deep Q-networks with autoencoders creates a system that can detect different types of attacks more effectively and adapt to new challenges. CyberFortis outperforms current state-of-the-art attack detection systems, showing higher scores on key metrics such as accuracy, precision, recall, and F1-score, based on the CICIoT 2023, UNSW-NB 15, and WUSTL-IIoT datasets. Results from the proposed framework show a 97.5% accuracy rate, indicating its potential as an effective solution for SCADA-IIoT cybersecurity against emerging threats. The research confirms that the proposed security and resilience methods are successful in protecting vital industrial control systems within their operational environments.
Keywords: Industrial Internet of Things (IIoT); SCADA systems; security; intrusion detection system (IDS); optimization; deep learning
13. Blockchain-Based Secured IPFS-Enable Event Storage Technique With Authentication Protocol in VANET (Cited by: 6)
Authors: Sanjeev Kumar Dwivedi, Ruhul Amin, Satyanarayana Vollala. IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2021, Issue 12, pp. 1913–1922 (10 pages)
In recent decades, intelligent transportation systems (ITS) have improved drivers' safety and have shared information (such as traffic congestion and accidents) in a very efficient way. However, the privacy of vehicles and the security of event information are major concerns. The problem of securely sharing event information without compromising the trusted third party (TTP) and data storage is the main issue in ITS. Blockchain technologies can resolve this problem. A blockchain-based protocol for secure sharing of events and authentication of vehicles has been published previously; that protocol addresses the issue of safely storing event information, but the authentication of vehicles depends solely on the cloud server, so the scheme follows a partially decentralized architecture. This paper proposes a novel decentralized architecture for the vehicular ad-hoc network (VANET) without the cloud server. This work also presents a protocol for securing event information and authenticating vehicles using the blockchain mechanism. In this protocol, a registered user accesses the event information securely from the InterPlanetary File System (IPFS). We incorporate IPFS along with blockchain to store the information in a fully distributed manner. The proposed protocol is compared with the state of the art; the comparison shows desirable security at a reasonable cost. The evaluation of the proposed smart contract in terms of cost (gas) is also discussed.
Keywords: authentication; blockchain; InterPlanetary File System (IPFS); secure information sharing; security
14. Enabling Energy Efficient Sensory Data Collection Using Multiple Mobile Sink (Cited by: 3)
Authors: Madhumathy P, Sivakumar D. China Communications (SCIE, CSCD), 2014, Issue 10, pp. 29–37 (9 pages)
Supporting a mobile sink is a challenging task for wireless sensor networks (WSNs). In this paper, we propose an efficient routing protocol for data gathering in WSNs with a single mobile sink or multiple mobile sinks. In this process, a biased random walk method is used to determine the next position of the sink. Then, rendezvous point selection with a splitting tree technique is used to find the optimal data transmission path. If the sink moves within the range of the rendezvous point, it receives the gathered data; if it moves out of range, it selects a relay node from its neighbours to relay packets from the rendezvous point to the sink. The proposed algorithm reduces the signalling overhead and mitigates the triangular routing problem. Here the sink acts as a vehicle and collects the data from the sensors. The results show that the proposed model effectively supports sink mobility with low overhead and delay compared with the Intelligent Agent-based Routing (IAR) protocol, and it also increases the reliability and delivery ratio as the number of sources increases.
Keywords: sink mobility; data gathering; rendezvous point; biased random walk; wireless sensor network
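A minimal sketch of a biased random walk for choosing the mobile sink's next position, of the kind the abstract describes. The field size, step length, bias strength, and the use of a rendezvous point as the attraction target are assumptions, not the paper's exact rule:

```python
# Sketch: biased random walk driving the mobile sink toward a rendezvous point.
import math
import random

def next_sink_position(current, target, field=(100.0, 100.0), step=5.0, bias=0.7):
    """With probability `bias`, step toward `target`; otherwise take a uniformly
    random step. The result is clamped to the deployment field."""
    x, y = current
    if random.random() < bias:
        angle = math.atan2(target[1] - y, target[0] - x)   # biased direction
    else:
        angle = random.uniform(0.0, 2.0 * math.pi)         # pure random walk
    nx = min(max(x + step * math.cos(angle), 0.0), field[0])
    ny = min(max(y + step * math.sin(angle), 0.0), field[1])
    return (nx, ny)

# Toy usage: walk the sink for 20 steps toward a rendezvous point at (60, 40).
pos, rendezvous = (10.0, 10.0), (60.0, 40.0)
for _ in range(20):
    pos = next_sink_position(pos, rendezvous)
print("final sink position:", pos)
```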
15. Hybrid XGBoost model with hyperparameter tuning for prediction of liver disease with better accuracy (Cited by: 2)
Authors: Surjeet Dalal, Edeh Michael Onyema, Amit Malik. World Journal of Gastroenterology (SCIE, CAS), 2022, Issue 46, pp. 6551–6563 (13 pages)
BACKGROUND: Liver disease indicates any pathology that can harm or destroy the liver or prevent it from functioning normally. The global community has recently witnessed an increase in the mortality rate due to liver disease. This could be attributed to many factors, among which are human habits, awareness issues, poor healthcare, and late detection. To curb the growing threat from liver disease, early detection is critical to help reduce the risks and improve treatment outcomes. Emerging technologies such as machine learning, as shown in this study, could be deployed to assist in enhancing its prediction and treatment. AIM: To present a more efficient system for timely prediction of liver disease using a hybrid eXtreme Gradient Boosting model with hyperparameter tuning, with a view to assisting in early detection, diagnosis, and reduction of the risks and mortality associated with the disease. METHODS: The dataset used in this study consisted of 416 people with liver problems and 167 with no such history. The data were collected from the state of Andhra Pradesh, India, through https://www.kaggle.com/datasets/uciml/indian-liver-patientrecords. The population was divided into two sets depending on the disease state of the patient. This binary information was recorded in the attribute "is_patient". RESULTS: The results indicated that the chi-square automated interaction detection and classification and regression tree models achieved accuracy levels of 71.36% and 73.24%, respectively, which was much better than the conventional method. The proposed solution would assist patients and physicians in tackling the problem of liver disease and ensuring that cases are detected early to prevent the disease from developing into cirrhosis (scarring) and to enhance the survival of patients. The study showed the potential of machine learning in healthcare, especially as it concerns disease prediction and monitoring. CONCLUSION: This study contributed to the knowledge of machine learning application to health and to the efforts toward combating the problem of liver disease. However, relevant authorities have to invest more in machine learning research and other health technologies to maximize their potential.
Keywords: liver infection; machine learning; chi-square automated interaction detection; classification and regression trees; decision tree; XGBoost; hyperparameter tuning
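A minimal sketch of XGBoost with hyperparameter tuning for a liver-disease task like the one described. The CSV file name, column names, and the small search grid are assumptions based on the public Indian Liver Patient Records dataset, not the paper's exact setup:

```python
# Sketch: XGBoost classifier tuned with GridSearchCV on the liver-patient dataset.
import pandas as pd
from sklearn.model_selection import GridSearchCV, train_test_split
from xgboost import XGBClassifier

# Assumed local copy of the Kaggle "Indian Liver Patient Records" CSV.
df = pd.read_csv("indian_liver_patient.csv").dropna()
df["Gender"] = (df["Gender"] == "Male").astype(int)          # assumed column name
y = (df["Dataset"] == 1).astype(int)                          # assumed label column
X = df.drop(columns=["Dataset"])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                           random_state=42, stratify=y)

param_grid = {
    "n_estimators": [100, 300],
    "max_depth": [3, 5, 7],
    "learning_rate": [0.05, 0.1],
}
search = GridSearchCV(XGBClassifier(eval_metric="logloss"), param_grid,
                      scoring="accuracy", cv=5)
search.fit(X_tr, y_tr)

print("best params:", search.best_params_)
print("test accuracy:", search.best_estimator_.score(X_te, y_te))
```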
16. Enhanced Clustering Based OSN Privacy Preservation to Ensure k-Anonymity, t-Closeness, l-Diversity, and Balanced Privacy Utility (Cited by: 3)
Authors: Rupali Gangarde, Amit Sharma, Ambika Pawar. Computers, Materials & Continua (SCIE, EI), 2023, Issue 4, pp. 2171–2190 (20 pages)
Online Social Network (OSN) sites allow end-users to share a great deal of information, which may also contain sensitive information that may be subject to commercial or non-commercial privacy attacks. As a result, guaranteeing various levels of privacy is critical when OSNs publish data. Clustering-based solutions have proved an effective mechanism for achieving the privacy notions in OSNs, but fixed clustering limits performance and scalability. Data utility degrades with increased privacy, so balancing the privacy-utility trade-off is an open research issue. This research proposes a novel privacy preservation model using an enhanced clustering mechanism to overcome this issue. The proposed model includes phases such as pre-processing, enhanced clustering, and ensuring privacy preservation. The enhanced clustering algorithm is the second phase, where the authors modified the existing fixed k-means clustering using a threshold approach. The threshold value is determined based on the supplied OSN data of edges, nodes, and user attributes. Clusters are k-anonymized with multiple graph properties by a novel one-pass algorithm. After achieving k-anonymity of the clusters, optimization was performed to achieve all privacy models, namely k-anonymity, t-closeness, and l-diversity. The proposed privacy framework achieves privacy for all three network components, i.e., links, nodes, and user attributes, with improved utility. The authors compare the proposed technique to underlying methods using the OSN Yelp and Facebook datasets. The proposed approach outperformed the underlying state-of-the-art methods in terms of degree of anonymization, computational efficiency, and information loss.
Keywords: enhanced clustering; online social network; k-anonymity; t-closeness; l-diversity; privacy preservation
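A minimal sketch of clustering-based k-anonymization in the spirit of the approach above: records are clustered, and undersized clusters are merged into their nearest neighbour so that every published group holds at least k users. The plain k-means step and the centroid-distance merge rule are simplifications, not the paper's threshold-based algorithm:

```python
# Sketch: enforce a minimum group size of k on top of k-means clusters.
import numpy as np
from sklearn.cluster import KMeans

def k_anonymous_clusters(X, n_clusters=8, k=5):
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
    labels, centers = km.labels_.copy(), km.cluster_centers_
    while True:
        sizes = np.bincount(labels, minlength=n_clusters)
        small = np.where((sizes > 0) & (sizes < k))[0]
        if small.size == 0:
            break                                   # every group already has >= k members
        c = small[0]
        others = [o for o in range(n_clusters) if o != c and sizes[o] > 0]
        if not others:                              # only one undersized group remains
            break
        nearest = min(others, key=lambda o: np.linalg.norm(centers[c] - centers[o]))
        labels[labels == c] = nearest               # merge the undersized cluster
    return labels

# Toy usage: 200 users described by 4 numeric attributes.
X = np.random.rand(200, 4)
labels = k_anonymous_clusters(X, n_clusters=10, k=8)
sizes = np.bincount(labels)
print("smallest published group size:", sizes[sizes > 0].min())
```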
17. An Attention Based Neural Architecture for Arrhythmia Detection and Classification from ECG Signals (Cited by: 2)
Authors: Nimmala Mangathayaru, Padmaja Rani, Vinjamuri Janaki, Kalyanapu Srinivas, B. Mathura Bai, G. Sai Mohan, B. Lalith Bharadwaj. Computers, Materials & Continua (SCIE, EI), 2021, Issue 11, pp. 2425–2443 (19 pages)
Arrhythmia is ubiquitous worldwide, and cardiologists tend to provide solutions drawing on recent advancements in medicine. Detecting arrhythmia from ECG signals is considered a standard approach; hence, automating this process would aid diagnosis by providing fast, cost-efficient, and accurate solutions at scale. This is executed by extracting definite properties from the individual patterns collected from electrocardiography (ECG) signals exhibiting arrhythmia. In this era of applied intelligence, automated detection and diagnostic solutions are widely used for their spontaneous and robust solutions. In this research, our contributions are two-fold. First, the Dual-Tree Complex Wavelet Transform (DT-CWT) is applied to handle shift invariance and aid signal reconstruction for extracting significant features. Next, a neural attention mechanism is applied to capture temporal patterns from the extracted features of the ECG signal to discriminate distinct classes of arrhythmia; it is trained end-to-end with the finest parameters. To ensure the model's generalizability, a set of five train-test variants is used. The proposed model attains the highest accuracy of 98.5% for classifying 8 variants of arrhythmia on the MIT-BIH dataset. To test the resilience of the model, the unseen (test) samples are increased by 5x, and the deviations in accuracy score and MSE were 0.12% and 0.1%, respectively. Further, to assess diagnostic model performance, AUC-ROC curves are plotted. At every test level, the proposed model is capable of generalizing to new samples, which can be leveraged to develop a real-world application. As a note, this research is the first attempt to provide neural attention in arrhythmia classification using MIT-BIH ECG signal data with state-of-the-art performance.
Keywords: arrhythmia classification; arrhythmia detection; MIT-BIH dataset; dual-tree complex wavelet transform; ECG classification; neural attention; neural networks; deep learning
18. Fuzzy least brain storm optimization and entropy-based Euclidean distance for multimodal vein-based recognition system (Cited by: 1)
Authors: Dipti Verma, Sipi Dubey. Journal of Central South University (SCIE, EI, CAS, CSCD), 2017, Issue 10, pp. 2360–2371 (12 pages)
Nowadays, vein-based recognition has become an emerging and facilitating biometric technology in recognition systems. Vein recognition exploits different modalities, such as finger, palm, and hand images, for person identification. In this work, fuzzy least brain storm optimization and an entropy-based Euclidean distance (EED) are proposed for a vein-based recognition system. Initially, the input image is fed into region of interest (ROI) extraction, which obtains the appropriate image for the subsequent step. Then, the features (the vein pattern) are extracted by image enlightening, a circular averaging filter, and holoentropy-based thresholding. After the features are obtained, the entropy-based Euclidean distance is used to fuse the features by score-level fusion with a weighted score value. Finally, the optimal matching score is computed iteratively by the newly developed fuzzy least brain storm optimization (FLBSO) algorithm. The novel algorithm is developed from the least mean square (LMS) algorithm and fuzzy brain storm optimization (FBSO). The experimental results are evaluated, and the performance is compared with existing systems using the false acceptance rate (FAR), false rejection rate (FRR), and accuracy. The proposed algorithm attains a higher accuracy of 89.9%, which ensures a better recognition rate.
Keywords: multimodality; brain storm optimization (BSO); least mean square (LMS); score level fusion; recognition
19. S&P BSE Sensex and S&P BSE IT return forecasting using ARIMA (Cited by: 5)
Authors: Madhavi Latha Challa, Venkataramanaiah Malepati, Siva Nageswara Rao Kolusu. Financial Innovation, 2020, Issue 1, pp. 793–811 (19 pages)
This study forecasts the return and volatility dynamics of the S&P BSE Sensex and S&P BSE IT indices of the Bombay Stock Exchange. To achieve these objectives, the study uses descriptive statistics; tests including the variance ratio, Augmented Dickey-Fuller, Phillips-Perron, and Kwiatkowski-Phillips-Schmidt-Shin tests; and the Autoregressive Integrated Moving Average (ARIMA) model. The analysis forecasts daily stock returns for the S&P BSE Sensex and S&P BSE IT time series using the ARIMA model. The results reveal that the mean returns of both indices are positive but near zero, which is indicative of a regressive tendency in the long term. The forecasted values of the S&P BSE Sensex and S&P BSE IT are almost equal to their actual values, with few deviations. Hence, the ARIMA model is capable of predicting medium- or long-term horizons using historical values of the S&P BSE Sensex and S&P BSE IT.
Keywords: efficient market hypothesis; Bombay Stock Exchange; ARIMA; KPSS; S&P BSE Sensex; forecasting; S&P BSE IT
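A minimal sketch of the ARIMA workflow described above: test a daily return series for stationarity, fit an ARIMA model, and forecast. The CSV name, column name, and the (1, 0, 1) order are assumptions, not the study's fitted specification:

```python
# Sketch: stationarity test + ARIMA fit + forecast with statsmodels.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.stattools import adfuller

# Daily closing prices of an index (assumed local file with 'Date' and 'Close' columns).
prices = pd.read_csv("sensex_daily.csv", parse_dates=["Date"], index_col="Date")["Close"]
returns = prices.pct_change().dropna()

# Augmented Dickey-Fuller test: a small p-value suggests the return series is stationary.
adf_stat, p_value, *_ = adfuller(returns)
print(f"ADF statistic = {adf_stat:.3f}, p-value = {p_value:.4f}")

# Fit ARIMA on the returns and forecast the next 10 trading days.
model = ARIMA(returns, order=(1, 0, 1)).fit()
print(model.summary().tables[0])
print(model.forecast(steps=10))
```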
20. High Throughput Scheduling Algorithms for Input Queued Packet Switches (Cited by: 4)
Authors: R. Chithra Devi, D. Jemi Florinabel, Narayanan Prasanth. Computers, Materials & Continua (SCIE, EI), 2022, Issue 1, pp. 1527–1540 (14 pages)
The high-performance computing paradigm needs high-speed switching fabrics to carry the heavy traffic generated by its applications, and these switching fabrics are efficiently driven by the deployed scheduling algorithms. In this paper, we propose two scheduling algorithms for input-queued switches whose operations are based on ranking procedures. First, we propose a Simple 2-Bit (S2B) scheme, which uses a binary ranking procedure and queue size for scheduling packets. Here, the Virtual Output Queue (VOQ) set with the maximum number of empty queues receives a higher rank than the other VOQ sets. Through simulation, we show that S2B has better throughput performance than Highest Ranking First (HRF) arbitration under uniform and non-uniform traffic patterns. To further improve the throughput-delay performance, an Enhanced 2-Bit (E2B) approach is proposed. This approach adopts an integer representation for the rank, namely the number of empty queues in a VOQ set. The simulation results show that E2B outperforms the S2B and HRF scheduling algorithms with maximum throughput-delay performance. Furthermore, the algorithms are simulated under hotspot traffic, and E2B proves to be more efficient.
Keywords: crossbar switch; input queued switch; virtual output queue; scheduling algorithm; high performance computing
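A minimal sketch of the E2B-style ranking idea described above: each input port keeps one VOQ per output, an input's rank is its number of empty VOQs, and each output grants the requesting input with the highest rank. The tie-break rule and the single-iteration matching are simplifications/assumptions:

```python
# Sketch: one scheduling round for an input-queued switch using E2B-style ranks.
from collections import deque

def e2b_schedule(voqs):
    """voqs[i][j] is the queue at input i holding packets destined for output j.
    Returns a dict {output_j: input_i} describing one crossbar configuration."""
    n_inputs, n_outputs = len(voqs), len(voqs[0])
    rank = [sum(1 for q in voqs[i] if not q) for i in range(n_inputs)]  # empty VOQs
    grants, matched_inputs = {}, set()
    for j in range(n_outputs):
        requesters = [i for i in range(n_inputs)
                      if voqs[i][j] and i not in matched_inputs]
        if requesters:
            winner = max(requesters, key=lambda i: (rank[i], -i))  # highest rank wins
            grants[j] = winner
            matched_inputs.add(winner)
    return grants

# Toy usage: 3x3 switch with a few queued packets (payloads are just labels).
voqs = [[deque() for _ in range(3)] for _ in range(3)]
voqs[0][1].append("p0"); voqs[1][1].append("p1"); voqs[1][2].append("p2")
voqs[2][0].append("p3")
print(e2b_schedule(voqs))   # -> {0: 2, 1: 0, 2: 1} for this toy example
```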