Journal articles: 416 results found
1. Lightweight Ontology Architecture for QoS Aware Service Discovery and Semantic CoAP Data Management in Heterogeneous IoT Environment
Authors: Suman Sukhavasi, Thinagaran Perumal, Norwati Mustapha, Razali Yaakob. Computers, Materials & Continua, 2026, Issue 5, pp. 1492-1523 (32 pages).
The Internet of Things (IoT) ecosystem is inherently heterogeneous, comprising diverse devices that must interoperate seamlessly to enable federated message and data exchange. However, as the number of service requests grows, existing approaches suffer from increased discovery time and degraded Quality of Service (QoS). Moreover, the massive data generated by heterogeneous IoT devices often contain redundancy and noise, posing challenges to efficient data management. To address these issues, this paper proposes a lightweight ontology-based architecture that enhances service discovery and QoS-aware semantic data management. The architecture employs Modified-Ordered Points to Identify the Clustering Structure (M-OPTICS) to cluster and eliminate redundant IoT data. The clustered data are then modelled into a lightweight ontology, enabling semantic relationship inference and rule generation through an embedded inference engine. User requests, transmitted via the Constrained Application Protocol (CoAP), are semantically enriched and matched to QoS parameters using Dynamic Shannon Entropy optimized with the Salp Swarm Algorithm. Semantic matching is further refined using a bidirectional recurrent neural network (Bi-RNN), while a State-Action-Reward-State-Action (SARSA) reinforcement learning model dynamically defines and updates semantic rules to retrieve the most recent and relevant data across heterogeneous devices. Experimental results demonstrate that the proposed architecture outperforms existing methods in terms of response time, service delay, execution time, precision, recall, and F-score under varying CoAP request loads and communication overheads. The results confirm the effectiveness of the proposed lightweight ontology architecture for service discovery and data management in heterogeneous IoT environments.
Keywords: CoAP protocol; Internet of Things; interoperability; lightweight ontology; service discovery
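The redundancy-elimination step described above can be illustrated with a much simpler density-style filter. This is a hypothetical stand-in for the paper's M-OPTICS clustering, not the actual algorithm: readings closer than a radius `eps` to something already kept are treated as redundant.

```python
import math

def deduplicate_readings(readings, eps=0.5):
    """Greedy density-based de-duplication: keep a reading only if it is
    farther than `eps` (Euclidean distance) from every reading kept so far.
    A simplified illustration of the redundancy-elimination idea, not the
    paper's M-OPTICS algorithm."""
    kept = []
    for r in readings:
        if all(math.dist(r, k) > eps for k in kept):
            kept.append(r)
    return kept
```

For example, two near-identical (temperature, humidity) readings collapse to one kept point, while a distinct reading survives.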
2. HCL Net: Deep Learning for Accurate Classification of Honeycombing Lung and Ground Glass Opacity in CT Images
Authors: Hairul Aysa Abdul Halim Sithiq, Liyana Shuib, Muneer Ahmad, Chermaine Deepa Antony. Computers, Materials & Continua, 2026, Issue 1, pp. 999-1023 (25 pages).
Honeycombing Lung (HCL) is a chronic lung condition marked by advanced fibrosis, resulting in enlarged air spaces with thick fibrotic walls, which are visible on Computed Tomography (CT) scans. Differentiating between normal lung tissue, honeycombing lungs, and Ground Glass Opacity (GGO) in CT images is often challenging for radiologists and may lead to misinterpretations. Although earlier studies have proposed models to detect and classify HCL, many faced limitations such as high computational demands, lower accuracy, and difficulty distinguishing between HCL and GGO. CT images are highly effective for lung classification due to their high resolution, 3D visualization, and sensitivity to tissue density variations. This study introduces Honeycombing Lungs Network (HCL Net), a novel classification algorithm inspired by ResNet50V2 and enhanced to overcome the shortcomings of previous approaches. HCL Net incorporates additional residual blocks, refined preprocessing techniques, and selective parameter tuning to improve classification performance. The dataset, sourced from the University Malaya Medical Centre (UMMC) and verified by expert radiologists, consists of CT images of normal, honeycombing, and GGO lungs. Experimental evaluations across five assessments demonstrated that HCL Net achieved an outstanding classification accuracy of approximately 99.97%. It also recorded strong performance on other metrics, achieving 93% precision, 100% sensitivity, 89% specificity, and an AUC-ROC score of 97%. Comparative analysis with baseline feature engineering methods confirmed the superior efficacy of HCL Net. The model significantly reduces misclassification, particularly between honeycombing and GGO lungs, enhancing diagnostic precision and reliability in lung image analysis.
Keywords: deep learning; honeycombing lung; ground glass opacity; ResNet50V2; multiclass classification
3. Quantum Secure Multiparty Computation: Bridging Privacy, Security, and Scalability in the Post-Quantum Era
Authors: Sghaier Guizani, Tehseen Mazhar, Habib Hamam. Computers, Materials & Continua, 2026, Issue 4, pp. 1-25 (25 pages).
The advent of quantum computing poses a significant challenge to traditional cryptographic protocols, particularly those used in Secure Multiparty Computation (MPC), a fundamental cryptographic primitive for privacy-preserving computation. Classical MPC relies on cryptographic techniques such as homomorphic encryption, secret sharing, and oblivious transfer, which may become vulnerable in the post-quantum era due to the computational power of quantum adversaries. This study presents a review of 140 peer-reviewed articles published between 2000 and 2025, drawn from databases including MDPI, IEEE Xplore, Springer, and Elsevier, examining the applications, types, and security issues of quantum computing, along with solutions, in different fields. The review explores the impact of quantum computing on MPC security, assesses emerging quantum-resistant MPC protocols, and examines hybrid classical-quantum approaches aimed at mitigating quantum threats. We analyze the role of Quantum Key Distribution (QKD), post-quantum cryptography (PQC), and quantum homomorphic encryption in securing multiparty computations. Additionally, we discuss the challenges of scalability, computational efficiency, and practical deployment of quantum-secure MPC frameworks in real-world applications such as privacy-preserving AI, secure blockchain transactions, and confidential data analysis. This review provides insights into future research directions and open challenges in ensuring secure, scalable, and quantum-resistant multiparty computation.
Keywords: quantum computing; secure multiparty computation (MPC); post-quantum cryptography (PQC); quantum key distribution (QKD); privacy-preserving computation; quantum homomorphic encryption; quantum network security; federated learning; blockchain security; quantum cryptography
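The secret-sharing primitive that classical MPC builds on can be sketched in a few lines. This is the textbook additive-sharing scheme over a prime field, shown only to illustrate the primitive the review discusses; it is not a protocol from any of the reviewed articles, and the modulus choice is an assumption.

```python
import secrets

P = 2**61 - 1  # a Mersenne prime modulus (illustrative choice)

def share(value, n):
    """Split `value` into n additive shares modulo P; any n-1 shares
    together reveal nothing about the value."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares):
    """Recombine all shares to recover the secret."""
    return sum(shares) % P

def secure_sum(all_share_vectors):
    """Sum several secrets without revealing any individual one:
    combine the column-wise partial sums of the parties' shares."""
    partials = [sum(col) % P for col in zip(*all_share_vectors)]
    return reconstruct(partials)
```

Each party holds one share per secret; only the aggregated sum is ever reconstructed.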
4. Security and Privacy Challenges, Solutions, and Performance Evaluation in AIoT-Enabled Smart Societies
Authors: Shahab Ali Khan, Tehseen Mazhar, Syed Faisal Abbas Shah, Wasim Ahmad, Sunawar Khan, Afsha BiBi, Usama Shah, Habib Hamam. Computer Modeling in Engineering & Sciences, 2026, Issue 3, pp. 179-217 (39 pages).
The convergence of Artificial Intelligence (AI) and the Internet of Things (IoT) has enabled Artificial Intelligence of Things (AIoT) systems that support intelligent and responsive smart societies, but it also introduces major security and privacy concerns across domains such as healthcare, transportation, and smart cities. This Systematic Literature Review (SLR) addresses three research questions: identifying major threats and challenges in AIoT ecosystems, reviewing state-of-the-art security and privacy techniques, and evaluating their effectiveness. An SLR covering the period from 2020 to 2025 was conducted using major academic digital libraries, including IEEE Xplore, ACM Digital Library, ScienceDirect, SpringerLink, and Wiley Online Library, with a focus on security- and privacy-enhancing techniques such as blockchain, federated learning, and edge AI. The SLR identifies key challenges including data privacy leakage, authentication, cloud dependency, and attack surface expansion, and finds that emerging techniques, while promising, often involve trade-offs related to latency, scalability, and compliance. The study highlights future directions including lightweight cryptography, standardization, and explainable AI to support secure and trustworthy AIoT-enabled smart societies.
Keywords: Artificial Intelligence of Things (AIoT); smart societies; security; privacy; blockchain; federated learning; edge computing
5. Big Data-Driven Federated Learning Model for Scalable and Privacy-Preserving Cyber Threat Detection in IoT-Enabled Healthcare Systems
Authors: Noura Mohammed Alaskar, Muzammil Hussain, Saif Jasim Almheiri, Atta-ur-Rahman, Adnan Khan, Khan M. Adnan. Computers, Materials & Continua, 2026, Issue 4, pp. 793-816 (24 pages).
The increasing number of interconnected devices and the incorporation of smart technology into contemporary healthcare systems have significantly raised the attack surface for cyber threats. The early detection of threats is both necessary and complex, yet these interconnected healthcare settings generate enormous amounts of heterogeneous data. Traditional Intrusion Detection Systems (IDS), which are generally centralized and machine learning-based, often fail to address the rapidly changing nature of cyberattacks and are challenged by ethical concerns related to patient data privacy. Moreover, traditional AI-driven IDS usually face challenges in handling large-scale, heterogeneous healthcare data while ensuring data privacy and operational efficiency. To address these issues, emerging technologies such as Big Data Analytics (BDA) and Federated Learning (FL) provide a hybrid framework for scalable, adaptive intrusion detection in IoT-driven healthcare systems. Big data techniques enable processing of large-scale, high-dimensional healthcare data, and FL can be used to train a model in a decentralized manner without transferring raw data, thereby maintaining privacy between institutions. This research proposes a privacy-preserving Federated Learning-based model that efficiently detects cyber threats in connected healthcare systems while ensuring distributed big data processing, privacy, and compliance with ethical regulations. To strengthen the reliability of the reported findings, the results were validated using cross-dataset testing and 95% confidence intervals derived from bootstrap analysis, confirming consistent performance across heterogeneous healthcare data distributions. The proposed global model achieves a test accuracy of 99.93% ± 0.03 (95% CI) and a miss-rate of only 0.07% ± 0.02, representing state-of-the-art performance in privacy-preserving intrusion detection. The proposed FL-driven IDS framework thus offers an efficient, privacy-preserving, and scalable solution for securing next-generation healthcare infrastructures by combining scalability, privacy, adaptability, early detection, and ethical data management.
Keywords: intrusion detection systems; cyber threat detection; explainable AI; big data analytics; federated learning
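The decentralized training idea behind FL can be sketched with plain federated averaging (FedAvg): each client trains locally, and only model weights, weighted by local dataset size, are combined. The weight vectors and sizes below are illustrative; the paper's actual model is not reproduced here.

```python
def fed_avg(client_weights, client_sizes):
    """Weighted federated averaging: combine per-client model weight
    vectors proportionally to local dataset size, so raw patient data
    never leaves the institution."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    avg = [0.0] * dim
    for weights, n in zip(client_weights, client_sizes):
        for i in range(dim):
            avg[i] += weights[i] * n / total
    return avg
```

A client with three times the data pulls the global model three times as hard toward its local solution.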
6. Federated Deep Learning in Intelligent Urban Ecosystems: A Systematic Review of Advancements and Applications in Smart Cities, Homes, Buildings, and Healthcare Systems
Authors: Muhammad Adnan Tariq, Sunawar Khan, Tehseen Mazhar, Tariq Shahzad, Sahar Arooj, Khmaies Ouahada, Muhammad Adnan Khan, Habib Hamam. Computer Modeling in Engineering & Sciences, 2026, Issue 3, pp. 218-267 (50 pages).
Contemporary smart cities, smart homes, smart buildings, and smart healthcare systems are the result of the explosive growth of Internet of Things (IoT) devices and deep learning. Yet centralized training paradigms face fundamental issues of data privacy, regulatory compliance, and ownership silos, alongside scaling limitations in real-life applications. Federated Deep Learning (FDL) is a privacy-by-design method that enables distributed training of machine learning models among distributed clients without sharing raw data, making it suitable for heterogeneous urban settings. This paper surveys privacy-preserving developments in FDL from 2018 to 2025, focusing on its usage in smart cities (traffic prediction, environmental monitoring, energy grids), smart homes/buildings/IoT (non-intrusive load monitoring, HVAC optimization, anomaly detection), and healthcare applications (medical imaging, Electronic Health Records (EHR) analysis, remote monitoring). It provides a coherent taxonomy, domain pipelines, comparative analyses of privacy mechanisms (differential privacy, secure aggregation, Homomorphic Encryption (HE), Trusted Execution Environments (TEEs), blockchain-enhanced approaches, and hybrids), system architectures, security/robustness defenses, deployment and Machine Learning Operations (MLOps) issues, and long-standing challenges (non-IID heterogeneity, communication efficiency, fairness, and sustainability). Contributions include structured comparisons of privacy threats, practical design advice for urban deployments, identification of open problems, and a research roadmap to 2035. The paper highlights the transformative value of FDL in building credible, scalable, and sustainable intelligent urban ecosystems, and the need for further interdisciplinary research in standardization, real-world testbeds, and ethical governance.
Keywords: federated deep learning (FDL); privacy-preserving AI; smart cities; smart homes/buildings; federated healthcare; intelligent urban ecosystems; IoT
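Among the privacy mechanisms this review compares, differential privacy is the simplest to sketch. Below is the standard Laplace mechanism (inverse-CDF sampling), a textbook example rather than any surveyed system's implementation; the query value and parameters in the usage are invented.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random):
    """Release `true_value` with Laplace noise of scale sensitivity/epsilon,
    the classic mechanism achieving epsilon-differential privacy for a
    numeric query."""
    scale = sensitivity / epsilon
    # Inverse-CDF sampling of Laplace(0, scale) from a uniform draw.
    u = rng.random() - 0.5
    noise = -scale * (1 if u >= 0 else -1) * math.log(1 - 2 * abs(u))
    return true_value + noise
```

Averaged over many releases the noise cancels, while any single release hides an individual's contribution.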
7. Optimizing the cyber-physical intelligent transportation system network using enhanced models for data routing and task scheduling
Authors: Srinivasa Gowda G.K, Hayder M.A. Ghanimi, Sudhakar Sengan, Kolla Bhanu Prakash, Meshal Alharbi, Roobaea Alroobaea, Sultan Algarni, Abdullah M. Baqasah. Digital Communications and Networks, 2026, Issue 1, pp. 210-222 (13 pages).
Advanced technologies like Cyber-Physical Systems (CPS) and the Internet of Things (IoT) have supported modernizing and automating the transportation sector through the introduction of Intelligent Transportation Systems (ITS). Integrating CPS-ITS and IoT provides real-time Vehicle-to-Infrastructure (V2I) communication, supporting better traffic management, safety, and efficiency. These technological innovations also generate complex problems, particularly in data routing and Task Scheduling (TS) for ITS. Earlier attempts to solve these problems relied primarily on traditional and experimental methods, and the solutions were largely unsuccessful due to the dynamic nature of ITS. Machine Learning (ML) and Swarm Intelligence (SI) have significantly impacted how these challenges are addressed; along this line, this paper presents a novel method for TS and data routing in CPS-ITS. For data transmission, it proposes a cutting-edge ML algorithm, Gated Linear Unit-approximated Reinforcement Learning (GLRL). Greedy Iterative-Particle Swarm Optimization (GI-PSO) is recommended as a development of Particle Swarm Optimization (PSO) for TS. The primary objective of this study is to enhance the security and effectiveness of ITS systems that utilize CPS-ITS. The models were trained and validated using a network simulation dataset of 50 nodes from numerous ITS environments. The experiments demonstrate that the proposed GLRL reduces End-to-End Delay (EED) by 12%, enhances data size utilization from 83.6% to 88.6%, and achieves higher bandwidth allocation, particularly in high-demand scenarios such as multimedia data streams, where adherence improved to 98.15%. Furthermore, GLRL reduced Network Congestion (NC) by 5.5%, demonstrating its efficiency in managing complex traffic conditions across several environments. The model passed simulation tests in three different environments: urban (UE), suburban (SE), and rural (RE). It met high bandwidth requirements, made task scheduling more efficient, and increased network throughput (NT), proving it robust and flexible enough for scalable ITS applications. These innovations provide robust, scalable solutions for real-time traffic management, ultimately improving safety, reducing NC, and increasing overall NT, and helping ITS become more responsive, safe, and effective across UE, SE, and RE deployments.
Keywords: cyber-physical systems; Internet of Things; task scheduling optimization; gated linear unit; machine learning
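The abstract does not specify GI-PSO's internals, so as a hedged illustration of the greedy seeding idea behind greedy-iterative task scheduling, here is the classic longest-processing-time heuristic: sort tasks by cost and assign each to the currently least-loaded node, minimizing the makespan a later PSO refinement would start from. Task costs and node counts are invented.

```python
def greedy_schedule(task_costs, n_nodes):
    """Greedy makespan scheduling (LPT heuristic): place the longest
    remaining task on the least-loaded node. Returns the task->node
    assignment and the resulting makespan."""
    loads = [0.0] * n_nodes
    assignment = {}
    for task_id, cost in sorted(enumerate(task_costs), key=lambda t: -t[1]):
        node = min(range(n_nodes), key=lambda n: loads[n])
        loads[node] += cost
        assignment[task_id] = node
    return assignment, max(loads)
```

A swarm-based refinement would then perturb this assignment while keeping the best makespan found.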
8. Challenges and Limitations in Speech Recognition Technology: A Critical Review of Speech Signal Processing Algorithms, Tools and Systems (cited by 1)
Authors: Sneha Basak, Himanshi Agrawal, Shreya Jena, Shilpa Gite, Mrinal Bachute, Biswajeet Pradhan, Mazen Assiri. Computer Modeling in Engineering & Sciences (SCIE, EI), 2023, Issue 5, pp. 1053-1089 (37 pages).
Speech recognition systems have become a unique part of the human-computer interaction (HCI) family. Speech is one of the most naturally developed human abilities, and speech signal processing opens up a transparent, hands-free computation experience. This paper presents a retrospective yet modern approach to the world of speech recognition systems. The development journey of Automatic Speech Recognition (ASR) has seen quite a few milestones and breakthrough technologies, which are highlighted in this paper. A step-by-step rundown of the fundamental stages in developing speech recognition systems is presented, along with a brief discussion of various modern-day developments and applications in this domain. This review aims to summarize the field and provide a starting point for those entering the vast area of speech signal processing. Since speech recognition has vast potential in industries such as telecommunications, emotion recognition, and healthcare, this review will be helpful to researchers exploring further applications that society can quickly adopt in the coming years.
Keywords: speech recognition; automatic speech recognition (ASR); mel-frequency cepstral coefficients (MFCC); hidden Markov model (HMM); artificial neural network (ANN)
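The HMM decoding step at the core of the classic ASR pipelines this review covers can be sketched with a small Viterbi implementation. The states, observations, and probabilities in the usage example are toy values invented for illustration, not drawn from any real acoustic model.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most-likely hidden state path for an observation sequence:
    dynamic programming over (probability, predecessor) per state,
    then backtracking from the best final state."""
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for o in obs[1:]:
        V.append({})
        for s in states:
            prob, prev = max(
                (V[-2][p][0] * trans_p[p][s] * emit_p[s][o], p) for p in states
            )
            V[-1][s] = (prob, prev)
    last = max(states, key=lambda s: V[-1][s][0])
    path = [last]
    for t in range(len(V) - 1, 0, -1):
        path.append(V[t][path[-1]][1])
    return list(reversed(path))
```

In a real recognizer the states would be phoneme models and the observations MFCC frames.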
9. A Latency-Aware and Fault-Tolerant Framework for Resource Scheduling and Data Management in Fog-Enabled Smart City Transportation Systems
Authors: Ibrar Afzal, Noor ul Amin, Zulfiqar Ahmad, Abdulmohsen Algarni. Computers, Materials & Continua (SCIE, EI), 2025, Issue 1, pp. 1377-1399 (23 pages).
The deployment of the Internet of Things (IoT) with smart sensors has facilitated the emergence of fog computing as an important technology for delivering services to smart environments such as campuses, smart cities, and smart transportation systems. Fog computing tackles a range of challenges, including processing, storage, bandwidth, latency, and reliability, by locally distributing secure information through end nodes. Consisting of endpoints, fog nodes, and back-end cloud infrastructure, it provides advanced capabilities beyond traditional cloud computing. In smart environments, particularly within smart city transportation systems, the abundance of devices and nodes poses significant challenges related to power consumption and system reliability. To address the challenges of latency, energy consumption, and fault tolerance in these environments, this paper proposes a latency-aware, fault-tolerant framework for resource scheduling and data management, referred to as the FORD framework, for smart cities in fog environments. This framework is designed to meet the demands of time-sensitive applications, such as those in smart transportation systems. The FORD framework incorporates latency-aware resource scheduling to optimize task execution in smart city environments, leveraging resources from both fog and cloud environments. Through simulation-based executions, tasks are allocated to the nearest available nodes with minimum latency. In the event of execution failure, a fault-tolerant mechanism is employed to ensure the successful completion of tasks. Upon successful execution, data is efficiently stored in the cloud data center, ensuring data integrity and reliability within the smart city ecosystem.
Keywords: fog computing; smart cities; smart transportation; data management; fault tolerance; resource scheduling
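The "nearest available node, with fault-tolerant fallback" behavior described in the abstract might be sketched as follows. The node dictionary fields, the failure model, and the function names are assumptions for illustration, not details taken from the FORD paper.

```python
def schedule_task(task, nodes, execute):
    """Pick the lowest-latency available node for `task`; if execution
    fails there, fall back to the next-best node, and so on. Raises if
    every node fails (illustrative sketch of latency-aware,
    fault-tolerant scheduling)."""
    for node in sorted((n for n in nodes if n["available"]),
                       key=lambda n: n["latency_ms"]):
        try:
            return node["name"], execute(node, task)
        except RuntimeError:
            continue  # fault-tolerant fallback: try the next node
    raise RuntimeError("no node could execute the task")
```

Fog nodes (low latency) are tried before the cloud (high latency), matching the fog-first allocation the abstract describes.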
10. Optimizing CNN Architectures for Face Liveness Detection: Performance, Efficiency, and Generalization across Datasets (cited by 1)
Authors: Smita Khairnar, Shilpa Gite, Biswajeet Pradhan, Sudeep D. Thepade, Abdullah Alamri. Computer Modeling in Engineering & Sciences, 2025, Issue 6, pp. 3677-3707 (31 pages).
Face liveness detection is essential for securing biometric authentication systems against spoofing attacks, including printed photos, replay videos, and 3D masks. This study systematically evaluates pre-trained CNN models (DenseNet201, VGG16, InceptionV3, ResNet50, VGG19, MobileNetV2, Xception, and InceptionResNetV2), leveraging transfer learning and fine-tuning to enhance liveness detection performance. The models were trained and tested on the NUAA and Replay-Attack datasets, with cross-dataset generalization validated on SiW-MV2 to assess real-world adaptability. Performance was evaluated using accuracy, precision, recall, FAR, FRR, HTER, and specialized spoof detection metrics (APCER, NPCER, ACER). Fine-tuning significantly improved detection accuracy, with DenseNet201 achieving the highest performance (98.5% on NUAA, 97.71% on Replay-Attack), while MobileNetV2 proved the most efficient model for real-time applications (latency: 15 ms, memory usage: 45 MB, energy consumption: 30 mJ). A statistical significance analysis (paired t-tests, confidence intervals) validated these improvements. Cross-dataset experiments identified DenseNet201 and MobileNetV2 as the most generalizable architectures, with DenseNet201 achieving 86.4% accuracy on Replay-Attack when trained on NUAA, demonstrating robust feature extraction and adaptability. In contrast, ResNet50 showed lower generalization capability, struggling with dataset variability and complex spoofing attacks. These findings suggest that MobileNetV2 is well-suited for low-power applications, while DenseNet201 is ideal for high-security environments requiring superior accuracy. This research provides a framework for improving real-time face liveness detection, enhancing biometric security, and guiding future advancements in AI-driven anti-spoofing techniques.
Keywords: face liveness detection; cross-dataset generalization; real-time face authentication; transfer learning; DenseNet201; VGG16; InceptionV3; deep learning
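Two of the aggregate error metrics named in the abstract have simple, widely used definitions that can be computed directly; the error-rate values in the test are invented, not taken from the paper's results.

```python
def hter(far, frr):
    """Half Total Error Rate: the mean of the false acceptance rate (FAR)
    and false rejection rate (FRR)."""
    return (far + frr) / 2

def acer(apcer, npcer):
    """Average Classification Error Rate: the mean of the attack
    presentation error (APCER) and the normal/bona-fide presentation
    error (NPCER, per the paper's naming)."""
    return (apcer + npcer) / 2
```

A system with 4% FAR and 2% FRR therefore reports a 3% HTER.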
11. Deep Learning and Federated Learning in Human Activity Recognition with Sensor Data: A Comprehensive Review
Authors: Farhad Mortezapour Shiri, Thinagaran Perumal, Norwati Mustapha, Raihani Mohamed. Computer Modeling in Engineering & Sciences, 2025, Issue 11, pp. 1389-1485 (97 pages).
Human Activity Recognition (HAR) represents a rapidly advancing research domain, propelled by continuous developments in sensor technologies and the Internet of Things (IoT). Deep learning has become the dominant paradigm in sensor-based HAR systems, offering significant advantages over traditional machine learning methods by eliminating manual feature extraction, enhancing recognition accuracy for complex activities, and enabling the exploitation of unlabeled data through generative models. This paper provides a comprehensive review of recent advancements and emerging trends in deep learning models developed for sensor-based HAR systems. We begin with an overview of fundamental HAR concepts in sensor-driven contexts, followed by a systematic categorization and summary of existing research. Our survey encompasses a wide range of deep learning approaches, including Multi-Layer Perceptrons (MLP), Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), Long Short-Term Memory networks (LSTM), Gated Recurrent Units (GRU), Transformers, Deep Belief Networks (DBN), and hybrid architectures. A comparative evaluation of these models is provided, highlighting their performance, architectural complexity, and contributions to the field. Beyond centralized deep learning models, we examine the role of Federated Learning (FL) in HAR, highlighting current applications and research directions. Finally, we discuss the growing importance of Explainable Artificial Intelligence (XAI) in sensor-based HAR, reviewing recent studies that integrate interpretability methods to enhance transparency and trustworthiness in deep learning-based HAR systems.
Keywords: human activity recognition (HAR); machine learning; deep learning; sensors; Internet of Things; federated learning (FL); explainable AI (XAI)
12. A Novel Reliable and Trust Objective Function for RPL-Based IoT Routing Protocol
Authors: Mariam A. Alotaibi, Sami S. Alwakeel, Aasem N. Alyahya. Computers, Materials & Continua, 2025, Issue 2, pp. 3467-3497 (31 pages).
The Internet of Things (IoT) integrates diverse devices into the Internet infrastructure, including sensors, meters, and wearable devices. Designing efficient IoT networks with these heterogeneous devices requires the selection of appropriate routing protocols, which is crucial for maintaining high Quality of Service (QoS). The Internet Engineering Task Force's Routing Over Low Power and Lossy Networks (IETF ROLL) working group developed the IPv6 Routing Protocol for Low Power and Lossy Networks (RPL) to meet these needs. While the initial RPL standard focused on single-metric route selection, ongoing research explores enhancing RPL by incorporating multiple routing metrics and developing new Objective Functions (OFs). This paper introduces a novel Objective Function, the Reliable and Secure Objective Function (RSOF), designed to enhance the reliability and trustworthiness of parent selection at both the node and link levels within IoT and RPL routing protocols. The RSOF employs an adaptive parent node selection mechanism that incorporates multiple metrics, including Residual Energy (RE), Expected Transmission Count (ETX), Extended RPL Node Trustworthiness (ERNT), and a novel metric that measures Node Failure Rate (NFR). In this mechanism, nodes with a high NFR are excluded from the parent selection process to improve network reliability and stability. The proposed RSOF was evaluated using random and grid topologies in the Cooja simulator, with tests conducted across small, medium, and large-scale networks to examine the impact of varying node densities. The simulation results indicate a significant improvement in network performance, particularly in terms of average latency, Packet Acknowledgment Ratio (PAR), Packet Delivery Ratio (PDR), and Control Message Overhead (CMO), compared to the standard Minimum Rank with Hysteresis Objective Function (MRHOF).
Keywords: IoT; LLNs; RPL; objective function (OF); MRHOF; OF0; routing metrics; reliability; trustworthiness
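A hedged sketch of RSOF-style parent selection: exclude candidates whose node failure rate (NFR) exceeds a threshold, then score the remainder on residual energy, ETX, and trust. The field names, weights, and threshold are illustrative assumptions; the paper's actual RSOF scoring is not reproduced here.

```python
def select_parent(candidates, nfr_threshold=0.2):
    """Return the id of the best parent candidate, or None if every
    candidate is excluded. High-NFR nodes are filtered out first, then
    the rest are ranked by a weighted combination of metrics."""
    eligible = [c for c in candidates if c["nfr"] <= nfr_threshold]
    if not eligible:
        return None

    def score(c):
        # Higher residual energy and trust are better; lower ETX is better.
        return 0.4 * c["energy"] + 0.4 * c["trust"] - 0.2 * c["etx"]

    return max(eligible, key=score)["id"]
```

A node with excellent energy and trust can still be rejected outright if it fails too often, which is the stability property the abstract emphasizes.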
13. A Review of Human Vulnerabilities in Cyber Security: Challenges and Solutions for Microfinance Institutions
Authors: Evaline Waweru, Simon Maina Karume, Alex Kibet. Journal of Information Security, 2025, Issue 1, pp. 114-130 (17 pages).
This review examines human vulnerabilities in cybersecurity within Microfinance Institutions (MFIs), analyzing their impact on organizational resilience. Focusing on social engineering, inadequate security training, and weak internal protocols, the study identifies key vulnerabilities exacerbating cyber threats to MFIs. A literature review using databases such as IEEE Xplore and Google Scholar focused on studies from 2019 to 2023 addressing human factors in cybersecurity specific to MFIs. Analysis of 57 studies reveals that phishing and insider threats are predominant, with a 20% annual increase in phishing attempts. Employee susceptibility to these attacks is heightened by insufficient training, with entry-level employees showing the highest vulnerability rates. Further, only 35% of MFIs offer regular cybersecurity training, significantly impacting incident reduction. This paper recommends enhanced training frequency, robust internal controls, and a cybersecurity-aware culture to mitigate human-induced cyber risks in MFIs.
Keywords: human vulnerabilities; cybersecurity; microfinance institutions; cyber threats; cybersecurity awareness; risk mitigation
14. Hybrid Techniques of Multi-CNN and Ensemble Learning to Analyze Handwritten Spiral and Wave Drawing for Diagnosing Parkinson's Disease
Authors: Mohammed Al-Jabbar, Mohammed Alshahrani, Ebrahim Mohammed Senan, Ibrahim Abunadi, Sultan Ahmed Almalki, Eman A. Alshari. Computer Modeling in Engineering & Sciences, 2025, Issue 5, pp. 2429-2457 (29 pages).
Parkinson's disease (PD) is a progressive neurodegenerative disorder characterized by tremors, rigidity, and decreased movement. PD poses risks to individuals' lives and independence. Early detection of PD is essential because it allows timely intervention, which can slow disease progression and improve outcomes. Manual diagnosis of PD is problematic because it is difficult to capture the subtle patterns and changes that help diagnose the disease. In addition, subjectivity and the shortage of doctors relative to the number of patients constitute obstacles to early diagnosis. Artificial intelligence (AI) techniques, especially deep and automated learning models, provide promising solutions to address deficiencies in manual diagnosis. This study develops robust systems for PD diagnosis by analyzing handwritten spiral and wave graphical images. Handwritten graphic images of the PD dataset are enhanced using two overlapping filters, the average filter and the Laplacian filter, to improve image quality and highlight essential features. The enhanced images are segmented to isolate regions of interest (ROIs) from the rest of the image using a gradient vector flow (GVF) algorithm, which ensures that features are extracted only from relevant regions. The segmented ROIs are fed into convolutional neural network (CNN) models, namely DenseNet169, MobileNet, and VGG16, to extract fine and deep feature maps that capture complex patterns and representations relevant to PD diagnosis. Fine and deep feature maps extracted from the individual CNN models are combined into fused feature vectors for the DenseNet169-MobileNet, MobileNet-VGG16, DenseNet169-VGG16, and DenseNet169-MobileNet-VGG16 models. This fusion technique aims to combine complementary and robust features from several models, which improves the extracted features. Two feature selection algorithms are considered to remove redundancy and weak correlations within the combined feature set: Ant Colony Optimization (ACO) and Maximum Entropy Score-based Selection (MESbS). These algorithms identify and retain the most strongly correlated features while eliminating redundant and weakly correlated ones, thus optimizing the features to improve system performance. The fused and enhanced feature vectors are fed into two powerful classifiers, XGBoost and random forest (RF), for accurate classification and differentiation between individuals with PD and healthy controls. The proposed hybrid systems show superior performance: the RF classifier using the combined features from the DenseNet169-MobileNet-VGG16 models with the ACO feature selection method achieved outstanding results, including an area under the curve (AUC) of 99%, sensitivity of 99.6%, 99.3% accuracy, 99.35% accuracy, and 99.65% specificity.
Keywords: CNN XGBoost RF GVF fusion feature PD
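As a concrete illustration of the feature-fusion step described in the abstract above, the sketch below concatenates per-model CNN feature vectors into one fused vector per sample. The feature dimensions (1664 for DenseNet169, 1024 for MobileNet, 512 for global-pooled VGG16) and the random data are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

# Stand-in feature maps; in the paper these come from pretrained CNN backbones.
rng = np.random.default_rng(0)
n_samples = 4
feats_densenet = rng.normal(size=(n_samples, 1664))   # DenseNet169 pooled features
feats_mobilenet = rng.normal(size=(n_samples, 1024))  # MobileNet pooled features
feats_vgg16 = rng.normal(size=(n_samples, 512))       # VGG16 pooled features

# Fusion by concatenation along the feature axis, giving one vector per sample.
fused = np.concatenate([feats_densenet, feats_mobilenet, feats_vgg16], axis=1)
print(fused.shape)  # (4, 3200)
```

The fused vectors would then go through feature selection (ACO or MESbS in the paper) before reaching the XGBoost or RF classifier.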
A dual-approach to genomic predictions:leveraging convolutional networks and voting classifiers
15
Authors: Raghad K. Mohammed, Azmi Tawfeq Hussein Alrawi, Ali Jbaeer Dawood 《Biomedical Engineering Communications》 2025, Issue 1, pp. 3-11 (9 pages)
Background: In the field of genetic diagnostics, DNA sequencing is an important tool because the depth and complexity of this field have major implications for the genetic architectures of diseases and the identification of risk factors associated with genetic disorders. Methods: Our study introduces a novel two-tiered analytical framework to raise the precision and reliability of genetic data interpretation. It begins by extracting and analyzing salient features from DNA sequences through CNN-based feature analysis, taking advantage of the ability of convolutional neural networks (CNNs) to capture complex patterns and minute mutations in genetic data. The study then combines a collection of machine learning classifiers through a voting mechanism, which synergistically joins the predictions of multiple classifiers to generate comprehensive and well-balanced interpretations of the genetic data. Results: The method was tested through an empirical analysis on a variants dataset of DNA sequences taken from patients affected by breast cancer, juxtaposed with a control group of healthy people. The integration of CNNs with a voting-based ensemble of classifiers returned outstanding outcomes, with the performance metrics accuracy, precision, recall, and F1-score all reaching 0.88, outperforming previous models. Conclusions: This dual accomplishment underlines the transformative potential of integrating deep learning techniques with ensemble machine learning for genetic diagnostics and prognostics. The results set a new benchmark in the accuracy of disease diagnosis through DNA sequencing and motivate future studies on personalized medicine and healthcare approaches built on precise genetic information.
Keywords: CNN DNA sequencing ensemble machine learning genetic disease voting classifier
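The two-tier idea above (sequence features first, then an ensemble vote) can be sketched with a stdlib-only k-mer encoder and a hard-vote combiner. The function names, the tiny example sequence, and the class labels are hypothetical; the paper's actual CNN feature extractor and classifier set are not reproduced here.

```python
from collections import Counter
from itertools import product

def kmer_features(seq: str, k: int = 3) -> list:
    """Counts of all 4**k possible k-mers, a common numeric encoding of DNA."""
    alphabet = ["".join(p) for p in product("ACGT", repeat=k)]
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    return [counts.get(km, 0) for km in alphabet]

def hard_vote(predictions: list) -> str:
    """Majority ('hard') vote across base classifiers; ties go to first-seen label."""
    return Counter(predictions).most_common(1)[0][0]

vec = kmer_features("ACGTACGTAC")          # 10-base sequence -> 8 overlapping 3-mers
print(len(vec), sum(vec))                  # 64 8
print(hard_vote(["pathogenic", "benign", "pathogenic"]))  # pathogenic
```

In a full pipeline, each base classifier would predict from the k-mer (or CNN-derived) features, and `hard_vote` would merge their outputs into the final call.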
Multi-objective Markov-enhanced adaptive whale optimization cybersecurity model for binary and multi-class malware cyberthreat classification
16
Authors: Saif Ali Abd Alradha Alsaidi, Riyadh Rahef Nuiaa Al Ogaili, +3 more: Zaid Abdi Alkareem Alyasseri, Dhiah Al-Shammary, Ayman Ibaida, Adam Slowik 《Journal of Electronic Science and Technology》 2025, Issue 4, pp. 95-112 (18 pages)
The rapid and continuing growth in the volume and number of malware cyber threats is not the real danger; the real threat lies in the obfuscation of these cyberattacks, which constantly change their behavior, making detection more difficult. Numerous researchers and developers have devoted considerable attention to this topic; however, the field has not yet been saturated with high-quality studies that address these problems. For this reason, this paper presents a novel multi-objective Markov-enhanced adaptive whale optimization (MOMEAWO) cybersecurity model to improve the classification of binary and multi-class malware threats. The proposed MOMEAWO model aims to provide an innovative solution for analyzing, detecting, and classifying the behavior of obfuscated malware within their respective families. The model covers three classification tasks: binary classification and two multi-class settings (four families and 16 malware families). To evaluate the model's performance, we used a recently published balanced dataset, the Canadian Institute for Cybersecurity Malware Memory Analysis (CIC-MalMem-2022). The results show near-perfect accuracy in binary classification and high accuracy in multi-class classification compared with related work using the same dataset.
Keywords: Malware cybersecurity attacks Malware detection and classification Markov chain MULTI-OBJECTIVE MOMEAWO cybersecurity model
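For readers unfamiliar with the optimizer underlying the abstract above, the sketch below shows one "encircling prey" position update from the standard whale optimization algorithm (WOA). This is the textbook WOA step, not the paper's Markov-enhanced multi-objective variant; the array shapes and fixed seed are illustrative.

```python
import numpy as np

def woa_encircle_step(positions: np.ndarray, best: np.ndarray, a: float) -> np.ndarray:
    """One WOA 'encircling prey' update.

    positions: (n_whales, dim) candidate solutions
    best:      current best solution, shape (dim,)
    a:         control scalar, decreased linearly from 2 to 0 over iterations
    """
    rng = np.random.default_rng(1)
    r = rng.random(positions.shape)
    A = 2 * a * r - a                    # coefficient vector A in [-a, a)
    C = 2 * rng.random(positions.shape)  # coefficient vector C in [0, 2)
    D = np.abs(C * best - positions)     # distance to the best whale
    return best - A * D                  # move whales around the best solution

pos = np.zeros((5, 2))                   # five whales in a 2-D search space
new_pos = woa_encircle_step(pos, best=np.ones(2), a=1.0)
print(new_pos.shape)  # (5, 2)
```

A full WOA run alternates this update with the spiral and random-search phases and shrinks `a` each iteration; MOMEAWO layers Markov-chain state modeling and multiple objectives on top of that loop.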
A Survey on Token Transmission Attacks,Effects,and Mitigation Strategies in IoT Devices
17
Authors: Michael Juma Ayuma, Shem Mbandu Angolo, Philemon Nthenge Kasyoka 《Journal on Artificial Intelligence》 2025, Issue 1, pp. 205-254 (50 pages)
The exponential growth of Internet of Things (IoT) devices has introduced significant security challenges, particularly in securing the token-based communication protocols used for authentication and authorization. This survey systematically reviews the vulnerabilities of token transmission in IoT environments, focusing on sophisticated attack vectors such as replay attacks, token hijacking, man-in-the-middle (MITM) attacks, token injection, and eavesdropping, among others. These attacks exploit the inherent weaknesses of token-based mechanisms like OAuth, JSON Web Tokens (JWT), and bearer tokens, which are widely used in IoT ecosystems for managing device interactions and access control. The impact of such attacks is profound, leading to unauthorized access, data exfiltration, and control over IoT devices, posing significant threats to privacy, safety, and the operational integrity of critical IoT applications in sectors such as healthcare, smart cities, and industrial automation. This paper categorizes these attack vectors, explores real-world case studies, and analyzes their effects on resource-constrained IoT devices, whose limited processing power and memory render them more susceptible to such exploits. Furthermore, the survey presents a comprehensive evaluation of existing mitigation techniques, including cryptographic protocols, lightweight secure transmission frameworks, secure token management practices, and network-layer defenses such as Transport Layer Security (TLS) and multi-factor authentication (MFA). The study also highlights the trade-offs between security and performance in IoT systems and identifies key gaps in current research, emphasizing the need for more scalable, energy-efficient, and robust security frameworks to address the evolving landscape of token transmission attacks in IoT devices.
Keywords: Token transmission IoT attacks IoT authentication CRYPTOGRAPHY ENCRYPTION
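To make the token-transmission threats above concrete, here is a minimal stdlib sketch of an HMAC-signed, expiring token: the signature check rejects tampering and token injection, and the expiry check limits the replay window. The secret, payload fields, and token format are illustrative assumptions, not any specific OAuth or JWT library.

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"demo-secret"  # illustrative only; real deployments use managed keys

def sign(payload: dict) -> str:
    """Serialize the payload and append an HMAC-SHA256 tag over it."""
    body = base64.urlsafe_b64encode(json.dumps(payload).encode())
    tag = hmac.new(SECRET, body, hashlib.sha256).hexdigest().encode()
    return (body + b"." + tag).decode()

def verify(token: str, now: float) -> bool:
    """Accept only untampered tokens that have not yet expired."""
    body, tag = token.encode().rsplit(b".", 1)
    expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(tag, expected):   # constant-time: rejects forgery
        return False
    payload = json.loads(base64.urlsafe_b64decode(body))
    return now < payload["exp"]                  # short expiry narrows replay window

t = sign({"device": "sensor-7", "exp": time.time() + 60})
print(verify(t, time.time()))        # True
print(verify(t, time.time() + 120))  # False (expired)
```

Short-lived tokens like this mitigate replay but not MITM interception on their own, which is why the survey pairs token hygiene with transport protections such as TLS.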
An Impact-Aware and Taxonomy-Driven Explainable Machine Learning Framework with Edge Computing for Security in Industrial IoT–Cyber Physical Systems
18
Authors: Tamara Zhukabayeva, Zulfiqar Ahmad, +4 more: Nurbolat Tasbolatuly, Makpal Zhartybayeva, Yerik Mardenov, Nurdaulet Karabayev, Dilaram Baumuratova 《Computer Modeling in Engineering & Sciences》 2025, Issue 11, pp. 2573-2599 (27 pages)
The Industrial Internet of Things (IIoT), combined with Cyber-Physical Systems (CPS), is transforming industrial automation but also poses great cybersecurity threats because of the complexity and connectivity of these systems. Prior works lack explainability, struggle with imbalanced attack classes, and give limited consideration to practical edge-cloud deployment strategies. In this study, we propose an impact-aware, taxonomy-driven machine learning framework with edge deployment and SHapley Additive exPlanations (SHAP)-based Explainable AI (XAI) for attack detection and classification in IIoT-CPS settings. It includes unsupervised clustering (K-Means and DBSCAN) to extract latent traffic patterns as well as taxonomy-based supervised classification that groups 33 kinds of attacks into seven high-level categories: Flood Attacks, Botnet/Mirai, Reconnaissance, Spoofing/Man-in-the-Middle (MITM), Injection Attacks, Backdoors/Exploits, and Benign. Three machine learning algorithms, Random Forest, XGBoost, and Multi-Layer Perceptron (MLP), were trained on a real-world dataset of more than 1 million network traffic records, achieving overall accuracies of 99.4% (RF), 99.5% (XGBoost), and 99.1% (MLP). Rare attack types, such as injection attacks and backdoors, were examined even under extreme class imbalance. SHAP-based XAI was applied to every model to provide transparency and trust and to identify the features that drive classification decisions, such as inter-arrival time, TCP flags, and protocol type. A workable edge-computing implementation strategy is proposed, in which lightweight computation runs on edge devices and heavy, computation-intensive analytics runs in the cloud. The framework is highly accurate, interpretable, and applicable in real time, hence a robust and scalable solution for securing IIoT-CPS infrastructure against dynamic cyber-attacks.
Keywords: Industrial IoT CPS edge computing machine learning XAI attack taxonomy
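The taxonomy-driven step above (33 fine-grained attack labels folded into seven high-level categories) amounts to a label-mapping table applied around the classifier. The sketch below shows the idea with a handful of placeholder labels; the study's exact 33 attack names are not reproduced here, and the default for unseen labels is an assumption.

```python
# Illustrative fragment of a taxonomy map: fine-grained labels -> high-level categories.
TAXONOMY = {
    "syn_flood": "Flood Attacks",
    "udp_flood": "Flood Attacks",
    "mirai_scan": "Botnet/Mirai",
    "port_scan": "Reconnaissance",
    "arp_spoof": "Spoofing/MITM",
    "sql_injection": "Injection Attacks",
    "backdoor": "Backdoors/Exploits",
    "normal": "Benign",
}

def to_high_level(label: str) -> str:
    """Collapse a fine-grained label into its taxonomy category.
    Unmapped labels default to 'Benign' here purely for illustration."""
    return TAXONOMY.get(label, "Benign")

print(to_high_level("udp_flood"))  # Flood Attacks
```

Training on the collapsed categories pools examples of related attacks, which is one common way to soften the class imbalance the abstract highlights for rare attack types.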
Navigating the Complexities of Controller Placement in SD-WANs:A Multi-Objective Perspective on Current Trends and Future Challenges
19
Authors: Abdulrahman M. Abdulghani, Azizol Abdullah, +3 more: A. R. Rahiman, Nor Asilah Wati Abdul Hamid, Bilal Omar Akram, Hafsa Raissouli 《Computer Systems Science & Engineering》 2025, Issue 1, pp. 123-157 (35 pages)
This review article provides a comprehensive analysis of the latest advancements and persistent challenges in Software-Defined Wide Area Networks (SD-WANs), with a particular emphasis on the multi-objective Controller Placement Problem (CPP). As SD-WAN technology continues to gain prominence for its capacity to offer flexible and efficient network management, the task of optimally placing the controllers responsible for orchestrating and managing network traffic remains a critical yet complex challenge. This review delves into recent innovations in multi-objective controller placement strategies, including clustering techniques, heuristic-based approaches, and the integration of machine learning and deep learning models. Each methodology is critically evaluated in terms of its ability to minimize network latency, enhance fault tolerance, and improve overall network performance. Furthermore, the paper discusses the inherent limitations and challenges of these techniques, providing a critical evaluation of their current utility and outlining potential avenues for future research. By offering a thorough overview of state-of-the-art approaches to multi-objective controller placement in SD-WANs, this review aims to inform ongoing advancements and highlight emerging research opportunities in this evolving field.
Keywords: SDN SD-WAN multi-objectives controller placement problem (CPP) clustering algorithm heuristic algorithm fault tolerance
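At its core, the controller placement problem surveyed above is a facility-location optimization: choose k controller sites so that a latency objective is minimized. The brute-force sketch below makes that objective concrete on a toy four-node topology with Euclidean distance as a latency proxy; real CPP solvers rely on the clustering and heuristic methods the review covers, since exhaustive search does not scale.

```python
import itertools
import math

def avg_latency(nodes, controllers):
    """Mean distance from each node to its nearest controller (latency proxy)."""
    return sum(min(math.dist(n, c) for c in controllers) for n in nodes) / len(nodes)

def best_placement(nodes, k):
    """Brute-force single-objective placement over candidate node sites."""
    return min(itertools.combinations(nodes, k),
               key=lambda cs: avg_latency(nodes, cs))

# Two well-separated pairs of switches; the optimum puts one controller per pair.
nodes = [(0, 0), (0, 1), (10, 0), (10, 1)]
placement = best_placement(nodes, k=2)
print(sorted(placement))  # [(0, 0), (10, 0)]
```

A multi-objective variant would score each candidate set on several axes at once (latency, fault tolerance, load balance) and keep a Pareto front rather than a single minimum, which is exactly where the surveyed heuristics and learning-based methods come in.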
Security and Privacy in Permissioned Blockchain Interoperability:A Systematic Review
20
Authors: Alsoudi Dua, TanFong Ang, +5 more: Chin Soon Ku, Okmi Mohammed, Yu Luo, Jiahui Chen, Uzair Aslam Bhatti, Lip Yee Por 《Computers, Materials & Continua》 2025, Issue 11, pp. 2579-2624 (46 pages)
Blockchain interoperability enables seamless communication and asset transfer across isolated permissioned blockchain systems, but it introduces significant security and privacy vulnerabilities. This review systematically assesses the security and privacy landscape of interoperability protocols for permissioned blockchains, identifying key properties, attack vectors, and countermeasures. Following PRISMA 2020 guidelines, we analysed 56 peer-reviewed studies published between 2020 and 2025, retrieved from Scopus, ScienceDirect, Web of Science, and IEEE Xplore. The review focused on interoperability protocols for permissioned blockchains with security and privacy analyses, including only English-language journal articles and conference proceedings. Risk of bias in the included studies was assessed using the Mixed Methods Appraisal Tool (MMAT). Results were presented and synthesized through descriptive analysis, bibliometric analysis, and content analysis, with findings organized into tables, charts, and comparative summaries. The review classifies interoperability protocols into relay, sidechain, notary scheme, hash time-locked contract (HTLC), and hybrid types and identifies 18 security and privacy properties along with 31 known attack types. Relay-based protocols showed the broadest security coverage, while HTLC and notary schemes demonstrated significant security gaps. Notably, 93% of studies examined fewer than four properties or attack types, indicating a fragmented research landscape. The review identifies underexplored areas such as ACID properties, decentralization, and cross-chain attack resilience. It further highlights effective countermeasures, including cryptographic techniques, trusted execution environments, zero-knowledge proofs, and decentralized identity schemes. The findings suggest that despite growing adoption, current interoperability protocols lack comprehensive security evaluations. More holistic research is needed to ensure the resilience, trustworthiness, and scalability of cross-chain operations in permissioned blockchain ecosystems.
Keywords: Blockchain security PRIVACY ATTACK THREAT INTEROPERABILITY cross-chain
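Among the protocol types classified above, the HTLC (hash time-locked contract) is the simplest to sketch: a claim succeeds only when the correct hash preimage is presented before a deadline; after the deadline the sender can reclaim the funds. The dictionary-based "contract" below is a toy model of that logic, not any chain's actual contract code.

```python
import hashlib
import time

def make_htlc(preimage: bytes, timeout_s: float) -> dict:
    """Create a toy HTLC: a hashlock plus an absolute refund deadline."""
    return {"hashlock": hashlib.sha256(preimage).hexdigest(),
            "deadline": time.time() + timeout_s}

def claim(htlc: dict, candidate: bytes, now: float) -> str:
    """Resolve the contract at time `now` with a candidate preimage."""
    if now >= htlc["deadline"]:
        return "refund"        # timelock expired: sender reclaims
    if hashlib.sha256(candidate).hexdigest() == htlc["hashlock"]:
        return "claimed"       # receiver revealed the preimage in time
    return "rejected"          # wrong preimage: funds stay locked

h = make_htlc(b"secret", timeout_s=3600)
print(claim(h, b"secret", time.time()))         # claimed
print(claim(h, b"wrong", time.time()))          # rejected
print(claim(h, b"secret", time.time() + 7200))  # refund
```

The cross-chain attack surface the review flags comes largely from the timing window: if the deadlines on the two chains are misaligned, one side can be claimed while the other is refunded.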