23 articles found
1. Enhancing Bandwidth Utilization of IP Telephony Over IPv6 Networks
Authors: Hani Al-Mimi, Yousef Alrabanah, Mosleh M. Abualhaj, Sumaya N. Al-khatib. Computer Systems Science & Engineering (SCIE, EI), 2023, Issue 2, pp. 1039-1049 (11 pages).
The demand for telecommunication services, such as IP telephony, has increased dramatically during the COVID-19 pandemic lockdown. IP telephony should be enhanced to provide the expected quality. One of the issues that should be investigated in IP telephony is bandwidth utilization. IP telephony produces very small speech samples attached to a large packet header. The header of the IP telephony packet consumes a considerable share of the bandwidth allotted to the call, which wastes the network's bandwidth and degrades IP telephony quality. This paper proposes a mechanism (called Smallerize) that reduces the bandwidth consumed by both the speech sample and the header. This is achieved by assembling numerous IP telephony packets under one header and using the header's fields to carry the speech sample. Several metrics have been used to measure the achievement of the Smallerize mechanism. The number of calls has been increased by 245.1% compared to the typical mechanism. The bandwidth saving has also reached 68% with the G.728 codec. Therefore, Smallerize is a possible mechanism to enhance bandwidth utilization of IP telephony.
Keywords: IP telephony; codec; bandwidth utilization; IPv6
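The header-overhead arithmetic behind Smallerize is easy to reproduce. The sketch below is a back-of-the-envelope model, not the authors' code: the header sizes are the standard IPv6/UDP/RTP values, and the codec figures (a 10-byte frame every 10 ms, as in G.729) are textbook numbers chosen only to illustrate how sharing one header among several frames saves bandwidth.

```python
# Toy model of the header-overhead problem that packet multiplexing targets.
IPV6, UDP, RTP = 40, 8, 12          # standard header sizes in bytes
HEADER = IPV6 + UDP + RTP           # 60 bytes of headers per packet

FRAME_BYTES = 10                    # one G.729 speech frame (illustrative codec)
FRAME_INTERVAL_S = 0.010            # produced every 10 ms

def call_bandwidth_bps(frames_per_packet: int) -> float:
    """Bandwidth of one call when `frames_per_packet` frames share one header."""
    packet_bytes = HEADER + frames_per_packet * FRAME_BYTES
    packets_per_second = 1 / (FRAME_INTERVAL_S * frames_per_packet)
    return packet_bytes * packets_per_second * 8

plain = call_bandwidth_bps(1)       # one frame per packet: header dominates
muxed = call_bandwidth_bps(6)       # six frames share one header
print(f"1 frame/packet : {plain/1000:.1f} kbit/s ({HEADER/(HEADER+FRAME_BYTES):.0%} header)")
print(f"6 frames/packet: {muxed/1000:.1f} kbit/s, saving {(1 - muxed/plain):.0%}")
```

Six frames per header cut the toy call from 56 kbit/s to 16 kbit/s, a saving in the same ballpark as the 68% the paper reports.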
2. Transfer Learning-Based Approach with an Ensemble Classifier for Detecting Keylogging Attack on the Internet of Things
Authors: Yahya Alhaj Maz, Mohammed Anbar, Selvakumar Manickam, Mosleh M. Abualhaj, Sultan Ahmed Almalki, Basim Ahmad Alabsi. Computers, Materials & Continua, 2025, Issue 12, pp. 5287-5307 (21 pages).
The Internet of Things (IoT) is an innovation that combines imagined space with the actual world on a single platform. Because of the recent rapid rise of IoT devices, there has been a lack of standards, leading to a massive increase in unprotected devices connecting to networks. Consequently, cyberattacks on IoT are becoming more common, particularly keylogging attacks, which are often caused by security vulnerabilities on IoT networks. This research focuses on the role of transfer learning and ensemble classifiers in enhancing the detection of keylogging attacks within small, imbalanced IoT datasets. The authors propose a model that combines transfer learning with ensemble classification methods, leading to improved detection accuracy. By leveraging the BoT-IoT and keylogger_detection datasets, they facilitate the transfer of knowledge across domains. The results reveal that the integration of transfer learning and ensemble classifiers significantly improves detection capabilities, even in scenarios with limited data availability. The proposed TRANS-ENS model shows exceptional accuracy and a minimal false positive rate, outperforming current deep learning approaches. The primary objectives include: (i) introducing an ensemble feature selection technique to identify common features across models, (ii) creating a pre-trained deep learning model through transfer learning for the detection of keylogging attacks, and (iii) developing a transfer learning-ensemble model dedicated to keylogging detection. Experimental findings indicate that the TRANS-ENS model achieves a detection accuracy of 96.06% and a false alarm rate of 0.12%, surpassing existing models such as CNN, RNN, and LSTM.
Keywords: convolutional neural network; deep learning; keylogging attack; recurrent neural network; transfer learning
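As a rough illustration of the transfer-plus-ensemble recipe the abstract describes (not the authors' TRANS-ENS implementation), the sketch below pretrains a small network on a stand-in source dataset, reuses its frozen hidden layer as a feature extractor for a small target task, and soft-votes two classifiers on the transferred features. All data, shapes, and hyperparameters are placeholders.

```python
import numpy as np
import tensorflow as tf
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_src, y_src = rng.normal(size=(5000, 20)), rng.integers(0, 2, 5000)  # stands in for BoT-IoT
X_tgt, y_tgt = rng.normal(size=(400, 20)), rng.integers(0, 2, 400)    # small keylogging set

# 1) Pretrain on the (placeholder) source domain.
base = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(20,)),
    tf.keras.layers.Dense(64, activation="relu", name="feat"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
base.compile(optimizer="adam", loss="binary_crossentropy")
base.fit(X_src, y_src, epochs=3, verbose=0)

# 2) Transfer: reuse the frozen hidden layer as a feature extractor.
extractor = tf.keras.Model(base.input, base.get_layer("feat").output)
feats = extractor.predict(X_tgt, verbose=0)

# 3) Ensemble classifiers on the transferred features (soft voting).
ens = VotingClassifier(
    [("rf", RandomForestClassifier(n_estimators=100)),
     ("lr", LogisticRegression(max_iter=1000))],
    voting="soft",
)
ens.fit(feats, y_tgt)
print("train accuracy:", ens.score(feats, y_tgt))
```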
3. Deep Learning Approach for Automated Estimation of 3D Vertebral Orientation of the Lumbar Spine
Authors: Nanfang Xu, Shanshan Liu, Yuepeng Chen, Kailai Zhang, Chenyi Guo, Cheng Zhang, Fei Xu, Qifeng Lan, Wanyi Fu, Xingyu Zhou, Bo Zhao, Aodong He, Xiangling Fu, Ji Wu, Weishi Li. CAAI Transactions on Intelligence Technology, 2025, Issue 5, pp. 1306-1319 (14 pages).
Lumbar degenerative disc diseases are a major contributor to lower back pain. In pursuit of an enhanced understanding of lumbar degenerative pathology and the development of more effective treatment modalities, precise measurement techniques for lumbar segment kinematics are imperative. This study aims to pioneer a novel automated lumbar spine orientation estimation method using deep learning techniques, to facilitate the automatic 2D–3D pre-registration of the lumbar spine during physiological movements and to enhance the efficiency of image registration and the accuracy of spinal segment kinematic measurements. A total of 12 asymptomatic volunteers were enrolled and captured in 2 oblique views with 7 different postures; the images were used for model development, training and evaluation. The model is composed of a segmentation module using Mask R-CNN and an estimation module using a ResNet50 architecture with a Squeeze-and-Excitation module. The cosine of the angle between the prediction vector and the ground-truth vector was used to quantify model performance. Data from another two prospectively recruited asymptomatic volunteers were used to compare the time cost of model-assisted registration against manual registration without the model. The cosine values of the vector deviation angles along the three axes of the Cartesian coordinate system were 0.9667 ± 0.004, 0.9593 ± 0.0047 and 0.9828 ± 0.0025, respectively. The angular deviation between the intermediate vector obtained from the three direction vectors and the ground truth was 10.7103 ± 0.7466. The results show the consistency and reliability of the model's predictions across different experiments and axes, and demonstrate that the approach significantly reduces registration time (3.47 ± 0.90 min vs. 8.10 ± 1.60 min, p < 0.001), enhancing efficiency and supporting broader utilisation in clinical research on kinematic measurements.
Keywords: artificial intelligence; deep learning
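The evaluation metric is simple enough to state in a few lines: the cosine of the angle between the predicted direction vector and the annotated one, with 1.0 meaning perfect agreement. The vectors below are made up for illustration.

```python
import numpy as np

def cosine_of_angle(pred: np.ndarray, truth: np.ndarray) -> float:
    """cos(theta) between two 3D direction vectors; 1.0 means perfect agreement."""
    return float(np.dot(pred, truth) / (np.linalg.norm(pred) * np.linalg.norm(truth)))

pred  = np.array([0.05, 0.02, 0.99])   # hypothetical model output
truth = np.array([0.00, 0.00, 1.00])   # hypothetical annotated axis
cos = cosine_of_angle(pred, truth)
print(f"cos = {cos:.4f}, angular deviation = {np.degrees(np.arccos(cos)):.2f} deg")
```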
4. Extensive Study of Cloud Computing Technologies, Threats and Solutions Prospective (Cited: 1)
Authors: Mwaffaq Abu-Alhaija, Nidal M. Turab, AbdelRahman Hamza. Computer Systems Science & Engineering (SCIE, EI), 2022, Issue 4, pp. 225-240 (16 pages).
Infrastructure as a Service (IaaS) provides logical separation of data, network, applications and machines from the physical constraints of real machines. IaaS is one of the bases of cloud virtualization. Recently, security issues have been gradually emerging with the virtualization of cloud computing. This paper explores different security aspects of cloud virtualization, recognizing potential threats or attacks that exploit these vulnerabilities and the security measures used to alleviate such threats. In addition, a discussion of general security requirements and existing security schemes is provided. As shown in this paper, different components of the virtualization environment are targets of various attacks that in turn lead to security issues compromising the whole cloud infrastructure. An overview of various cloud security aspects is also provided. Different attack scenarios of virtualization environments and security solutions to counter these attacks are discussed, followed by API security concerns, data security, hijacking of user accounts and other security concerns. These discussions can be used in the future to propose assessment criteria for analyzing the efficiency of security solutions for the virtualization environment in the face of various virtual-environment attacks.
Keywords: cloud computing environment; paravirtualization; full virtualization; cloud virtualization security; hypervisor; virtual machines
5. Suitability of Suluu-Terek Basalt Deposits for Stone Casting (Cited: 1)
Authors: Zhanbolot Aidaraliev, Imiyla Rysbaeva, Rakhat Atyrova, Akimbek Abdykalykov, Baktygul Bekbolot Kyzy, Chynara Zholdoshova, Nematilla Sopubekov, Altynbek Kuduev, Iurii Dubinin, Zhypargul Abdullaeva. Journal of Minerals and Materials Characterization and Engineering, 2022, Issue 1, pp. 1-14 (14 pages).
This article presents the history of stone casting and analyzes assessments of basalt raw materials from other countries for stone-casting technology, considering various basalt compositions. Analytical methods for calculating the composition of the charge require long calculation times and plotted diagrams, and each method has its own advantages and disadvantages. As the research contribution, we propose an experimental method for calculating raw materials after charging. Analysis of the composition and structure of basalts and charging materials was used in the stone-casting technology. Using the comparison method, the required amount of charging material was calculated for the Suluu-Terek 1, Suluu-Terek 2 and Berestovetsk deposits. The calculated data were confirmed by experimental melts in the stone-casting process.
Keywords: petrology; basalt-stone casting; suitability of raw materials; aluminosilicate crystalline alloy; charging raw material; comparison method
6. Classification of Cybersecurity Threats, Vulnerabilities and Countermeasures in Database Systems
Authors: Mohammed Amin Almaiah, Leen Mohammad Saqr, Leen Ahmad Al-Rawwash, Layan Ahmed Altellawi, Romel Al-Ali, Omar Almomani. Computers, Materials & Continua (SCIE, EI), 2024, Issue 11, pp. 3189-3220 (32 pages).
Database systems have consistently been prime targets for cyber-attacks and threats due to the critical nature of the data they store. Despite the increasing reliance on database management systems, this field continues to face numerous cyber-attacks. Database management systems serve as the foundation of any information system or application, and any cyber-attack can result in significant damage to the database system and loss of sensitive data. Consequently, cyber risk classifications and assessments play a crucial role in risk management and establish an essential framework for identifying and responding to cyber threats. Risk assessment aids in understanding the impact of cyber threats and developing appropriate security controls to mitigate risks. The primary objective of this study is to conduct a comprehensive analysis of cyber risks in database management systems, including classifying threats, vulnerabilities, impacts, and countermeasures. This classification helps to identify suitable security controls to mitigate cyber risks for each type of threat. Additionally, this research explores technical countermeasures to protect database systems from cyber threats. The study employs the content analysis method to collect, analyze, and classify data in terms of types of threats, vulnerabilities, and countermeasures. The results indicate that SQL injection attacks and Denial of Service (DoS) attacks were the most prevalent technical threats in database systems, each accounting for 9% of incidents. Vulnerable audit trails, intrusion attempts, and ransomware attacks were classified as the second level of technical threats, comprising 7% and 5% of incidents, respectively. Furthermore, the findings reveal that insider threats were the most common non-technical threats, accounting for 5% of incidents. Moreover, weak authentication, unpatched databases, weak audit trails, and multiple usage of an account were the most common technical vulnerabilities, each accounting for 9% of vulnerabilities. Additionally, software bugs, insecure coding practices, weak security controls, insecure networks, password misuse, weak encryption practices, and weak data masking were classified as the second level of security vulnerabilities, each accounting for 4% of vulnerabilities. The findings from this work can assist organizations in understanding the types of cyber threats and developing robust strategies against cyber-attacks.
Keywords: cyber threats; database systems; cyber risk assessment; vulnerabilities; countermeasures
7. Leveraging Deep Learning for Precision-Aware Road Accident Detection
Authors: Kunal Thakur, Ashu Taneja, Ali Alqahtani, Nayef Alqahtani. Computers, Materials & Continua, 2025, Issue 12, pp. 4827-4848 (22 pages).
Accident detection plays a critical role in improving traffic safety by enabling timely emergency response and reducing the impact of road incidents. The main challenge lies in achieving real-time, reliable and highly accurate detection across diverse Internet-of-Vehicles (IoV) environments. To overcome this challenge, this paper leverages deep learning to automatically learn patterns from visual data and detect accidents with high accuracy. A visual classification model based on the ResNet-50 architecture is presented for distinguishing between accident and non-accident images. The model is trained and tested on a labeled dataset and achieves an overall accuracy of 91.84%, with a precision of 94%, a recall of 90.38%, and an F1-score of 92.14%. Training behavior is observed over 100 epochs: the model shows rapid accuracy gains and loss reduction within the first 30 epochs, followed by gradual stabilization. Accuracy plateaus between 90% and 93%, and loss values remain between 0.1 and 0.2 in later stages. To understand the effect of the training strategy, the model is optimized using three different algorithms, namely SGD, Adam, and Adadelta, all showing effective performance, though with varied convergence patterns. Further, to test its effectiveness, the proposed model is compared with existing models. Finally, the problems encountered in implementing the model in practical automotive settings are discussed along with offered solutions. The results support the reliability of the approach and its suitability for real-time traffic safety applications.
Keywords: accident detection; deep learning; ResNet-50; traffic safety; image classification; optimizer; real-time monitoring
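A minimal sketch of the kind of ResNet-50 fine-tuning the paper describes, assuming a Keras-style workflow; the directory path, image size, and hyperparameters are placeholders rather than the authors' settings.

```python
import tensorflow as tf

# ImageNet-pretrained backbone with the classification head removed.
base = tf.keras.applications.ResNet50(weights="imagenet", include_top=False,
                                      input_shape=(224, 224, 3), pooling="avg")
base.trainable = False                      # start by training only the new head
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(1, activation="sigmoid"),  # accident vs. non-accident
])
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=1e-3, momentum=0.9),
              loss="binary_crossentropy",
              metrics=["accuracy", tf.keras.metrics.Precision(), tf.keras.metrics.Recall()])

# "accident_dataset/train" is a placeholder path with one folder per class.
train = tf.keras.utils.image_dataset_from_directory(
    "accident_dataset/train", image_size=(224, 224), batch_size=32,
    label_mode="binary")
model.fit(train, epochs=10)
```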
8. Real-Time 7-Core SDM Transmission System Using Commercial 400 Gbit/s OTN Transceivers and Network Management System
Authors: CUI Jian, GU Ninglun, CHANG Cheng, SHI Hu, YAN Baoluo. ZTE Communications, 2025, Issue 3, pp. 81-88 (8 pages).
Space-division multiplexing (SDM) utilizing uncoupled multi-core fibers (MCF) is considered a promising candidate for next-generation high-speed optical transmission systems due to its huge capacity and low inter-core crosstalk. In this paper, we demonstrate a real-time high-speed SDM transmission system over a field-deployed 7-core MCF cable using commercial 400 Gbit/s backbone optical transport network (OTN) transceivers and a network management system. The transceivers employ a highly noise-tolerant quadrature phase shift keying (QPSK) modulation format at a 130 Gbaud rate, enabled by optoelectronic multi-chip module (OE-MCM) packaging. The network management system can effectively manage and monitor the performance of the 7-core SDM OTN system and promptly report failure events through alarms. Our field trial demonstrates the compatibility of uncoupled MCF with high-speed OTN transmission equipment and network management systems, supporting its future deployment in next-generation high-speed terrestrial cable transmission networks.
Keywords: multi-core fiber; real-time transmission; optical transport network; field trial; network management system
9. A Real-Time Deep Learning Approach for Electrocardiogram-Based Cardiovascular Disease Prediction with Adaptive Drift Detection and Generative Feature Replay
Authors: Soumia Zertal, Asma Saighi, Sofia Kouah, Souham Meshoul, Zakaria Laboudi. Computer Modeling in Engineering & Sciences, 2025, Issue 9, pp. 3737-3782 (46 pages).
Cardiovascular diseases (CVDs) remain a leading cause of mortality worldwide, emphasizing the importance of early and accurate prediction. Electrocardiogram (ECG) signals, central to cardiac monitoring, have increasingly been integrated with Deep Learning (DL) for real-time prediction of CVDs. However, DL models are prone to performance degradation due to concept drift and catastrophic forgetting. To address this issue, we propose a real-time CVD prediction approach, referred to as ADWIN-GFR, that combines Convolutional Neural Network (CNN) layers for spatial feature extraction with Gated Recurrent Units (GRU) for temporal modeling, alongside adaptive drift detection and mitigation mechanisms. The proposed approach integrates Adaptive Windowing (ADWIN) for real-time concept drift detection, a fine-tuning strategy based on Generative Feature Replay (GFR) to preserve previously acquired knowledge, and a dynamic replay buffer ensuring variance, diversity, and data distribution coverage. Extensive experiments conducted on the MIT-BIH arrhythmia dataset demonstrate that ADWIN-GFR outperforms standard fine-tuning techniques, achieving an average post-drift accuracy of 95.4%, a macro F1-score of 93.9%, and a remarkably low forgetting score of 0.9%. It also exhibits an average drift detection delay of 12 steps and achieves an adaptation gain of 17.2%. These findings underscore the potential of ADWIN-GFR for deployment in real-world cardiac monitoring systems, including wearable ECG devices and hospital-based patient monitoring platforms.
Keywords: real-time cardiovascular disease prediction; concept drift detection; catastrophic forgetting; fine-tuning; electrocardiogram; convolutional neural networks; gated recurrent units; adaptive windowing; generative feature replay
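ADWIN itself is available off the shelf. The sketch below, which is not the authors' code, feeds a synthetic accuracy stream with a deliberate drop into the river library's ADWIN detector (API of recent river releases) to show where a drift-triggered adaptation step such as GFR fine-tuning would hook in.

```python
import numpy as np
from river import drift

rng = np.random.default_rng(1)
stream = np.concatenate([rng.normal(0.95, 0.01, 500),   # stable accuracy ...
                         rng.normal(0.80, 0.02, 500)])  # ... then concept drift

adwin = drift.ADWIN()
for step, acc in enumerate(stream):
    adwin.update(acc)
    if adwin.drift_detected:
        print(f"drift detected at step {step}")
        # In ADWIN-GFR, this is where fine-tuning with generative
        # feature replay would be triggered to adapt without forgetting.
```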
10. Narwhal Optimizer: A Nature-Inspired Optimization Algorithm for Solving Complex Optimization Problems
Authors: Raja Masadeh, Omar Almomani, Abdullah Zaqebah, Shayma Masadeh, Kholoud Alshqurat, Ahmad Sharieh, Nesreen Alsharman. Computers, Materials & Continua, 2025, Issue 11, pp. 3709-3737 (29 pages).
This research presents a novel nature-inspired metaheuristic optimization algorithm, called the Narwhal Optimization Algorithm (NWOA). The algorithm draws inspiration from the foraging and prey-hunting strategies of narwhals, the “unicorns of the sea”, particularly the use of their distinctive spiral tusks, which play significant roles in hunting, searching for prey, navigation, echolocation, and complex social interaction. In particular, the NWOA imitates the foraging strategies and techniques of narwhals when hunting for prey, focusing mainly on the cooperative and exploratory behavior shown during group hunting and on the use of their tusks in sensing and locating prey under the Arctic ice. These functions provide a strong basis for assessing the algorithm's ability to balance exploration and exploitation, its convergence speed, and its solution accuracy. The performance of the NWOA is evaluated on 30 benchmark test functions. A comparison study with the Grey Wolf Optimizer (GWO), Whale Optimization Algorithm (WOA), Perfumer Optimization Algorithm (POA), Candle Flame Optimization (CFO) Algorithm, Particle Swarm Optimization (PSO) Algorithm, and Genetic Algorithm (GA) validates the results. As evidenced by the experimental results, NWOA yields competitive outcomes among these well-known optimizers, outperforming them in several instances. These results suggest that NWOA is an effective and robust optimization tool suitable for solving many complex real-world optimization problems.
Keywords: optimization; metaheuristic optimization algorithm; narwhal optimization algorithm; benchmarks
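The abstract does not give NWOA's update equations, so the following is only a generic skeleton of the exploration/exploitation loop that swarm-style metaheuristics of this family (GWO, WOA, and presumably NWOA) share: each candidate moves partly toward the best solution found so far and partly in a random direction whose magnitude decays over iterations, shown on the classic sphere benchmark.

```python
import numpy as np

def sphere(x):                      # classic benchmark: global minimum 0 at origin
    return float(np.sum(x ** 2))

def metaheuristic(obj, dim=10, pop=30, iters=200, lo=-5.0, hi=5.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, (pop, dim))     # random initial population
    best = min(X, key=obj).copy()
    for t in range(iters):
        a = 2.0 * (1 - t / iters)                 # exploration weight decays to 0
        step = a * rng.uniform(-1, 1, X.shape)    # random search component
        X = np.clip(X + 0.5 * (best - X) + step, lo, hi)  # pull toward the leader
        cand = min(X, key=obj)
        if obj(cand) < obj(best):
            best = cand.copy()
    return best, obj(best)

best, val = metaheuristic(sphere)
print(f"best value after 200 iterations: {val:.2e}")
```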
11. Efficient Malicious QR Code Detection System Using an Advanced Deep Learning Approach
Authors: Abdulaziz A. Alsulami, Qasem Abu Al-Haija, Badraddin Alturki, Ayman Yafoz, Ali Alqahtani, Raed Alsini, Sami Saeed Binyamin. Computer Modeling in Engineering & Sciences, 2025, Issue 10, pp. 1117-1140 (24 pages).
QR codes are widely used in applications such as information sharing, advertising, and digital payments. However, their growing adoption has made them attractive targets for malicious activities, including malware distribution and phishing attacks. Traditional detection approaches rely on URL analysis or image-based feature extraction, which may introduce significant computational overhead and limit real-time applicability, and their performance often depends on the quality of extracted features. Previous studies in malicious-content detection do not fully address QR code security when combining convolutional neural networks (CNNs) with recurrent neural networks (RNNs). This research proposes a deep learning model that integrates AlexNet for feature extraction, principal component analysis (PCA) for dimensionality reduction, and RNNs to detect malicious activity in QR code images. The proposed model achieves both efficiency and accuracy by transforming image data into a compact one-dimensional sequence. Experimental results, including five-fold cross-validation, demonstrate that the model using gated recurrent units (GRU) achieved an accuracy of 99.81% on the first dataset and 99.59% on the second dataset, with a computation time of only 7.433 ms per sample. A real-time prototype was also developed to demonstrate deployment feasibility. These results highlight the potential of the proposed approach for practical, real-time QR code threat detection.
Keywords: cybersecurity; quick response (QR) code; deep learning; recurrent neural network (RNN); gated recurrent unit (GRU); long short-term memory (LSTM)
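A minimal sketch of the pipeline shape the abstract describes: CNN features, PCA reduction, then a recurrent classifier over the compacted one-dimensional sequence. Keras ships no AlexNet, so random vectors stand in for AlexNet activations here, and all sizes are illustrative.

```python
import numpy as np
import tensorflow as tf
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
feats = rng.normal(size=(1000, 4096))          # stand-in for AlexNet fc-layer output
labels = rng.integers(0, 2, 1000)              # benign vs. malicious QR code

reduced = PCA(n_components=128).fit_transform(feats)   # compress the features
seq = reduced.reshape(-1, 16, 8)               # view each vector as a 16-step sequence

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(16, 8)),
    tf.keras.layers.GRU(32),                   # the GRU variant scored best in the paper
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(seq, labels, epochs=3, validation_split=0.2, verbose=0)
```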
12. Recent Techniques for Harvesting Energy from the Human Body (Cited: 1)
Authors: Nidal M. Turab, Hamza Abu Owida, Jamal I. Al-Nabulsi, Mwaffaq Abu-Alhaija. Computer Systems Science & Engineering (SCIE, EI), 2022, Issue 1, pp. 167-177 (11 pages).
The human body contains a near-infinite supply of energy in chemical, thermal, and mechanical forms. However, the majority of implantable and wearable devices are still operated by batteries, whose insufficient capacity and large size limit device lifespan and increase the risk of hazardous material leakage. Harvested body energy can be used to exceed the battery power limits of implantable and wearable devices. Moreover, novel materials and fabrication methods can be used to create various medical therapies and life-enhancing technologies. This review focuses on energy-harvesting technologies used in medical and health applications, primarily power collectors from the human body. Current approaches to harvesting energy from the bodies of living subjects for self-powered electronics are summarized. Using the human body as an energy source encompasses numerous topics: thermoelectric generators, power harvesting from kinetic energy, cardiovascular energy harvesting, and blood pressure. The review considers various perspectives for future research, which can provide a new forum for advancing technologies for the diagnosis, treatment, and prevention of diseases by integrating different energy harvesters with advanced electronics.
Keywords: harvesters; implantable medical devices; thermoelectric generators; cardiovascular systems; human motion harvesters
13. A Robust Asynchrophasor in PMU Using Second-Order Kalman Filter
Authors: Nayef Alqahtani, Ali Alqahtani. Computers, Materials & Continua (SCIE, EI), 2023, Issue 2, pp. 2557-2573 (17 pages).
Phasor Measurement Units (PMUs) provide Global Positioning System (GPS) time-stamped synchronized measurements of voltage and current, with the phase angle of the system, at certain points along the grid. These synchronized measurements are extracted in the form of amplitude and phase from various locations of the power grid to monitor and control the power system condition. A PMU device is a crucial part of the power equipment from both cost and operational points of view, so ongoing improvement of the PMU's core operation is essential for network operators to enhance grid quality and reduce operating expenses. This paper introduces a low-cost, low-complexity technique to optimize PMU performance using a Second-Order Kalman Filter. It is based on the asynchrophasor technique, minimizing the phase error when the signal is received from an access point or from the main access point. A MATLAB model has been created to implement the proposed method in the presence of Gaussian and non-Gaussian noise. The results, evaluated using Mean Square Error (MSE), show that the proposed Second-Order Kalman Filter outperforms the existing model. In the proposed PMU structure, the Second-Order Kalman Filter takes the place of the synchronization unit, clarifying the significance of the new design.
Keywords: distributed generation; asynchrophasor; Kalman filter; phasor estimation; phasor measurement unit; state variables; mean square error; signal-to-noise ratio
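A second-order Kalman filter for phase tracking can be written in a few lines: the state is [phase, frequency offset], the model assumes near-constant frequency, and noisy phase readings are the measurements. This is a generic textbook filter, not the paper's MATLAB model; all noise levels below are invented.

```python
import numpy as np

dt = 1e-3                                   # sampling interval (s)
F = np.array([[1.0, dt], [0.0, 1.0]])       # phase advances by freq * dt
H = np.array([[1.0, 0.0]])                  # we observe the phase only
Q = np.diag([1e-7, 1e-5])                   # process noise (assumed)
R = np.array([[0.05]])                      # measurement noise (assumed)

rng = np.random.default_rng(0)
true_phase = 0.3 + 2 * np.pi * 0.5 * dt * np.arange(2000)   # 0.5 Hz offset
z = true_phase + rng.normal(0, np.sqrt(R[0, 0]), 2000)      # noisy measurements

x = np.zeros(2)                             # state estimate [phase, freq]
P = np.eye(2)
for k in range(2000):
    x = F @ x                               # predict
    P = F @ P @ F.T + Q
    y = z[k] - H @ x                        # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + (K @ y).ravel()                 # update
    P = (np.eye(2) - K @ H) @ P

print(f"final phase error: {abs(x[0] - true_phase[-1]):.4f} rad")
```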
14. A Machine Learning Approach to Cyberbullying Detection in Arabic Tweets
Authors: Dhiaa Musleh, Atta Rahman, Mohammed Abbas Alkherallah, Menhal Kamel Al-Bohassan, Mustafa Mohammed Alawami, Hayder Ali Alsebaa, Jawad Ali Alnemer, Ghazi Fayez Al-Mutairi, May Issa Aldossary, Dalal A. Aldowaihi, Fahd Alhaidari. Computers, Materials & Continua (SCIE, EI), 2024, Issue 7, pp. 1033-1054 (22 pages).
With the rapid growth of internet usage, a new situation has been created that enables bullying. Cyberbullying has increased over the past decade, and it has the same adverse effects as face-to-face bullying, like anger, sadness, anxiety, and fear. With the anonymity people get on the internet, they tend to be more aggressive and express their emotions freely without considering the effects, which can be a reason for the increase in cyberbullying and is the main motive behind the current study. This study presents a thorough background of cyberbullying and the techniques used to collect, preprocess, and analyze the datasets. Moreover, a comprehensive review of the literature has been conducted to identify research gaps and effective techniques and practices for cyberbullying detection in various languages, from which it was deduced that there is significant room for improvement in the Arabic language. As a result, the current study focuses on the investigation of shortlisted machine learning algorithms in natural language processing (NLP) for the classification of Arabic datasets duly collected from Twitter (also known as X). In this regard, support vector machine (SVM), Naive Bayes (NB), Random Forest (RF), Logistic Regression (LR), Bootstrap Aggregating (Bagging), Gradient Boosting (GBoost), Light Gradient Boosting Machine (LightGBM), Adaptive Boosting (AdaBoost), and eXtreme Gradient Boosting (XGBoost) were shortlisted and investigated due to their effectiveness on similar problems. Finally, the scheme was evaluated by well-known performance measures like accuracy, precision, recall, and F1-score. Consequently, XGBoost exhibited the best performance with 89.95% accuracy, which is promising compared to the state-of-the-art.
Keywords: supervised machine learning; ensemble learning; cyberbullying; Arabic tweets; NLP
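As a rough sketch of the classical-ML pipeline the study evaluates (not the authors' code), the snippet below chains TF-IDF features into XGBoost, the study's best performer. The two-sentence corpus is a fabricated placeholder, and the paper's Arabic-text preprocessing is far richer.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from xgboost import XGBClassifier

tweets = ["example bullying tweet", "example friendly tweet"] * 50   # placeholder corpus
labels = [1, 0] * 50                                                 # 1 = cyberbullying

clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=2),    # word and bigram features
    XGBClassifier(n_estimators=300, learning_rate=0.1, eval_metric="logloss"),
)
clf.fit(tweets, labels)
print(clf.predict(["example friendly tweet"]))
```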
15. Accurate Phase Detection for ZigBee Using Artificial Neural Network
Authors: Ali Alqahtani, Abdulaziz A. Alsulami, Saeed Alahmari, Mesfer Alrizq. Intelligent Automation & Soft Computing (SCIE), 2023, Issue 6, pp. 2505-2518 (14 pages).
The IEEE 802.15.4 standard has been widely used in modern industry due to its benefits for stability, scalability, and enhancement of wireless mesh networking. The standard uses a physical layer with binary phase-shift keying (BPSK) modulation and can operate in two frequency bands, 868 and 915 MHz. Frequency noise can interfere with the BPSK signal, distorting it before it arrives at the receiver. Filtering noise from the BPSK signal is therefore essential to carry the signal from the sender to the receiver with less error, mitigating its negative consequences and increasing its capability in industrial wireless sensor networks. Researchers have reported a positive impact of utilizing the Kalman filter for detecting the modulated signal at the receiver side in different communication systems, including ZigBee. Meanwhile, artificial neural network (ANN) and machine learning (ML) models have delivered strong results in predicting signals for detection and classification purposes. This paper develops a neural-network predictive detection method to enhance the performance of BPSK modulation. First, a simulation-based model is used to generate the modulated BPSK signal of the IEEE 802.15.4 wireless personal area network (WPAN) standard. Then, Gaussian noise is injected into the BPSK simulation model. To reduce the noise of BPSK phase signals, a recurrent neural network (RNN) model is implemented and integrated at the receiver side to estimate the BPSK phase signal. We evaluated our predictive-detection RNN model using mean square error (MSE), correlation coefficient, recall, and F1-score metrics. The results show that our predictive-detection method is superior to the existing model in terms of MSE and correlation coefficient (R-value) across different signal-to-noise ratio (SNR) values. In addition, our RNN-based model scored 98.71% and 96.34% on recall and F1-score, respectively.
Keywords: neural networks; RNN; BPSK; phase detection; WPAN; IEEE 802.15.4; signal demodulation
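A minimal sketch of the setup the abstract describes: a BPSK waveform corrupted by additive Gaussian noise, with a small recurrent network trained to recover the clean signal from noisy windows. Window size, noise level, and network size are illustrative choices, not the paper's.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 2000)
symbols = 2 * bits - 1                            # BPSK: bit 0 -> -1, bit 1 -> +1
clean = np.repeat(symbols, 8).astype("float32")   # 8 samples per symbol
noisy = clean + rng.normal(0, 0.5, clean.shape)   # AWGN channel

win = 32                                          # window the stream for the RNN
X = np.lib.stride_tricks.sliding_window_view(noisy, win)[:, :, None]
y = clean[win - 1:]                               # target: current clean sample

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(win, 1)),
    tf.keras.layers.GRU(16),
    tf.keras.layers.Dense(1, activation="tanh"),  # clean BPSK lives in [-1, 1]
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=64, verbose=0)
print("denoised MSE:", model.evaluate(X, y, verbose=0))
```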
16. Application Resource Management for Highly Computational Applications in the Operational Environment: A Critical Review
Authors: Joseph Balikuddembe, Jael Gudu. Journal of Software Engineering and Applications, 2017, Issue 9, pp. 777-786 (10 pages).
Computational resources have such a significant influence on the operation of any software application that it is important to understand how applications utilize these resources. Modern resource-intensive enterprise and scientific applications are creating a growing demand for high-performance computing infrastructures. They constantly interact with and rely heavily on complex resources, yet they often operate in resource-limited environments while handling massive data, both in size and complexity. Software application services, processes and transactions compete for the much-required but scarce resources. This creates the need to improve existing resource allocation and management approaches in such operational environments, and to propose new ones where necessary. Software developers analyze application operating environments using diverse analysis and design methods. Our aim, therefore, is to design a tool able to work with a hybrid of adaptive and prediction-based resource management and allocation models while applying a priority-based job scheduling algorithm, to try to solve, at least partially, the application resource management challenges currently faced in such environments.
Keywords: resource management; resource-intensive applications; resource allocation; design
17. A Neuro-Fuzzy Approach to Road Traffic Congestion Prediction
Authors: Mohammed Gollapalli, Atta-ur-Rahman, Dhiaa Musleh, Nehad Ibrahim, Muhammad Adnan Khan, Sagheer Abbas, Ayesha Atta, Muhammad Aftab Khan, Mehwash Farooqui, Tahir Iqbal, Mohammed Salih Ahmed, Mohammed Imran B. Ahmed, Dakheel Almoqbil, Majd Nabeel, Abdullah Omer. Computers, Materials & Continua (SCIE, EI), 2022, Issue 10, pp. 295-310 (16 pages).
The fast-paced growth of artificial intelligence applications provides unparalleled opportunities to improve the efficiency of various systems. The transportation sector, for example, faces many obstacles following the implementation and integration of different vehicular and environmental aspects worldwide. Traffic congestion is among the major issues in this regard and demands serious attention due to the rapid growth in the number of vehicles on the road. To address this overwhelming problem, this article proposes a cloud-based intelligent road traffic congestion prediction model empowered with a hybrid neuro-fuzzy approach. The aim of the study is to reduce the delay in the queues that vehicles experience at different road junctions across the city. The proposed model is also intended to help automated traffic control systems by minimizing congestion, particularly in a smart city environment where observational data is obtained from various Internet of Things (IoT) sensors implanted across the road network. After due preprocessing on the cloud server, the proposed approach feeds this data to the neuro-fuzzy engine. Consequently, it attains a high level of accuracy by means of intelligent decision making with a minimum error rate. Simulation results reveal the accuracy of the proposed model as 98.72% during the validation phase, in contrast to the highest accuracies achieved by state-of-the-art techniques in the literature, 90.6%, 95.84%, 97.56% and 98.03%, respectively. In the training phase, the proposed scheme exhibits 99.214% accuracy. The proposed prediction model is a potential contribution towards smart city environments.
Keywords: neuro-fuzzy; machine learning; congestion prediction; AI; cloud computing; smart cities
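The neuro-fuzzy engine is not specified in the abstract, so the snippet below shows only the kind of fuzzy inference step such systems build on: fuzzify two hypothetical sensor readings, fire two rules, and defuzzify to a congestion score. All membership breakpoints and rule weights are invented.

```python
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

delay, vehicles = 42.0, 180.0                      # hypothetical sensor readings

high_delay   = tri(delay, 20, 60, 100)             # membership ~0.55
heavy_volume = tri(vehicles, 100, 250, 400)        # membership ~0.53

# Rule 1: high delay AND heavy volume -> congested (consequent score 0.9)
# Rule 2: NOT high delay              -> free-flow (consequent score 0.2)
w1 = min(high_delay, heavy_volume)
w2 = 1.0 - high_delay
congestion = (w1 * 0.9 + w2 * 0.2) / (w1 + w2)     # weighted-average defuzzification
print(f"congestion score: {congestion:.2f}")
```

In a neuro-fuzzy system, the membership breakpoints and rule weights hard-coded here would be the parameters a neural network learns from the IoT sensor data.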
18. Auto Opening Door and Car Identification
Authors: Sanner H. Mahmood, Oulla G. Hassan, Ayad M. Kwad, Safa F. Abass. Journal of Computer and Communications, 2016, Issue 15, pp. 132-141 (10 pages).
This paper designs an automatic door system that uses a unique wireless ID carried over infrared or Bluetooth. The system consists of a sensing unit, a control unit and a drive unit that open and close the door at the entrance for a car holding the unique ID. The process is controlled by an Arduino Leonardo programmed with the free, open-source Arduino IDE: it receives the ID code, which the car sends through an IR LED or via Bluetooth from a mobile application, decodes it, and switches on the driver that controls the DC motor. The system was designed considering factors such as low cost, low power requirements, availability of components and short range, so there is no interference. The hardware design and software development are described, and all tests indicate that every component performs according to the initial design of this research.
Keywords: automatic; auto opening; car; door; identification; security
19. Agile Factor Flexible Industry 4.0 with MES Manufacturing Execution System Along with ERP Back-End Integration: Ready-Made Garments as a Case Study
Authors: Mohamed Mostafa Mohamed, Abdulaziz Saleh Alraddadi. American Journal of Operations Research, 2024, Issue 4, pp. 137-150 (14 pages).
Industry 4.0, or the Fourth Industrial Revolution, is based on digitizing the manufacturing process: it combines various digital technologies, including computers, ERP software, IoT, machine learning and AI techniques, Manufacturing Execution Systems (MES), and big data analytics, to create a new, fully digitized manufacturing system. The Critical Success Factors (CSFs) of MES adoption are measured both quantitatively and qualitatively. We use the case of ready-made garments to improve each of the three Overall Equipment Efficiency (OEE) factors: Availability, Performance, and Quality. In this study, we adopt real-time management of production activities on the shop floor, from order receipt to finished products, and then measure the improvement.
Keywords: Industry 4.0; ERP; MES; IoT
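OEE, the measure the case study improves, is the product of its three factors, which makes a worked example straightforward. The shop-floor numbers below are made up for illustration.

```python
planned_time = 480            # minutes in the shift
downtime     = 60             # breakdowns + changeovers
ideal_rate   = 2.0            # garments per minute at rated speed
produced     = 700            # garments actually produced
defective    = 35             # garments failing inspection

availability = (planned_time - downtime) / planned_time              # 0.875
performance  = produced / (ideal_rate * (planned_time - downtime))   # ~0.833
quality      = (produced - defective) / produced                     # 0.95

oee = availability * performance * quality
print(f"OEE = {availability:.3f} * {performance:.3f} * {quality:.3f} = {oee:.1%}")
```

With these numbers OEE comes out near 69%; raising any one factor, as the case study does with real-time MES data, lifts the product directly.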
20. Deep Learning Based Side-Channel Attack Detection for Mobile Devices Security in 5G Networks
Authors: Amjed A. Ahmed, Mohammad Kamrul Hasan, Ali Alqahtani, Shayla Islam, Bishwajeet Pandey, Leila Rzayeva, Huda Saleh Abbas, Azana Hafizah Mohd Aman, Nayef Alqahtani. Tsinghua Science and Technology, 2025, Issue 3, pp. 1012-1026 (15 pages).
Mobile devices within Fifth Generation (5G) networks, typically equipped with Android systems, serve as a bridge connecting digital gadgets such as global positioning systems, mobile devices, and wireless routers, which are vital in facilitating end-user communication requirements. However, the security of Android systems has been challenged by the sensitive data involved, leading to vulnerabilities in mobile devices used in 5G networks. These vulnerabilities expose mobile devices to cyber-attacks, primarily resulting from security gaps. Zero-permission apps in Android can exploit these channels to access sensitive information, including user identities, login credentials, and geolocation data. One such attack leverages “zero-permission” sensors like accelerometers and gyroscopes, enabling attackers to gather information about the smartphone's user. This underscores the importance of fortifying mobile devices against potential future attacks. Our research focuses on a new recurrent neural network prediction model, which has proved highly effective for detecting side-channel attacks on mobile devices in 5G networks. We conducted comparative studies against the state of the art to validate our experimental approach. The results demonstrate that even a small amount of training data can accurately recognize 37.5% of previously unseen user-typed words. Moreover, our tap detection mechanism achieves a 92% accuracy rate, a crucial factor for text inference. These findings have significant practical implications, as they reinforce mobile device security in 5G networks, enhancing user privacy and data protection.
Keywords: Fifth Generation (5G) networks; smartphone; information leakage; Side-Channel Attack (SCA); deep learning
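A minimal sketch of the sensing primitive involved: a small recurrent model labelling fixed windows of zero-permission accelerometer samples as tap vs. no-tap. The synthetic data simply injects a spike into tap windows; real traces are far subtler, and this is not the authors' model.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
n, steps = 2000, 50                               # windows of 50 samples, 3 axes
X = rng.normal(0, 0.05, (n, steps, 3)).astype("float32")
y = rng.integers(0, 2, n)
X[y == 1, 25, :] += 0.8                           # crude stand-in for a tap spike

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(steps, 3)),
    tf.keras.layers.GRU(24),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # tap vs. no-tap
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, validation_split=0.2, verbose=0)
```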