Journal Articles
8 articles found
1. IoT Smart Devices Risk Assessment Model Using Fuzzy Logic and PSO (Cited: 1)
Authors: Ashraf S. Mashaleh, Noor Farizah Binti Ibrahim, Mohammad Alauthman, Mohammad Almseidin, Amjad Gawanmeh. Computers, Materials & Continua, SCIE/EI, 2024, No. 2, pp. 2245-2267 (23 pages).
Increasing Internet of Things (IoT) device connectivity makes botnet attacks more dangerous, carrying catastrophic hazards. As IoT botnets evolve, their dynamic and multifaceted nature hampers conventional detection methods. This paper proposes a risk assessment framework based on fuzzy logic and Particle Swarm Optimization (PSO) to address the risks associated with IoT botnets. Fuzzy logic addresses IoT threat uncertainties and ambiguities methodically. Fuzzy component settings are optimized using PSO to improve accuracy. The methodology allows for more nuanced reasoning by transitioning from binary to continuous assessment. Instead of relying on expert inputs, PSO tunes rules and membership functions in a data-driven manner. This study presents a complete IoT botnet risk assessment system. The methodology helps security teams allocate resources by categorizing threats as high, medium, or low severity. The approach is demonstrated on the CICIoT2023 dataset for assessing cyber risks. Our research has implications beyond detection, as it provides a proactive approach to risk management and promotes the development of more secure IoT environments.
Keywords: IoT botnet detection; risk assessment; fuzzy logic; particle swarm optimization (PSO); cybersecurity; interconnected devices
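The abstract above describes tuning fuzzy membership functions with PSO to turn binary threat judgments into continuous risk scores. A minimal Python sketch of that idea follows, assuming a single triangular membership function and a toy PSO fitted to hypothetical risk labels; the function names and sample values are illustrative and not taken from the paper.

```python
import numpy as np

def tri_membership(x, a, b, c):
    """Triangular membership: rises from a to a peak at b, falls to c."""
    left = (x - a) / max(b - a, 1e-9)
    right = (c - x) / max(c - b, 1e-9)
    return np.clip(np.minimum(left, right), 0.0, 1.0)

def fitness(params, signals, targets):
    a, b, c = np.sort(params)                      # keep breakpoints ordered
    return np.mean((tri_membership(signals, a, b, c) - targets) ** 2)

def pso(signals, targets, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    pos = rng.uniform(0, 1, (n_particles, 3))       # candidate breakpoint triples
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([fitness(p, signals, targets) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0, 1)
        vals = np.array([fitness(p, signals, targets) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return np.sort(gbest)

# Hypothetical normalized botnet-activity signals and expert-style risk labels.
signals = np.array([0.1, 0.4, 0.5, 0.8, 0.9])
targets = np.array([0.1, 0.7, 0.9, 0.4, 0.2])
print("tuned membership breakpoints:", pso(signals, targets))
```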
2. Ensemble-Based Approach for Efficient Intrusion Detection in Network Traffic (Cited: 2)
Authors: Ammar Almomani, Iman Akour, Ahmed M. Manasrah, Omar Almomani, Mohammad Alauthman, Esra’a Abdullah, Amaal Al Shwait, Razan Al Sharaa. Intelligent Automation & Soft Computing, SCIE, 2023, No. 8, pp. 2499-2517 (19 pages).
The exponential growth of Internet and network usage has necessitated heightened security measures to protect against data and network breaches. Intrusions, executed through network packets, pose a significant challenge for firewalls to detect and prevent due to the similarity between legitimate and intrusion traffic. The vast network traffic volume also complicates most network monitoring systems and algorithms. Several intrusion detection methods have been proposed, with machine learning techniques regarded as promising for dealing with these incidents. This study presents an Intrusion Detection System based on stacking ensemble learning with three base models (Random Forest, Decision Tree, and k-Nearest Neighbors). The proposed system employs pre-processing techniques to enhance classification efficiency and integrates seven machine learning algorithms. The stacking ensemble technique increases performance by combining the three base models with a meta-model represented by the Logistic Regression algorithm. Evaluated on the UNSW-NB15 dataset, the proposed IDS achieved an accuracy of 96.16% in the training phase and 97.95% in the testing phase, with precision of 97.78% and 98.40% for training and testing, respectively. The obtained results demonstrate improvements in other measurement criteria as well.
Keywords: intrusion detection system (IDS); machine learning techniques; stacking ensemble; random forest; decision tree; k-nearest neighbors
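The stacking layout named in this abstract (Random Forest, Decision Tree, and k-Nearest Neighbors base models with a Logistic Regression meta-model) can be sketched with scikit-learn. The snippet below uses synthetic data as a stand-in for the pre-processed UNSW-NB15 features; it illustrates the ensemble structure, not the paper's exact pipeline or hyperparameters.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score

# Synthetic stand-in for pre-processed network-flow features (not the real UNSW-NB15).
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

# Three base learners feeding a Logistic Regression meta-model.
stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=42)),
        ("dt", DecisionTreeClassifier(random_state=42)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5,
)
stack.fit(X_tr, y_tr)
pred = stack.predict(X_te)
print("accuracy:", accuracy_score(y_te, pred))
print("precision:", precision_score(y_te, pred))
```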
3. Smart Shoes Safety System for the Blind People Based on (IoT) Technology
Authors: Ammar Almomani, Mohammad Alauthman, Amal Malkawi, Hadeel Shwaihet, Batool Aldigide, Donia Aldabeek, Karmen Abu Hamoodeh. Computers, Materials & Continua, SCIE/EI, 2023, No. 7, pp. 415-436 (22 pages).
People’s lives have become easier and simpler as technology has proliferated, especially with the Internet of Things (IoT). The biggest problem for blind people is figuring out how to get where they want to go, and they often depend on sighted people for help. Smart shoes are a technique that helps blind people find their way when they walk. So, a special shoe has been made to help blind people walk safely without worrying about running into other people or solid objects. In this research, we present a new safety system and a smart shoe for blind people. The system is based on Internet of Things (IoT) technology and uses three ultrasonic sensors to allow users to hear and react to barriers. It has ultrasonic sensors and a microprocessor that can tell how far away something is and whether there are any obstacles. Water and flame sensors were also used, and a sound alerts the person when an obstacle is near. The sensors use Global Positioning System (GPS) technology to detect motion from almost every side, monitor users, and ensure they are safe. To test our proposal, we gave a questionnaire to 100 people. The questionnaire has eleven questions, and 99.1% of the respondents said that the product meets their needs.
Keywords: IoT; smart shoe; sensors; GSM; GPS; Arduino; blind people; safety system
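A rough sketch of the obstacle-alert logic implied by the abstract, in plain Python: converting an ultrasonic echo time into a distance and flagging anything closer than a threshold. Sensor pins, GSM/GPS handling, and the buzzer are hardware specific and omitted; the threshold and function names below are hypothetical.

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343   # roughly 343 m/s at room temperature
ALERT_DISTANCE_CM = 50              # hypothetical warning threshold

def echo_to_distance_cm(echo_time_us: float) -> float:
    """Convert an ultrasonic round-trip echo time (microseconds) into distance (cm)."""
    return (echo_time_us * SPEED_OF_SOUND_CM_PER_US) / 2

def should_alert(front_us: float, left_us: float, right_us: float) -> bool:
    """True if any of the three sensors sees an obstacle inside the threshold."""
    distances = [echo_to_distance_cm(t) for t in (front_us, left_us, right_us)]
    return min(distances) < ALERT_DISTANCE_CM

# Example: a 2000 microsecond round trip is about 34 cm, so the alert would sound.
print(should_alert(2000, 6000, 6000))
```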
4. Cyberbullying Detection and Recognition with Type Determination Based on Machine Learning
Authors: Khalid M. O. Nahar, Mohammad Alauthman, Saud Yonbawi, Ammar Almomani. Computers, Materials & Continua, SCIE/EI, 2023, No. 6, pp. 5307-5319 (13 pages).
Social media networks are becoming essential to our daily activities, and many issues arise from this deep involvement in our lives. Cyberbullying is a social media network issue, a global crisis affecting the victims and society as a whole. It results from a misunderstanding regarding freedom of speech. In this work, we propose a methodology for detecting such behaviors (bullying, harassment, and hate-related texts) using supervised machine learning algorithms (SVM, Naïve Bayes, Logistic Regression, and Random Forest) and for predicting a topic associated with these text data using unsupervised natural language processing, such as latent Dirichlet allocation. In addition, we used accuracy, precision, recall, and F1 score to assess the classifiers. Results show that logistic regression, support vector machine, random forest, and Naïve Bayes achieve 95%, 94.97%, 94.66%, and 93.1% accuracy, respectively.
Keywords: cyberbullying; social media; naïve Bayes; support vector machine; natural language processing; LDA
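The detection-plus-topic workflow in this abstract can be outlined with scikit-learn: a supervised classifier over TF-IDF features (logistic regression is one of the four classifiers named) and latent Dirichlet allocation for unsupervised topic assignment. The toy texts and labels below are illustrative only and do not come from the paper's corpus.

```python
from sklearn.feature_extraction.text import TfidfVectorizer, CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.decomposition import LatentDirichletAllocation

# Toy texts standing in for the labelled social-media corpus used in the paper.
texts = ["you are awesome", "nobody likes you loser", "great game today", "go away you idiot"]
labels = [0, 1, 0, 1]   # 0 = benign, 1 = bullying (illustrative)

# Supervised detection: TF-IDF features + logistic regression.
tfidf = TfidfVectorizer()
clf = LogisticRegression().fit(tfidf.fit_transform(texts), labels)
print(clf.predict(tfidf.transform(["you loser"])))

# Unsupervised topic assignment with latent Dirichlet allocation.
counts = CountVectorizer().fit_transform(texts)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
print(lda.transform(counts).argmax(axis=1))   # dominant topic per text
```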
5. Age and Gender Classification Using Backpropagation and Bagging Algorithms
Authors: Ammar Almomani, Mohammed Alweshah, Waleed Alomoush, Mohammad Alauthman, Aseel Jabai, Anwar Abbass, Ghufran Hamad, Meral Abdalla, Brij B. Gupta. Computers, Materials & Continua, SCIE/EI, 2023, No. 2, pp. 3045-3062 (18 pages).
Voice classification is important in creating more intelligent systems that help with student exams, identifying criminals, and security systems. The main aim of the research is to develop a system able to predict and classify gender, age, and accent. So, a new system called Classifying Voice Gender, Age, and Accent (CVGAA) is proposed. Backpropagation and bagging algorithms are designed to improve voice recognition systems that incorporate sensory voice features, such as rhythm-based features, used to train the system to distinguish between the two gender categories. It has high precision compared to other algorithms used for this problem: the adaptive backpropagation algorithm had an accuracy of 98% and the bagging algorithm had an accuracy of 98.10% on the gender identification data. Bagging has the best accuracy among all algorithms, with 55.39% accuracy for age classification on the common voice dataset and 78.94% accuracy for accent classification on a speech accent dataset.
Keywords: classify voice gender; accent; age; bagging algorithms; backpropagation algorithms; AI classifiers
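A small scikit-learn sketch of the two approaches named in the abstract: a backpropagation-trained multilayer perceptron and a bagging ensemble (whose default base learner is a decision tree). Synthetic features stand in for the real acoustic and rhythm-based features and the voice datasets; nothing below reproduces the paper's configuration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for acoustic/rhythm features with a binary gender label.
X, y = make_classification(n_samples=1000, n_features=20, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)

# Backpropagation-trained network for gender classification.
mlp = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=1).fit(X_tr, y_tr)

# Bagging ensemble (default base estimator is a decision tree).
bag = BaggingClassifier(n_estimators=50, random_state=1).fit(X_tr, y_tr)

print("MLP accuracy:", accuracy_score(y_te, mlp.predict(X_te)))
print("Bagging accuracy:", accuracy_score(y_te, bag.predict(X_te)))
```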
6. A Robust Model for Translating Arabic Sign Language into Spoken Arabic Using Deep Learning
Authors: Khalid M. O. Nahar, Ammar Almomani, Nahlah Shatnawi, Mohammad Alauthman. Intelligent Automation & Soft Computing, SCIE, 2023, No. 8, pp. 2037-2057 (21 pages).
This study presents a novel and innovative approach to automatically translating Arabic Sign Language (ATSL) into spoken Arabic. The proposed solution utilizes a deep learning-based classification approach and the transfer learning technique to retrain 12 image recognition models. The image-based translation method maps sign language gestures to corresponding letters or words using distance measures and classification as a machine learning technique. The results show that the proposed model is more accurate and faster than traditional image-based models in classifying Arabic-language signs, with a translation accuracy of 93.7%. This research makes a significant contribution to the field of ATSL. It offers a practical solution for improving communication for individuals with special needs, such as the deaf and mute community. This work demonstrates the potential of deep learning techniques in translating sign language into natural language and highlights the importance of ATSL in facilitating communication for individuals with disabilities.
Keywords: sign language; deep learning; transfer learning; machine learning; automatic translation of sign language; natural language processing; Arabic sign language
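The transfer-learning recipe described here (retraining pretrained image recognition models on sign-gesture images) can be sketched with tf.keras. The snippet assumes a MobileNetV2 backbone and a hypothetical class count; the paper actually retrains 12 different models, and its datasets are not reproduced here.

```python
import tensorflow as tf

NUM_CLASSES = 32   # hypothetical number of Arabic sign classes

# Reuse a pretrained ImageNet backbone and train only a new classification head.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False   # freeze the backbone weights

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # train_ds/val_ds: gesture image datasets
```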
7. Analyzing darknet traffic through machine learning and NeuCube spiking neural networks
Authors: Iman Akour, Mohammad Alauthman, Khalid M. O. Nahar, Ammar Almomani, Brij B. Gupta. Intelligent and Converged Networks, 2024, No. 4, pp. 265-283 (19 pages).
The rapidly evolving darknet enables a wide range of cybercrimes through anonymous and untraceable communication channels. Effective detection of clandestine darknet traffic is therefore critical yet immensely challenging. This research demonstrates how advanced machine learning and specialized deep learning techniques can significantly enhance darknet traffic analysis to strengthen cybersecurity. Combining diverse classifiers such as random forest and naïve Bayes with a novel spiking neural network architecture provides a robust foundation for identifying concealed threats. Evaluation on the CIC-Darknet2020 dataset establishes state-of-the-art results, with 98% accuracy from the random forest model and 84.31% accuracy from the spiking neural network. This pioneering application of artificial intelligence advances the frontiers of analyzing the complex characteristics and behaviours of darknet communication. The proposed techniques lay the groundwork for improved threat intelligence, real-time monitoring, and resilient cyber defense systems against the evolving landscape of cyber threats.
Keywords: darknet traffic; machine learning; deep learning; spiking neural network; PCA
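The classical-classifier part of this pipeline (random forest and naïve Bayes, with PCA from the keywords) is straightforward in scikit-learn; the NeuCube spiking neural network is a specialized framework and is not sketched here. Synthetic flow features stand in for CIC-Darknet2020, so the numbers printed below are not the paper's results.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.decomposition import PCA
from sklearn.metrics import accuracy_score

# Synthetic flow features standing in for CIC-Darknet2020 (four traffic classes).
X, y = make_classification(n_samples=3000, n_features=30, n_informative=15,
                           n_classes=4, random_state=7)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=7)

# PCA for dimensionality reduction before classification.
pca = PCA(n_components=10).fit(X_tr)
X_tr_p, X_te_p = pca.transform(X_tr), pca.transform(X_te)

for name, model in [("random forest", RandomForestClassifier(random_state=7)),
                    ("naive Bayes", GaussianNB())]:
    model.fit(X_tr_p, y_tr)
    print(name, "accuracy:", accuracy_score(y_te, model.predict(X_te_p)))
```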
8. Priority-Based Scheduling and Orchestration in Edge-Cloud Computing: A Deep Reinforcement Learning-Enhanced Concurrency Control Approach
Authors: Mohammad A. Al Khaldy, Ahmad Nabot, Ahmad Al-Qerem, Mohammad Alauthman, Amina Salhi, Suhaila Abuowaida, Naceur Chihaoui. Computer Modeling in Engineering & Sciences, 2025, No. 10, pp. 673-697 (25 pages).
The exponential growth of Internet of Things (IoT) devices has created unprecedented challenges in data processing and resource management for time-critical applications. Traditional cloud computing paradigms cannot meet the stringent latency requirements of modern IoT systems, while pure edge computing faces resource constraints that limit processing capabilities. This paper addresses these challenges by proposing a novel Deep Reinforcement Learning (DRL)-enhanced priority-based scheduling framework for hybrid edge-cloud computing environments. Our approach integrates adaptive priority assignment with a two-level concurrency control protocol that ensures both optimal performance and data consistency. The framework introduces three key innovations: (1) a DRL-based dynamic priority assignment mechanism that learns from system behavior, (2) a hybrid concurrency control protocol combining local edge validation with global cloud coordination, and (3) an integrated mathematical model that formalizes sensor-driven transactions across edge-cloud architectures. Extensive simulations across diverse workload scenarios demonstrate significant quantitative improvements: 40% latency reduction, 25% throughput increase, 85% resource utilization (compared to 60% for heuristic methods), 40% reduction in energy consumption (300 vs. 500 J per task), and 50% improvement in scalability factor (1.8 vs. 1.2 for EDF) compared to state-of-the-art heuristic and meta-heuristic approaches. These results establish the framework as a robust solution for large-scale IoT and autonomous applications requiring real-time processing with consistency guarantees.
Keywords: edge computing; cloud computing; scheduling algorithms; orchestration strategies; deep reinforcement learning; concurrency control; real-time systems; IoT
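The paper's DRL-based priority assignment is far richer than can be shown here, but a toy tabular agent conveys the idea of learning edge-versus-cloud placement from reward feedback. The state buckets, reward shaping, and policy read-out below are entirely hypothetical and stand in for the framework's actual deep reinforcement learning design.

```python
import random

# Coarse state: (edge_load, urgent); actions: 0 = run at the edge, 1 = offload to the cloud.
ACTIONS = (0, 1)
ALPHA, EPSILON = 0.1, 0.2
Q = {}   # Q[(state, action)] -> learned value

def reward(load, urgent, action):
    """Hypothetical reward: urgent tasks favor the low-latency edge unless it is
    heavily loaded; the cloud is a safe but slower default."""
    if action == 0:
        return (1.0 - load) + (0.5 if urgent else 0.0)
    return 0.6

def choose(state):
    if random.random() < EPSILON:                                   # explore
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q.get((state, a), 0.0))       # exploit

random.seed(0)
for _ in range(10000):                                              # single-step scheduling episodes
    state = (random.choice((0.2, 0.5, 0.9)), random.choice((True, False)))
    action = choose(state)
    r = reward(*state, action)
    old = Q.get((state, action), 0.0)
    Q[(state, action)] = old + ALPHA * (r - old)                    # bandit-style Q update

# Learned policy: offload when the edge is at 0.9 load and the task is not urgent.
print(max(ACTIONS, key=lambda a: Q.get(((0.9, False), a), 0.0)))
```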