Increasing Internet of Things (IoT) device connectivity makes botnet attacks more dangerous, carrying catastrophic hazards. As IoT botnets evolve, their dynamic and multifaceted nature hampers conventional detection methods. This paper proposes a risk assessment framework based on fuzzy logic and Particle Swarm Optimization (PSO) to address the risks associated with IoT botnets. Fuzzy logic handles the uncertainties and ambiguities of IoT threats methodically, and PSO optimizes the settings of the fuzzy components to improve accuracy. The methodology allows for more nuanced reasoning by transitioning from binary to continuous assessment. Instead of relying solely on expert input, PSO tunes the rules and membership functions in a data-driven manner. This study presents a complete IoT botnet risk assessment system. The methodology helps security teams allocate resources by categorizing threats as high, medium, or low severity. This study shows how the CICIoT2023 dataset can be used to assess cyber risks. Our research has implications beyond detection, as it provides a proactive approach to risk management and promotes the development of more secure IoT environments.
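To make the fuzzy-plus-PSO idea concrete, the sketch below scores two normalised flow features with sigmoid "high" memberships, combines three rules into a continuous risk value, and lets a plain global-best PSO loop tune the membership crossover points against toy labels. The feature names, rule weights, severity cut-offs, and synthetic data are illustrative assumptions, not the paper's actual rule base or the CICIoT2023 feature set.

```python
import numpy as np

rng = np.random.default_rng(42)

def high(x, c, w=0.15):
    """Degree to which x is 'high', with crossover point c and slope width w."""
    return 1.0 / (1.0 + np.exp(-(x - c) / w))

def risk(x, params):
    """Combine three fuzzy rules into a continuous risk score in [0, 1]."""
    h_rate = high(x[:, 0], params[0])        # packet rate is high
    h_anom = high(x[:, 1], params[1])        # anomaly score is high
    w_high = h_rate * h_anom                 # both high      -> risk 1.0
    w_low = (1 - h_rate) * (1 - h_anom)      # both low       -> risk 0.0
    w_med = 1.0 - w_high - w_low             # mixed evidence -> risk 0.5
    return 1.0 * w_high + 0.5 * w_med + 0.0 * w_low

def loss(params, x, y):
    """Mean squared error between the fuzzy score and 0/1 attack labels."""
    return float(np.mean((risk(x, params) - y) ** 2))

# Toy labelled flows standing in for normalised CICIoT2023-style features.
x = rng.random((200, 2))
y = ((x[:, 0] > 0.6) & (x[:, 1] > 0.5)).astype(float)

# Plain global-best PSO over the two membership crossover parameters.
n_particles, n_dims, iters = 20, 2, 60
pos = rng.random((n_particles, n_dims))
vel = np.zeros((n_particles, n_dims))
pbest = pos.copy()
pbest_val = np.array([loss(p, x, y) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n_particles, n_dims)), rng.random((n_particles, n_dims))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 1.0)
    vals = np.array([loss(p, x, y) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[pbest_val.argmin()].copy()

scores = risk(x, gbest)
severity = np.where(scores > 0.66, "high", np.where(scores > 0.33, "medium", "low"))
print("tuned crossovers:", gbest.round(3), "| first severities:", severity[:5])
```

Thresholding the continuous score (here at 0.33 and 0.66, both placeholders) reproduces the low/medium/high severity triage described above.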
The exponential growth of Internet and network usage has necessitated heightened security measures to protect against data and network breaches. Intrusions, executed through network packets, pose a significant challenge for firewalls to detect and prevent due to the similarity between legitimate and intrusion traffic. The vast volume of network traffic also complicates most network monitoring systems and algorithms. Several intrusion detection methods have been proposed, with machine learning techniques regarded as promising for dealing with these incidents. This study presents an Intrusion Detection System (IDS) based on stacking ensemble learning with Random Forest, Decision Tree, and k-Nearest-Neighbors as base models. The proposed system employs pre-processing techniques to enhance classification efficiency and integrates seven machine learning algorithms. The stacking ensemble technique increases performance by combining the three base models (Random Forest, Decision Tree, and k-Nearest-Neighbors) with a meta-model represented by the Logistic Regression algorithm. Evaluated on the UNSW-NB15 dataset, the proposed IDS achieved an accuracy of 96.16% in the training phase and 97.95% in the testing phase, with precision of 97.78% and 98.40% for training and testing, respectively. The obtained results also demonstrate improvements in the other measurement criteria.
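A minimal sketch of the stacking arrangement described above follows: Random Forest, Decision Tree, and k-NN base learners feed out-of-fold predictions to a logistic-regression meta-model via scikit-learn's StackingClassifier. Synthetic data stands in for the preprocessed UNSW-NB15 features, and the hyperparameters are placeholders rather than the paper's tuned values.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

# Synthetic tabular data standing in for preprocessed UNSW-NB15 flow features.
X, y = make_classification(n_samples=2000, n_features=30, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("dt", DecisionTreeClassifier(random_state=0)),
        ("knn", make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))),
    ],
    final_estimator=LogisticRegression(max_iter=1000),  # meta-model
    cv=5,  # out-of-fold base predictions feed the meta-model
)
stack.fit(X_tr, y_tr)
print("test accuracy:", round(stack.score(X_te, y_te), 4))
```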
People's lives have become easier and simpler as technology has proliferated. This is especially true with the Internet of Things (IoT). The biggest problem for blind people is figuring out how to get where they want to go, and they often have to rely on sighted people for help. Smart shoes are a technique that helps blind people find their way when they walk. Therefore, a special shoe has been made to help blind people walk safely without worrying about running into other people or solid objects. In this research, we develop a new safety system and a smart shoe for blind people. The system is based on Internet of Things (IoT) technology and uses three ultrasonic sensors to allow users to hear and react to barriers. A microprocessor reads the ultrasonic sensors to measure how far away an object is and whether any obstacles are present. Water and flame sensors were also included, and an audible alert lets the wearer know when an obstacle or hazard is near. The sensors detect motion from almost every side, and Global Positioning System (GPS) technology is used to monitor the wearer's location and help ensure their safety. To test our proposal, we gave a questionnaire of eleven questions to 100 people, and 99.1% of the respondents said that the product meets their needs.
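For illustration only, the fragment below sketches the kind of alert decision such a shoe might run: three ultrasonic distance readings plus water and flame flags are compared against a warning threshold and turned into audible messages. The sensor-reading functions, the 80 cm threshold, and the printed "ALERT" stand-in for the buzzer are hypothetical, not the firmware or thresholds of the actual prototype.

```python
OBSTACLE_CM = 80  # assumed warning distance, not a value from the paper

def read_ultrasonic_cm():
    """Stand-in for the three ultrasonic readings (front, left, right)."""
    return {"front": 150.0, "left": 60.0, "right": 200.0}

def read_hazard_flags():
    """Stand-in for the water and flame sensor digital outputs."""
    return {"water": False, "flame": True}

def build_alerts(distances, flags, threshold=OBSTACLE_CM):
    """Collect the warnings that should be spoken or beeped to the wearer."""
    alerts = [f"obstacle {side}, {dist:.0f} cm"
              for side, dist in distances.items() if dist < threshold]
    alerts += [f"{name} detected" for name, active in flags.items() if active]
    return alerts

for message in build_alerts(read_ultrasonic_cm(), read_hazard_flags()):
    print("ALERT:", message)  # on the device, a buzzer/speaker call would go here
```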
Social media networks are becoming essential to our daily activities, and many issues arise from this deep involvement in our lives. Cyberbullying is a social media network issue and a global crisis affecting the victims and society as a whole. It often results from a misunderstanding regarding freedom of speech. In this work, we propose a methodology for detecting such behaviors (bullying, harassment, and hate-related texts) using supervised machine learning algorithms (SVM, Naïve Bayes, Logistic Regression, and Random Forest) and for predicting the topics associated with these texts using unsupervised natural language processing, such as latent Dirichlet allocation. In addition, we used accuracy, precision, recall, and F1 score to assess these classifiers. Results show that logistic regression, support vector machine, random forest, and Naïve Bayes achieve 95%, 94.97%, 94.66%, and 93.1% accuracy, respectively.
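The sketch below illustrates the two stages described above on a toy corpus: a supervised classifier over TF-IDF features (logistic regression shown; SVM, Naïve Bayes, and random forest slot in the same way), followed by unsupervised LDA topic discovery over term counts. The example texts and labels are placeholders, not the study's dataset.

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder posts and labels (1 = bullying/harassment, 0 = benign).
texts = ["you are awesome, great game last night",
         "nobody likes you, just go away",
         "thanks for sharing this article",
         "everyone hates your stupid posts"]
labels = [0, 1, 0, 1]

# Supervised stage: TF-IDF features + logistic regression.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(texts, labels)
print("prediction:", clf.predict(["go away, nobody likes you"]))

# Unsupervised stage: LDA over raw term counts to surface discussion topics.
counts = CountVectorizer(stop_words="english")
term_matrix = counts.fit_transform(texts)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(term_matrix)
vocab = counts.get_feature_names_out()
for k, component in enumerate(lda.components_):
    top_terms = [vocab[i] for i in component.argsort()[-3:][::-1]]
    print(f"topic {k}:", ", ".join(top_terms))
```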
Voice classification is important in creating more intelligent systems that help with student exams, identifying criminals, and security systems. The main aim of the research is to develop a system able to predict and classify gender, age, and accent. Therefore, a new system called Classifying Voice Gender, Age, and Accent (CVGAA) is proposed. Backpropagation and bagging algorithms are designed to improve voice recognition systems, incorporating acoustic voice features such as rhythm-based features to train the system to distinguish between the two gender categories. The approach has high precision compared to other algorithms used for this problem: the adaptive backpropagation algorithm achieved an accuracy of 98% and the bagging algorithm an accuracy of 98.10% on the gender identification data. Bagging has the best accuracy among all algorithms, with 55.39% accuracy for age classification on the common voice dataset and 78.94% accuracy for accent classification on the speech accent dataset.
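As a rough illustration of the comparison above, the sketch below trains a backpropagation-based MLP with an adaptive learning rate and a bagging ensemble on synthetic vectors standing in for rhythm- and pitch-style acoustic descriptors; the network sizes, hyperparameters, and data are assumptions, not the CVGAA configuration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic vectors standing in for rhythm/pitch-style acoustic features.
X, y = make_classification(n_samples=1500, n_features=20, n_informative=8,
                           random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)

models = {
    "backpropagation MLP": make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(64, 32), solver="sgd",
                      learning_rate="adaptive", max_iter=500, random_state=1)),
    "bagging (decision trees)": BaggingClassifier(n_estimators=50, random_state=1),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, "accuracy:", round(model.score(X_te, y_te), 4))
```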
This study presents a novel and innovative approach to automatically translating Arabic Sign Language (ATSL) into spoken Arabic. The proposed solution utilizes a deep learning-based classification approach and the transfer learning technique to retrain 12 image recognition models. The image-based translation method maps sign language gestures to corresponding letters or words using distance measures and classification as a machine learning technique. The results show that the proposed model is more accurate and faster than traditional image-based models in classifying Arabic-language signs, with a translation accuracy of 93.7%. This research makes a significant contribution to the field of ATSL. It offers a practical solution for improving communication for individuals with special needs, such as the deaf and mute community. This work demonstrates the potential of deep learning techniques in translating sign language into natural language and highlights the importance of ATSL in facilitating communication for individuals with disabilities.
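A minimal transfer-learning sketch in the spirit of the retraining described above is given below for a single backbone (MobileNetV2 with a frozen ImageNet base and a new softmax head); the paper retrains 12 image recognition models, and the class count and data paths here are placeholders.

```python
import tensorflow as tf

NUM_CLASSES = 28  # placeholder: one class per sign; adjust to the actual label set

# Frozen ImageNet backbone with a new classification head.
base = tf.keras.applications.MobileNetV2(input_shape=(224, 224, 3),
                                         include_top=False, weights="imagenet")
base.trainable = False

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

# Typical usage (directory path is a placeholder):
# train_ds = tf.keras.utils.image_dataset_from_directory(
#     "arsl_signs/train", image_size=(224, 224), batch_size=32)
# model.fit(train_ds, epochs=10)
```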
The rapidly evolving darknet enables a wide range of cybercrimes through anonymous and untraceable communication channels. Effective detection of clandestine darknet traffic is therefore critical yet immensely challenging. This research demonstrates how advanced machine learning and specialized deep learning techniques can significantly enhance darknet traffic analysis to strengthen cybersecurity. Combining diverse classifiers such as random forest and naïve Bayes with a novel spiking neural network architecture provides a robust foundation for identifying concealed threats. Evaluation on the CIC-Darknet2020 dataset establishes state-of-the-art results with 98% accuracy from the random forest model and 84.31% accuracy from the spiking neural network. This pioneering application of artificial intelligence advances the frontiers in analyzing the complex characteristics and behaviours of darknet communication. The proposed techniques lay the groundwork for improved threat intelligence, real-time monitoring, and resilient cyber defense systems against the evolving landscape of cyber threats.
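The sketch below reproduces only the random-forest branch of the pipeline on synthetic flow-style features standing in for CIC-Darknet2020; the spiking-neural-network branch would require a dedicated SNN library (e.g., snnTorch) and is not reproduced here.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic flow-style features (durations, byte counts, inter-arrival times, ...)
# standing in for CIC-Darknet2020 records and their traffic classes.
X, y = make_classification(n_samples=3000, n_features=40, n_informative=12,
                           n_classes=4, random_state=7)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y,
                                          random_state=7)

rf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=7)
rf.fit(X_tr, y_tr)
print(classification_report(y_te, rf.predict(X_te)))
```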
The exponential growth of Internet of Things (IoT) devices has created unprecedented challenges in data processing and resource management for time-critical applications. Traditional cloud computing paradigms cannot meet the stringent latency requirements of modern IoT systems, while pure edge computing faces resource constraints that limit processing capabilities. This paper addresses these challenges by proposing a novel Deep Reinforcement Learning (DRL)-enhanced priority-based scheduling framework for hybrid edge-cloud computing environments. Our approach integrates adaptive priority assignment with a two-level concurrency control protocol that ensures both optimal performance and data consistency. The framework introduces three key innovations: (1) a DRL-based dynamic priority assignment mechanism that learns from system behavior, (2) a hybrid concurrency control protocol combining local edge validation with global cloud coordination, and (3) an integrated mathematical model that formalizes sensor-driven transactions across edge-cloud architectures. Extensive simulations across diverse workload scenarios demonstrate significant quantitative improvements over state-of-the-art heuristic and meta-heuristic approaches: a 40% latency reduction, a 25% throughput increase, 85% resource utilization (compared to 60% for heuristic methods), a 40% reduction in energy consumption (300 vs. 500 J per task), and a 50% improvement in scalability factor (1.8 vs. 1.2 for EDF). These results establish the framework as a robust solution for large-scale IoT and autonomous applications requiring real-time processing with consistency guarantees.
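As a heavily simplified illustration of learning placement decisions from system behavior, the sketch below trains a tabular, contextual-bandit-style agent to choose edge or cloud execution from a task's deadline slack; the latencies, costs, and reward shaping are illustrative assumptions, not the paper's DRL formulation, priority mechanism, or concurrency protocol.

```python
import numpy as np

rng = np.random.default_rng(3)

N_SLACK, N_ACTIONS = 3, 2            # slack levels: 0=tight, 1=medium, 2=loose
EDGE_LAT, CLOUD_LAT = 1.0, 3.0       # assumed service latencies (arbitrary units)
DEADLINE = {0: 1.5, 1: 3.5, 2: 6.0}  # deadline per slack level
EDGE_BUSY_P, EDGE_COST = 0.4, 0.3    # edge contention probability and usage cost

def simulate(slack, action):
    """Return the reward for placing one task (action 0 = edge, 1 = cloud)."""
    if action == 1:
        latency, cost = CLOUD_LAT, 0.0
    else:
        latency = EDGE_LAT + (2.0 if rng.random() < EDGE_BUSY_P else 0.0)
        cost = EDGE_COST
    return (1.0 if latency <= DEADLINE[slack] else -1.0) - cost

Q = np.zeros((N_SLACK, N_ACTIONS))
alpha, eps = 0.1, 0.1                # learning rate and exploration rate

for _ in range(5000):
    slack = int(rng.integers(N_SLACK))
    action = int(rng.integers(N_ACTIONS)) if rng.random() < eps else int(Q[slack].argmax())
    reward = simulate(slack, action)
    Q[slack, action] += alpha * (reward - Q[slack, action])  # bandit-style update

policy = ["edge" if Q[s].argmax() == 0 else "cloud" for s in range(N_SLACK)]
print("placement by slack (tight, medium, loose):", policy)
```

With these toy numbers the agent converges to reserving the contended edge node for tight-deadline tasks and offloading the rest, which is the qualitative behavior a priority-aware edge-cloud scheduler should exhibit.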
Funding: Supported by Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2025R909), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.