Control signaling is mandatory for the operation and management of all types of communication networks, including the Third Generation Partnership Project (3GPP) mobile broadband networks. However, control signaling consumes important and scarce network resources such as bandwidth and processing power. There have been several reports of control signaling turning into signaling storms, halting network operations and causing the affected telecom companies large financial losses. This paper draws its motivation from such real network disaster incidents attributed to signaling storms. We present a thorough survey of the causes of the signaling storm problem in 3GPP-based mobile broadband networks and discuss in detail their possible solutions and countermeasures. We provide relevant analytical models to help quantify the effect of the potential causes and the benefits of their corresponding solutions. Another important contribution of this paper is a tabular comparison of the possible causes and solutions/countermeasures with respect to their effect on several important network aspects such as architecture, additional signaling, and fidelity. This paper presents an update and an extension of our earlier conference publication. To our knowledge, no similar survey study exists on the subject.
The transportation and logistics sectors are major contributors to Greenhouse Gas (GHG) emissions. Carbon dioxide (CO₂) from Light-Duty Vehicles (LDVs) poses serious risks to air quality and public health. Understanding the extent of LDVs' impact on climate change and human well-being is crucial for informed decision-making and effective mitigation strategies. This study investigates the predictability of CO₂ emissions from LDVs using a comprehensive dataset that includes vehicles from various manufacturers, their CO₂ emission levels, and key influencing factors. Specifically, six Machine Learning (ML) algorithms, ranging from simple linear models to complex non-linear models, were applied under identical conditions to ensure a fair comparison, and their performance metrics were calculated. The obtained results showed a significant influence of variables such as engine size on CO₂ emissions. Although all six algorithms provided accurate forecasts, the Linear Regression (LR) model was found to be sufficient, achieving a Mean Absolute Percentage Error (MAPE) below 0.90% and a Coefficient of Determination (R²) exceeding 99.7%. These findings may contribute to a deeper understanding of LDVs' role in CO₂ emissions and offer actionable insights for reducing their environmental impact. In fact, vehicle manufacturers can leverage these insights to target key emission-related factors, while policymakers and stakeholders in logistics and transportation can use the models to estimate the CO₂ emissions of new vehicles before their market deployment or to project future emissions from current and expected LDV fleets.
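To make the modelling pipeline concrete, the sketch below fits a linear model and reports the same two metrics the study quotes (MAPE and R²). It is only an illustration: the column names and the synthetic stand-in data are assumptions, not the authors' dataset.

```python
# Illustrative sketch only: fits a linear model to stand-in data and reports
# the metrics used in the study (MAPE and R^2). Columns and data are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_percentage_error, r2_score

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "engine_size_l": rng.uniform(1.0, 6.0, n),
    "cylinders": rng.integers(3, 13, n),
    "fuel_consumption_l_per_100km": rng.uniform(4.0, 20.0, n),
})
# Synthetic target loosely tied to engine size, mimicking its reported influence.
df["co2_g_per_km"] = (
    20 * df["engine_size_l"] + 5 * df["cylinders"]
    + 10 * df["fuel_consumption_l_per_100km"] + rng.normal(0, 5, n)
)

X, y = df.drop(columns="co2_g_per_km"), df["co2_g_per_km"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LinearRegression().fit(X_train, y_train)
pred = model.predict(X_test)
print(f"MAPE: {mean_absolute_percentage_error(y_test, pred) * 100:.2f}%")
print(f"R^2 : {r2_score(y_test, pred) * 100:.2f}%")
```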
Leaf disease identification is one of the most promising applications of convolutional neural networks (CNNs). This method represents a significant step towards revolutionizing agriculture by enabling the quick and accurate assessment of plant health. In this study, a CNN model was specifically designed and tested to detect and categorize diseases on fig tree leaves. The researchers utilized a dataset of 3422 images, divided into four classes: healthy, fig rust, fig mosaic, and anthracnose. These diseases can significantly reduce the yield and quality of fig tree fruit. The objective of this research is to develop a CNN that can identify and categorize diseases in fig tree leaves. The data for this study was collected from gardens in the Amandi and Mamash Khail Bannu districts of the Khyber Pakhtunkhwa region in Pakistan. To minimize the risk of overfitting and enhance the model's performance, early stopping techniques and data augmentation were employed. As a result, the model achieved a training accuracy of 91.53% and a validation accuracy of 90.12%, which are considered respectable. This comprehensive model assists farmers in the early identification and categorization of fig tree leaf diseases. Our experts believe that CNNs could serve as valuable tools for accurate disease classification and detection in precision agriculture. We recommend further research to explore additional data sources and more advanced neural networks to improve the model's accuracy and applicability. Future research will focus on expanding the dataset by including new diseases and testing the model in real-world scenarios to enhance sustainable farming practices.
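For readers unfamiliar with the training setup described above, a minimal Keras sketch of a comparable pipeline follows. The directory layout, image size, and layer configuration are assumptions for illustration, not the authors' exact architecture.

```python
# Minimal sketch of a four-class leaf-disease CNN with data augmentation and
# early stopping. Paths, image size, and architecture are hypothetical.
import tensorflow as tf
from tensorflow.keras import layers, models

train_ds = tf.keras.utils.image_dataset_from_directory(
    "fig_leaves/train", image_size=(128, 128), batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "fig_leaves/val", image_size=(128, 128), batch_size=32)

augment = models.Sequential([
    layers.RandomFlip("horizontal"),
    layers.RandomRotation(0.1),
    layers.RandomZoom(0.1),
])

model = models.Sequential([
    layers.Rescaling(1.0 / 255, input_shape=(128, 128, 3)),
    augment,
    layers.Conv2D(32, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(4, activation="softmax"),  # healthy, fig rust, fig mosaic, anthracnose
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

early_stop = tf.keras.callbacks.EarlyStopping(patience=5, restore_best_weights=True)
model.fit(train_ds, validation_data=val_ds, epochs=50, callbacks=[early_stop])
```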
The Internet of Things (IoT) and edge-assisted networking infrastructures are capable of bringing data processing and accessibility services locally to the respective edge rather than to a centralized module. These infrastructures are very effective in providing a fast response to the queries of the requesting modules, but their distributed nature has introduced other problems such as security and privacy. To address these problems, various security-assisted communication mechanisms have been developed to safeguard every active module, i.e., devices and edges, from every possible vulnerability in the IoT. However, these methodologies have neglected one of the critical issues, which is the prediction of fraudulent devices, i.e., adversaries, preferably as early as possible in the IoT. In this paper, a hybrid communication mechanism is presented where the Hidden Markov Model (HMM) predicts the legitimacy of the requesting device (both source and destination), and the Advanced Encryption Standard (AES) safeguards the reliability of the transmitted data over a shared communication medium, preferably through a secret shared key and timestamp information. A device becomes trusted if it has passed both evaluation levels, i.e., HMM and message decryption, within a stipulated time interval. The proposed hybrid, along with existing state-of-the-art approaches, has been simulated in a realistic IoT environment to verify the security measures. These evaluations were carried out in the presence of intruders capable of launching various attacks simultaneously, such as man-in-the-middle, device impersonation, and masquerading attacks. Moreover, the proposed approach has been proven more effective than existing state-of-the-art approaches due to its exceptional performance in communication, processing, and storage overheads, i.e., 13%, 19%, and 16%, respectively. Finally, the proposed hybrid approach is shown to be robust against well-known security attacks in the IoT.
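The HMM component is paper-specific, but the AES-plus-timestamp half of the scheme follows a familiar pattern. The sketch below is a hedged illustration of that pattern: the key size, freshness window, and the stubbed-out legitimacy score are assumptions, not the paper's parameters.

```python
# Hedged sketch: encrypt payload + timestamp under a pre-shared AES key and
# accept only messages that decrypt correctly, are fresh, and pass a (stubbed)
# HMM legitimacy check. Parameters are assumptions, not the paper's values.
import os, time, json
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

SHARED_KEY = AESGCM.generate_key(bit_length=128)   # pre-shared secret key
FRESHNESS_WINDOW_S = 5                             # assumed acceptance window

def hmm_legitimacy_score(device_id: str) -> float:
    """Placeholder for the paper's HMM-based prediction of device legitimacy."""
    return 0.9  # stand-in score

def send(device_id: str, payload: dict) -> bytes:
    message = {"device": device_id, "ts": time.time(), "data": payload}
    nonce = os.urandom(12)
    ct = AESGCM(SHARED_KEY).encrypt(nonce, json.dumps(message).encode(), None)
    return nonce + ct

def receive(blob: bytes, threshold: float = 0.5) -> dict:
    nonce, ct = blob[:12], blob[12:]
    message = json.loads(AESGCM(SHARED_KEY).decrypt(nonce, ct, None))  # raises if tampered
    if time.time() - message["ts"] > FRESHNESS_WINDOW_S:
        raise ValueError("stale message (possible replay)")
    if hmm_legitimacy_score(message["device"]) < threshold:
        raise ValueError("device failed legitimacy check")
    return message["data"]

print(receive(send("sensor-17", {"temp": 21.4})))
```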
The Internet of Things (IoT) is a smart infrastructure where devices share captured data with the respective server or edge modules. However, secure and reliable communication is among the challenging tasks in these networks, as shared channels are used to transmit packets. In this paper, a decision tree is integrated with other metrics to form a secure distributed communication strategy for the IoT. Initially, every device works collaboratively to form a distributed network. In this model, if a device is deployed outside the coverage area of the nearest server, it communicates indirectly through the neighboring devices. For this purpose, every device collects data from the respective neighboring devices, such as hop count, average packet transmission delay, criticality factor, link reliability, and RSSI value. These parameters are used to find an optimal route from the source to the destination. Secondly, the proposed approach enables devices to learn from the environment and adjust the optimal route-finding formula accordingly. Moreover, these devices and server modules must ensure that every packet is transmitted securely, which is possible only if it is encrypted with an encryption algorithm. For this purpose, a decision tree-enabled device-to-server authentication algorithm is presented where every device and server must take part in the offline phase. Simulation results have verified that the proposed distributed communication approach has the potential to ensure the integrity and confidentiality of data during transmission. Moreover, the proposed approach outperforms the existing approaches in terms of communication cost, processing overhead, end-to-end delay, packet loss ratio, and throughput. Finally, the proposed approach is adoptable in different networking infrastructures.
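As an illustration only, next-hop selection from the per-neighbour metrics listed above might be scored along the following lines; the weights and normalisations are hypothetical, not the authors' route-finding formula.

```python
# Hypothetical weighted scoring of candidate next hops from neighbour metrics.
# In the proposed approach the weights would be adapted as the device learns.
from dataclasses import dataclass

@dataclass
class Neighbor:
    node_id: str
    hop_count: int           # hops to the server via this neighbour
    avg_delay_ms: float      # average packet transmission delay
    link_reliability: float  # 0..1
    rssi_dbm: float          # received signal strength

def route_score(n: Neighbor, w=(0.3, 0.2, 0.3, 0.2)) -> float:
    # Normalise each metric so that "higher is better", then combine.
    return (w[0] * (1.0 / (1 + n.hop_count))
            + w[1] * (1.0 / (1 + n.avg_delay_ms / 100))
            + w[2] * n.link_reliability
            + w[3] * (n.rssi_dbm + 100) / 70)   # map roughly [-100, -30] dBm to [0, 1]

neighbors = [
    Neighbor("A", hop_count=2, avg_delay_ms=40, link_reliability=0.95, rssi_dbm=-60),
    Neighbor("B", hop_count=1, avg_delay_ms=120, link_reliability=0.80, rssi_dbm=-75),
]
best = max(neighbors, key=route_score)
print("forward via", best.node_id)
```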
Operating in a body area network around a smartphone user, wearables serve a variety of commercial, medical, and personal uses. Depending on the smartphone application, a wearable can capture sensitive data about the user and provide critical, possibly life-or-death, functionality. When using wearables, security problems might occur in the hardware or software of the wearables, the connected phone apps or web services, or the Bluetooth channels used for communication. This paper develops an open-source platform called SecuWear for identifying vulnerabilities in these areas and facilitating wearable security research to mitigate them. SecuWear supports the creation, evaluation, and analysis of security vulnerability tests on actual hardware. Extending earlier results, this paper includes an empirical evaluation that demonstrates proof-of-concept attacks on commercial wearable devices and shows how SecuWear captures the information necessary for identifying such attacks. Also included is a process for releasing attack and mitigation information to the security community.
IoT devices rely on authentication mechanisms to render secure message exchange. During data transmission, scalability, data integrity, and processing time have been considered challenging aspects for a system constituted by IoT devices. The application of physical unclonable functions (PUFs) ensures secure data transmission among Internet of Things (IoT) devices in a simplified network with an efficient time-stamped agreement. This paper proposes a secure, lightweight, cost-efficient reinforcement machine learning framework (SLCR-MLF) to achieve decentralization and security, thus enabling scalability, data integrity, and optimized processing time in IoT devices. A PUF has been integrated into SLCR-MLF to improve the security of the cluster head node in the IoT platform during transmission by providing an authentication service for device-to-device communication. An IoT network gathers information of interest from multiple cluster members selected by the proposed framework. In addition, the software-defined secured (SDS) technique is integrated with SLCR-MLF to improve data integrity and optimize processing time in the IoT platform. Simulation analysis shows that the proposed framework outperforms conventional methods regarding the network's lifetime, energy, secured data retrieval rate, and performance ratio. By enabling the proposed framework, the number of residual nodes is reduced to 16%, energy consumption is reduced by up to 50%, the data retrieval rate improves by almost 30%, and network lifetime is improved by up to 1000 ms.
Cookies are a fundamental means for web application services to authenticate various Hypertext Transfer Protocol (HTTP) requests and maintain the state of clients' information over the Internet. HTTP cookies are exploited to carry client patterns observed by a website. These client patterns facilitate the particular client's future visits to the corresponding website. However, security and privacy are the primary concerns owing to the value of information sent over public channels and the storage of client information in the browser. Several protocols have been introduced that maintain HTTP cookies, but many of them fail to achieve the required security or require considerable resource overheads. In this article, we introduce a lightweight Elliptic Curve Cryptography (ECC) based protocol for authenticating client and server transactions to maintain the privacy and security of HTTP cookies. Our proposed protocol uses a secret key embedded within a cookie. The proposed protocol is more efficient and lightweight than related protocols because of its reduced computation, storage, and communication costs. Moreover, the analysis presented in this paper confirms that the proposed protocol resists various known attacks.
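The sketch below is not the authors' protocol; it only illustrates the generic ingredient such schemes build on, namely an elliptic-curve signature that binds a cookie's contents so the server can detect tampering. The cookie fields and format are assumptions.

```python
# Generic illustration (not the paper's protocol): ECC-signed cookie contents
# so the issuing server can detect tampering. Fields and format are hypothetical.
import base64, json, time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

server_key = ec.generate_private_key(ec.SECP256R1())

def issue_cookie(client_id: str) -> str:
    body = json.dumps({"cid": client_id, "iat": int(time.time())}).encode()
    sig = server_key.sign(body, ec.ECDSA(hashes.SHA256()))
    return (base64.urlsafe_b64encode(body).decode() + "."
            + base64.urlsafe_b64encode(sig).decode())

def verify_cookie(cookie: str) -> dict:
    body_b64, sig_b64 = cookie.split(".")
    body = base64.urlsafe_b64decode(body_b64)
    sig = base64.urlsafe_b64decode(sig_b64)
    try:
        server_key.public_key().verify(sig, body, ec.ECDSA(hashes.SHA256()))
    except InvalidSignature:
        raise ValueError("cookie has been tampered with")
    return json.loads(body)

print(verify_cookie(issue_cookie("client-42")))
```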
There has been disagreement over the value of purchasing space in the metaverse, but many businesses including Nike, The Wendy’s Company, and McDonald’s have jumped in headfirst. While the metaverse land rush has been called an “illusion” given underdeveloped infrastructure, including inadequate software and servers, and the potential opportunities for economic and legal abuse, the “real estate of the future” shows no signs of slowing. While the current virtual space of the metaverse is worth $6.30 billion, that is expected to grow to $84.09 billion by the end of 2028. But the long-term legal and regulatory considerations of capitalizing on the investment, as well as the manner in which blockchain technology can secure users’ data and digital assets, have yet to be properly investigated. With the metaverse still in a conceptual phase, building a new 3D social environment capable of digital transactions will represent most of the initial investment in time in human capital. Digital twin technologies, already well-established in industry, will be ported to support the need to architect and furnish the new digital world. The return on and viability of investing in the “real estate of the future” raises questions fundamental to the success or failure of the enterprise. As such, this paper proposes a novel framing of the issue and looks at the intersection where finance, technology, and law are converging to prevent another Dot-com bubble of the late 1990s in metaverse-based virtual real estate transactions. Furthermore, the paper will argue that these domains are technologically feasible, but the main challenges for commercial users remain in the legal and regulatory arenas. As has been the case with the emergence of online commerce, a legal assessment of the metaverse indicates that courts will look to traditional and established legal principles when addressing issues until the enactment of federal and/or state statutes and accompanying regulations. Lastly, whereas traditional regulation of real estate would involve property law, the current legal framing of ownership of metaverse assets is governed by contract law.
Distributed Denial of Service (DDoS) attacks have always been a major concern in the security field. With the release of malware source code such as BASHLITE and Mirai, Internet of Things (IoT) devices have become the new source of DDoS attacks against many Internet applications. Although there are many datasets in the field of IoT intrusion detection that mainly focus on DDoS attacks, such as Bot-IoT, Constrained Application Protocol–Denial of Service (CoAPDoS), and LATAM-DDoS-IoT, datasets describing new IoT DDoS attack scenarios are extremely rare; only the N-BaIoT and IoT-23 datasets used IoT devices as DDoS attackers in their construction, and neither used Internet applications as victims. To supplement the description of this new trend of DDoS attacks, we built an IoT environment in which mainstream DDoS attack tools such as Mirai and BASHLITE were used to infect IoT devices and launch DDoS attacks against web servers. Data captured at the web servers and IoT nodes were then aggregated into a dataset named MBB-IoT. After the MBB-IoT dataset was split into a training set and a test set, it was applied to the training and testing of the Random Forests classification algorithm. The multi-class classification metrics were good, all above 90%. Secondly, in a cross-evaluation experiment based on the Support Vector Machine (SVM), Light Gradient Boosting Machine (LightGBM), and Long Short-Term Memory network (LSTM) classification algorithms, the training set and test set were derived from different datasets (MBB-IoT or IoT-23), and the test performance was better when MBB-IoT was used as the training set.
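A minimal sketch of the Random Forests evaluation step described above might look as follows; the CSV file name and label values are hypothetical stand-ins for the MBB-IoT flow features.

```python
# Sketch of the multi-class evaluation on a held-out split. The feature file
# and label column are hypothetical stand-ins for the MBB-IoT export.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

df = pd.read_csv("mbb_iot_flows.csv")          # assumed flow-feature export
X, y = df.drop(columns="label"), df["label"]   # label: benign / mirai / bashlite ...

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```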
The applications of machine learning (ML) in the medical domain are often hindered by the limited availability of high-quality data. To address this challenge, we explore the synthetic generation of echocardiography images (echoCG) using state-of-the-art generative models. We conduct a comprehensive evaluation of three prominent methods: the Cycle-consistent Generative Adversarial Network (CycleGAN), Contrastive Unpaired Translation (CUT), and Stable Diffusion 1.5 with Low-Rank Adaptation (LoRA). Our research presents the data generation methodology, image samples, and evaluation strategy, followed by an extensive user study involving licensed cardiologists and surgeons who assess the perceived quality and medical soundness of the generated images. Our findings indicate that Stable Diffusion outperforms both CycleGAN and CUT in generating images that are nearly indistinguishable from real echoCG images, making it a promising tool for augmenting medical datasets. However, we also identify limitations in the synthetic images generated by CycleGAN and CUT, which are easily distinguishable as non-realistic by medical professionals. This study highlights the potential of diffusion models in medical imaging and their applicability in addressing data scarcity, while also outlining the areas for future improvement.
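As a rough illustration of the Stable Diffusion 1.5 + LoRA route, the snippet below generates an image from the public SD 1.5 checkpoint with a hypothetical fine-tuned LoRA adapter; it sketches inference only, not the fine-tuning or evaluation protocol used in the study.

```python
# Sketch of inference with SD 1.5 plus a LoRA adapter. The LoRA weights path,
# prompt, and output name are hypothetical; only the base checkpoint id is real.
import torch
from diffusers import StableDiffusionPipeline

device = "cuda" if torch.cuda.is_available() else "cpu"
pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5").to(device)
pipe.load_lora_weights("./echo_lora_weights")  # hypothetical adapter fine-tuned on echoCG data

image = pipe("apical four-chamber echocardiogram, grayscale ultrasound",
             num_inference_steps=30).images[0]
image.save("synthetic_echo.png")
```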
More businesses are deploying powerful Intrusion Detection Systems (IDS) to secure their data and physical assets. Improved cyber-attack detection and prevention in these systems requires machine learning (ML) approaches. This paper examines a cyber-attack prediction system combining feature selection (FS) and ML. Our technique's foundation is Correlation Analysis (CA), Mutual Information (MI), and recursive feature elimination with cross-validation. To optimize IDS performance, the security features must be carefully selected from multi-dimensional datasets, and our hybrid FS technique is validated on the improved UNSW-NB15 and TON_IoT datasets. Our technique identified 22 key characteristics in UNSW-NB15 and 8 in TON_IoT. We evaluated prediction using seven ML methods: Decision Tree (DT), Random Forest (RF), Logistic Regression (LR), Naive Bayes (NB), K-Nearest Neighbors (KNN), Support Vector Machine (SVM), and Multilayer Perceptron (MLP) classifiers. The DT, RF, NB, and MLP classifiers helped our model surpass the competition on both datasets. Therefore, the investigational outcomes of our hybrid model may help IDSs defend business assets from various cyber-attack vectors.
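An illustrative sketch of the hybrid feature-selection idea (correlation filtering, mutual-information ranking, then recursive feature elimination with cross-validation) is given below; the thresholds, estimator, and CSV layout are assumptions rather than the paper's exact settings.

```python
# Sketch of a three-stage hybrid feature selection. Thresholds, estimator,
# and the input file are hypothetical, not the paper's configuration.
import pandas as pd
from sklearn.feature_selection import mutual_info_classif, RFECV
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("unsw_nb15_features.csv")      # hypothetical pre-processed numeric export
X, y = df.drop(columns="label"), df["label"]

# 1) Correlation analysis: drop one of each pair of near-duplicate features.
corr = X.corr().abs()
to_drop = [col for i, col in enumerate(corr.columns)
           if any(corr.iloc[:i][col] > 0.95)]
X = X.drop(columns=to_drop)

# 2) Mutual information: keep the most informative half of the features.
mi = pd.Series(mutual_info_classif(X, y), index=X.columns)
X = X[mi.sort_values(ascending=False).index[: len(mi) // 2]]

# 3) Recursive feature elimination with cross-validation on a simple estimator.
selector = RFECV(DecisionTreeClassifier(random_state=0), step=1, cv=5)
selector.fit(X, y)
print("selected features:", list(X.columns[selector.support_]))
```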
Sentiment analysis, or Opinion Mining (OM), has gained significant interest among research communities and entrepreneurs in recent years. Likewise, Machine Learning (ML) is one of the interesting research domains that is highly helpful and increasingly applied in several business domains. In this background, the current research paper focuses on the design of an automated opinion mining model using the Deer Hunting Optimization Algorithm (DHOA) with a Fuzzy Neural Network (FNN), abbreviated as the DHOA-FNN model. The proposed DHOA-FNN technique involves four different stages, namely preprocessing, feature extraction, classification, and parameter tuning. In addition, the proposed DHOA-FNN model has two stages of feature extraction, namely the GloVe and N-gram approaches. Moreover, the FNN model is utilized as the classification model, whereas GTOA is used for the optimization of parameters. The novelty of the current work is that the GTOA is designed to tune the parameters of the FNN model. An extensive range of simulations was carried out on the benchmark dataset, and the results were examined under diverse measures. The experimental results highlighted the promising performance of the DHOA-FNN model over recent state-of-the-art techniques, with a maximum accuracy of 0.9928.
In various fields, different networks are used, most of the time not of a single kind but rather a mix of at least two networks. Such networks are called bridge networks, and they are utilized in computer interconnection networks, mobile networks, the Internet backbone, networks used in robotics, power generation interconnections, bioinformatics, and chemical compound structures. Any number that can be uniquely computed from a graph is called a graph invariant. Countless mathematical graph invariants have been described and utilized for correlation analysis during the last twenty years. Nevertheless, no reliable evaluation has been undertaken to decide how well these invariants are associated with a network graph or molecular graph. This paper discusses three distinct varieties of bridge networks with great potential for application in computer science, chemistry, physics, the drug industry, informatics, and mathematics, in the context of physical and chemical structures and networks, since Contraharmonic-Quadratic Invariants (CQIs) have recently been introduced and take different values for different varieties of bridge graphs or networks. The study determines the topology of bridge graphs/networks of three novel sorts with two kinds of CQIs and Quadratic-Contraharmonic Indices (QCIs). The deduced results can be used for the modeling of the above-mentioned networks.
This study introduces the Orbit Weighting Scheme (OWS), a novel approach aimed at enhancing the precision and efficiency of Vector Space information retrieval (IR) models, which have traditionally relied on weighting schemes like tf-idf and BM25. These conventional methods often struggle with accurately capturing document relevance, leading to inefficiencies in both retrieval performance and index size management. OWS proposes a dynamic weighting mechanism that evaluates the significance of terms based on their orbital position within the vector space, emphasizing term relationships and distribution patterns overlooked by existing models. Our research focuses on evaluating OWS's impact on model accuracy using information retrieval metrics like Recall, Precision, Interpolated Average Precision (IAP), and Mean Average Precision (MAP). Additionally, we assess OWS's effectiveness in reducing the inverted index size, crucial for model efficiency. We compare OWS-based retrieval models against others using different schemes, including tf-idf variations and BM25Delta. Results reveal OWS's superiority, achieving 54% Recall and 81% MAP, and a notable 38% reduction in the inverted index size. This highlights OWS's potential in optimizing retrieval processes and underscores the need for further research in this underrepresented area to fully leverage OWS's capabilities in information retrieval methodologies.
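OWS itself is not specified here, so the sketch below only shows the conventional baseline it is compared against: tf-idf weighting of a toy corpus with cosine-ranked retrieval, the kind of pipeline the Recall and MAP figures refer to.

```python
# Baseline illustration only: tf-idf weighting and cosine-ranked retrieval on
# a toy corpus. The documents and query are made up for the example.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "orbit weighting for vector space retrieval",
    "bm25 and tf idf weighting schemes",
    "index size reduction in inverted files",
]
vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(docs)

query_vec = vectorizer.transform(["weighting schemes for retrieval"])
scores = cosine_similarity(query_vec, doc_matrix).ravel()
for rank, idx in enumerate(scores.argsort()[::-1], start=1):
    print(rank, round(float(scores[idx]), 3), docs[idx])
```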
The demand for telecommunication services, such as IP telephony, increased dramatically during the COVID-19 pandemic lockdown. IP telephony should be enhanced to provide the expected quality. One of the issues that should be investigated in IP telephony is bandwidth utilization. IP telephony produces very small speech samples attached to a large packet header. The header of the IP telephony packet consumes a considerable share of the bandwidth allotted to IP telephony. This wastes the network's bandwidth and influences IP telephony quality. This paper proposes a mechanism (called Smallerize) that reduces the bandwidth consumed by both the speech sample and the header. This is achieved by assembling numerous IP telephony packets in one header and using the header's fields to carry the speech sample. Several metrics have been used to measure the achievement of the Smallerize mechanism. The number of calls increased by 245.1% compared to the typical mechanism. The bandwidth saving also reached 68% with the G.28 codec. Therefore, Smallerize is a possible mechanism to enhance the bandwidth utilization of IP telephony.
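A back-of-the-envelope calculation shows why sharing one header among several speech frames saves bandwidth. The 40-byte IPv4/UDP/RTP header and the 20-byte frame are typical textbook values for a low-bit-rate codec, not figures taken from the Smallerize evaluation.

```python
# Rough per-call bandwidth with and without header aggregation. The header and
# frame sizes are typical textbook values, not the paper's measured figures.
HEADER_BYTES = 40        # IPv4 (20) + UDP (8) + RTP (12)
PAYLOAD_BYTES = 20       # one 20 ms speech frame from a low-bit-rate codec
FRAMES_PER_SEC = 50      # 1000 ms / 20 ms

def bandwidth_bps(frames_per_packet: int) -> float:
    packets_per_sec = FRAMES_PER_SEC / frames_per_packet
    bits_per_packet = (HEADER_BYTES + frames_per_packet * PAYLOAD_BYTES) * 8
    return packets_per_sec * bits_per_packet

baseline = bandwidth_bps(1)       # one frame per header (typical mechanism)
aggregated = bandwidth_bps(10)    # ten frames share a single header
print(f"baseline  : {baseline / 1000:.1f} kbps per call")
print(f"aggregated: {aggregated / 1000:.1f} kbps per call")
print(f"saving    : {(1 - aggregated / baseline) * 100:.0f}%")
```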
The rapid expansion of the Internet of Things (IoT) has introduced significant security challenges due to the scale, complexity, and heterogeneity of interconnected devices. Traditional centralized security models are considered inadequate for dealing with these threats, especially in decentralized applications where IoT devices may operate with minimal resources. Emerging technologies, including Artificial Intelligence (AI), blockchain, edge computing, and Zero-Trust Architecture (ZTA), offer potential solutions by supporting threat detection, data integrity, and system resilience in real time. AI offers sophisticated anomaly detection and predictive analytics, and blockchain delivers decentralized and tamper-proof assurance over device communication and information exchange. Edge computing enables low-latency processing by distributing the computational workload and moving it near the devices. ZTA enhances security by continuously verifying each device and user on the network, adhering to the "never trust, always verify" ideology. The present research paper reviews these technologies, examining how they are used in securing IoT ecosystems, the issues of such integration, and the possibility of developing a multi-layered, adaptive security structure. Major concerns, such as scalability, resource limitations, and interoperability, are identified, and ways to optimize the application of AI, blockchain, and edge computing in zero-trust IoT systems in the future are discussed.
This study investigates how cybersecurity can be enhanced through cloud computing solutions in the United States. The motivation for this study is the rampant loss of data, breaches, and unauthorized access by internet criminals in the United States. The study adopted a survey research design, collecting data from 890 cloud professionals with relevant knowledge of cybersecurity and cloud computing. A machine learning approach was adopted, specifically a random forest classifier, an ensemble, and a decision tree model. Out of the features in the data, ten important features were selected using random forest feature importance, which helps to achieve the objective of the study. The study's purpose is to enable organizations to develop suitable techniques to prevent cybercrime using random forest predictions as they relate to cloud services in the United States. The effectiveness of the models used is evaluated by utilizing validation metrics that include recall values, accuracy, and precision, in addition to F1 scores and confusion matrices. Based on evaluation scores (accuracy, precision, recall, and F1 scores) of 81.9%, 82.6%, and 82.1%, the results demonstrated the effectiveness of the random forest model. They showed the importance of machine learning algorithms in preventing cybercrime and boosting security in the cloud environment. The study recommends that other machine learning models be adopted to see how cybersecurity can be improved through cloud computing.
Africa is a developing economy and, as such, emphasis has been placed on the achievement of revolutionary goals that will place her on a similar rank to the developed economies. Pertaining to this objective, Heads of State and government all over Africa instigated the African Union (AU) Agenda 2063, a framework put in place to achieve a continental transformation over the next 40 years. The use of satellites has been proven to be a major influence on economic growth since it facilitates the exchange of information. Environmental hazards such as climate change, pollution, and inefficient waste management can be classified among the drawbacks to achieving the economic growth we hope to accomplish. The purpose of this paper is to analyze and examine satellite communication as a tool for the attainment of an integrated, prosperous, and peaceful Africa by means of combatting environmental hazards on the continent.
E-administration is the performance of administrative work via computers and associated technologies such as the Internet. It comprises administrative efforts that center on the exchange of information and the provision of services to people and the business sector at high speed and low cost through computers and networks, with the assurance of maintaining information security. It is based on positive investment in information and communication technology in administrative practices. This paper presents the design of an e-administration platform that adopts the concept of cryptography for identity management. The architectural framework of the platform comprises subcomponents for service and form identification, business process redesign, service architecture, amalgamation, and deployment. The cryptography model for securing the platform was designed based on a combination of authentication criteria drawn from the Rijndael Advanced Encryption Standard (AES), lattice-based cryptography (LBC), and the Secure Hash Algorithm (SHA-512). A record must be encrypted prior to its commitment to the database via a double encryption method: the output of the AES-based encryption forms the input to the LBC algorithm to obtain the final output.
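A hedged sketch of the record-protection flow outlined above follows: hash the record with SHA-512, encrypt with AES, then hand the result to a second lattice-based stage. Since no standard-library LBC primitive is assumed, that stage is left as a clearly marked placeholder.

```python
# Hedged sketch of the double-encryption flow: SHA-512 integrity tag, AES
# first pass, then a placeholder for the lattice-based second pass.
import os, hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

aes_key = AESGCM.generate_key(bit_length=256)

def lbc_encrypt(blob: bytes) -> bytes:
    """Placeholder for the lattice-based cryptography stage described in the paper."""
    return blob  # a real deployment would substitute an actual LBC scheme here

def protect_record(record: bytes) -> bytes:
    digest = hashlib.sha512(record).digest()          # integrity tag
    nonce = os.urandom(12)
    first_pass = nonce + AESGCM(aes_key).encrypt(nonce, record + digest, None)
    return lbc_encrypt(first_pass)                    # second encryption layer

print(len(protect_record(b"citizen-id=123;service=permit-renewal")))
```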