Effective resource management in the Internet of Things and fog computing is essential for efficient and scalable networks. However, existing methods often fail in dynamic and high-demand environments, leading to resource bottlenecks and increased energy consumption. This study addresses these limitations by proposing the Quantum Inspired Adaptive Resource Management (QIARM) model, which introduces novel algorithms inspired by quantum principles for enhanced resource allocation. QIARM employs a quantum superposition-inspired technique for multi-state resource representation and an adaptive learning component to dynamically adjust resources in real time. In addition, an energy-aware scheduling module minimizes power consumption by selecting optimal configurations based on energy metrics. The simulation was carried out in a 360-minute environment with eight distinct scenarios. The proposed framework achieves up to 98% task offload success and reduces energy consumption by 20%, addressing critical challenges of scalability and efficiency in dynamic fog computing environments.
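The multi-state resource representation and energy-aware scheduling described in the QIARM abstract can be sketched in a few lines. The softmax weighting used as an amplitude-like stand-in, the utility/energy numbers, and the configuration names are illustrative assumptions, not details from the paper:

```python
import math

def superposition_weights(scores, tau=1.0):
    """Softmax over per-state scores: a stand-in for the amplitude-like
    multi-state resource representation mentioned in the abstract."""
    exps = [math.exp(s / tau) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def pick_config(configs):
    """Energy-aware scheduling sketch: choose the configuration with the
    lowest expected energy, weighting each state by its softmax weight."""
    best, best_energy = None, float("inf")
    for name, states in configs.items():
        weights = superposition_weights([s["utility"] for s in states])
        energy = sum(w * s["energy"] for w, s in zip(weights, states))
        if energy < best_energy:
            best, best_energy = name, energy
    return best, best_energy

# hypothetical configurations, each a superposition of two states
configs = {
    "low_power": [{"utility": 0.4, "energy": 2.0}, {"utility": 0.6, "energy": 3.0}],
    "balanced":  [{"utility": 0.7, "energy": 5.0}, {"utility": 0.8, "energy": 6.0}],
}
name, energy = pick_config(configs)
```

Here the scheduler prefers the configuration whose state mixture has the lowest expected energy; a real implementation would also feed the adaptive learning component back into the state scores.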
With the rapid development of the Internet of Things (IoT), there are several challenges pertaining to security in IoT applications. Compared with the traditional Internet, the IoT has many problems, such as large numbers of assets, complex and diverse structures, and a lack of computing resources. Traditional network intrusion detection systems cannot meet the security needs of IoT applications. In view of this situation, this study applies cloud computing and machine learning to the intrusion detection system of the IoT to improve detection performance. Traditional intrusion detection algorithms usually require considerable training time and are not suitable for cloud computing due to the limited computing power and storage capacity of cloud nodes; therefore, it is necessary to study lightweight intrusion detection algorithms with short training time and high detection accuracy for deployment and application on cloud nodes. An appropriate classification algorithm is a primary factor for deploying cloud computing intrusion prevention systems and a prerequisite for the system to respond to intrusions and reduce intrusion threats. This paper discusses the problems related to IoT intrusion prevention in cloud computing environments. Based on an analysis of cloud computing security threats, this study extensively explores IoT intrusion detection, cloud node monitoring, and intrusion response in cloud computing environments by using cloud computing, an improved extreme learning machine, and other methods. We use the Multi-Feature Extraction Extreme Learning Machine (MFE-ELM) algorithm, which adds a multi-feature extraction process on cloud servers, and deploy it on cloud nodes to detect and discover network intrusions against cloud nodes. In our simulation experiments, a classical intrusion detection dataset is selected as a test, and steps such as data preprocessing, feature engineering, model training, and result analysis are performed. The experimental results show that the proposed algorithm can effectively detect and identify most network data packets with good model performance and achieve efficient intrusion detection for heterogeneous IoT data from cloud nodes. Furthermore, it enables the cloud server to discover nodes with serious security threats in the cloud cluster in real time, so that further security protection measures can be taken to obtain the optimal intrusion response strategy for the cloud cluster.
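As context for the MFE-ELM abstract above, a minimal extreme learning machine with a toy feature-extraction step can be sketched as follows. The statistical features, network size, and synthetic "intrusion" data are assumptions for illustration; the paper's actual multi-feature extraction process is not specified here:

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(X):
    """Toy multi-feature extraction: augment raw traffic vectors with
    simple per-sample statistics (mean, std), echoing the extra
    feature-extraction step the abstract adds before the ELM."""
    stats = np.column_stack([X.mean(axis=1), X.std(axis=1)])
    return np.hstack([X, stats])

def train_elm(X, y, hidden=32):
    """Classic ELM: a random, untrained hidden layer plus closed-form
    (pseudo-inverse) output weights, so training is a single solve."""
    W = rng.normal(size=(X.shape[1], hidden)) * 0.3
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)
    beta = np.linalg.pinv(H) @ y
    return W, b, beta

def predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# synthetic binary "intrusion" data: class determined by the feature sum
X = rng.normal(size=(200, 8))
y = (X.sum(axis=1) > 0).astype(float)
Xf = extract_features(X)
model = train_elm(Xf, y)
acc = ((predict(model, Xf) > 0.5) == (y > 0.5)).mean()
```

The single pseudo-inverse solve is what gives ELM variants their short training time, which is the property the abstract argues matters on resource-limited cloud nodes.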
In this paper, the Internet of Medical Things (IoMT) is identified as a promising solution that integrates with the cloud computing environment to provide remote health monitoring and improve the quality of service (QoS) in the healthcare sector. However, problems with present architectural models, such as those related to energy consumption, service latency, execution cost, and resource usage, remain a major concern for adopting IoMT applications. To address these problems, this work presents a four-tier IoMT-edge-fog-cloud architecture along with an optimization model formulated using Mixed Integer Linear Programming (MILP), with the objective of efficiently processing and placing IoMT applications in the edge-fog-cloud computing environment while maintaining certain quality standards (e.g., energy consumption, service latency, network utilization). A modeling environment is used to assess and validate the proposed model under different traffic loads and processing requirements. In comparison with other existing models, the performance analysis of the proposed approach shows a maximum saving of 38% in energy consumption and a 73% reduction in service latency. The results also highlight that offloading an IoMT application to the edge and fog nodes rather than the cloud is highly dependent on the tradeoff between the network journey time saved and the extra power consumed by edge or fog resources.
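The placement objective described above, assigning application tasks to edge/fog/cloud tiers to minimize a weighted latency/energy cost under capacity constraints, can be illustrated with a brute-force search on a tiny instance. The tier parameters, weights, and capacities are invented for the example; the paper formulates this as a MILP, which a solver would handle instead of the exhaustive loop:

```python
from itertools import product

# Hypothetical per-tier characteristics (not values from the paper).
TIERS = {
    "edge":  {"latency": 1.0, "energy": 3.0, "capacity": 2},
    "fog":   {"latency": 3.0, "energy": 2.0, "capacity": 3},
    "cloud": {"latency": 8.0, "energy": 1.0, "capacity": 10},
}

def place(apps, w_latency=0.6, w_energy=0.4):
    """Exhaustively search placements of `apps` task units onto tiers,
    respecting capacity and minimizing a weighted latency/energy sum."""
    names = list(TIERS)
    best, best_cost = None, float("inf")
    for assign in product(names, repeat=apps):
        load = {n: assign.count(n) for n in names}
        if any(load[n] > TIERS[n]["capacity"] for n in names):
            continue
        cost = sum(w_latency * TIERS[a]["latency"] +
                   w_energy * TIERS[a]["energy"] for a in assign)
        if cost < best_cost:
            best, best_cost = assign, cost
    return best, best_cost

assign, cost = place(4)
```

With these weights the cheap low-latency edge tier fills to capacity first and the remaining tasks go to fog, mirroring the edge-vs-cloud tradeoff the abstract highlights.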
The emergence of different computing methods such as cloud-, fog-, and edge-based Internet of Things (IoT) systems has provided the opportunity to develop intelligent systems for disease detection. Compared to other machine learning models, deep learning models have gained more attention from the research community, as they have shown better results with large volumes of data than shallow learning. However, no comprehensive survey has been conducted on integrated IoT- and computing-based systems that deploy deep learning for disease detection. This study evaluated different machine learning and deep learning algorithms, as well as their hybrid and optimized variants, for IoT-based disease detection, using the most recent papers on IoT-based disease detection systems that include computing approaches such as cloud, edge, and fog. The analysis focused on an IoT deep learning architecture suitable for disease detection. It also identifies the different factors that require the attention of researchers to develop better IoT disease detection systems. This study can be helpful to researchers interested in developing better IoT-based disease detection and prediction systems based on deep learning using hybrid algorithms.
Edge computing is swiftly gaining traction and is being standardised by the European Telecommunications Standards Institute (ETSI) as Multi-access Edge Computing (MEC). Simultaneously, oneM2M has been actively developing standards for dynamic data management and IoT services at the edge, particularly for applications that require real-time support and security. Integrating MEC and oneM2M offers a unique opportunity to maximize their individual strengths. Therefore, this article proposes a framework that integrates the MEC and oneM2M standard platforms for IoT applications, demonstrating how the synergy of these architectures can leverage the geographically distributed computing resources at base stations, enabling efficient deployment and added value for time-sensitive IoT applications. In addition, this study offers a concept of potential interworking models between the oneM2M and MEC architectures. The adoption of these standard architectures can enable various IoT edge services, such as smart city mobility and real-time analytics functions, by leveraging the oneM2M common service layer instantiated on the MEC host.
The rapid advancement of technology has paved the way for innovative approaches to education. Artificial intelligence (AI), the Internet of Things (IoT), and cloud computing are three transformative technologies reshaping how education is delivered, accessed, and experienced. These technologies enable personalized learning, optimize teaching processes, and make educational resources more accessible to learners worldwide. This paper examines the integration of these technologies into smart education systems, highlighting their applications, benefits, and challenges, and exploring their potential to bridge gaps in educational equity and inclusivity.
Smart edge computing (SEC) is a novel computing paradigm that can transfer cloud-based applications to the edge network, supporting computation-intensive services like face detection and natural language processing. As a core feature of mobile edge computing, SEC improves user experience and device performance by offloading local activities to edge processors. In this framework, blockchain technology is utilized to ensure secure and trustworthy communication between edge devices and servers, protecting against potential security threats. Additionally, deep learning algorithms are employed to analyze resource availability and dynamically optimize computation offloading decisions. IoT applications that require significant resources can benefit from SEC, which has better coverage. However, because access is constantly changing and network devices have heterogeneous resources, it is not easy to create consistent, dependable, and instantaneous communication between edge devices and their processors, specifically in 5G Heterogeneous Network (HN) situations. Thus, an Intelligent Management of Resources for Smart Edge Computing (IMRSEC) framework, which combines blockchain, edge computing, and Artificial Intelligence (AI) in 5G HNs, is proposed in this paper. Within it, a unique dual schedule deep reinforcement learning (DS-DRL) technique has been developed, consisting of a rapid schedule learning process and a slow schedule learning process. The primary objective is to minimize overall offloading latency and system resource usage by optimizing computation offloading, resource allocation, and application caching. Simulation results demonstrate that the DS-DRL approach reduces task execution time by 32%, validating the method's effectiveness within the IMRSEC framework.
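The offloading decision at the heart of DS-DRL can be illustrated with a much simpler tabular reinforcement learner that chooses between local and edge execution to minimize latency. The two-state task model and the latency values below are assumptions for the sketch, not the paper's model:

```python
import random

random.seed(1)

# Toy offloading MDP: state = task-size bucket, actions: 0 = local, 1 = edge.
# Mean latencies are illustrative: small tasks run faster locally (offload
# overhead dominates), large tasks run faster on the edge server.
LATENCY = {
    (0, 0): 1.0, (0, 1): 2.0,
    (1, 0): 6.0, (1, 1): 3.0,
}

def train(episodes=2000, alpha=0.1, eps=0.2):
    """Epsilon-greedy bandit-style learner: Q tracks mean observed
    latency per (state, action); greedy picks the lower estimate."""
    Q = {k: 0.0 for k in LATENCY}
    for _ in range(episodes):
        s = random.randint(0, 1)
        a = random.randint(0, 1) if random.random() < eps else \
            min((0, 1), key=lambda x: Q[(s, x)])
        latency = LATENCY[(s, a)] + random.gauss(0, 0.1)  # noisy observation
        Q[(s, a)] += alpha * (latency - Q[(s, a)])
    return Q

Q = train()
policy = {s: min((0, 1), key=lambda a: Q[(s, a)]) for s in (0, 1)}
```

After training, the learned policy keeps small tasks local and offloads large ones; the actual DS-DRL technique layers a fast and a slow learning schedule on top of this kind of decision.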
The rapid expansion of the Internet of Things (IoT) has introduced significant security challenges due to the scale, complexity, and heterogeneity of interconnected devices. Traditional centralized security models are ill-suited to dealing with these threats, especially in decentralized applications where IoT devices may operate on minimal resources. Emerging technologies, including Artificial Intelligence (AI), blockchain, edge computing, and Zero-Trust Architecture (ZTA), offer potential solutions, helping with real-time threat detection, data integrity, and system resilience. AI offers sophisticated anomaly detection and predictive analytics, and blockchain delivers decentralized, tamper-proof assurance over device communication and information exchange. Edge computing enables low-latency processing by distributing the computational workload and moving it near the devices. ZTA enhances security by continuously verifying each device and user on the network, adhering to the "never trust, always verify" ideology. This paper reviews these technologies, examining how they are used to secure IoT ecosystems, the issues of such integration, and the possibility of developing a multi-layered, adaptive security structure. Major concerns, such as scalability, resource limitations, and interoperability, are identified, and ways to optimize the application of AI, blockchain, and edge computing in zero-trust IoT systems in the future are discussed.
Nowadays, Multi-Robotic Systems (MRS) consisting of robots of different shapes, sizes, and capabilities have received significant attention from researchers and are being deployed in a variety of real-world applications. Advances ranging from sensors and actuators improved by communication technologies to powerful computing systems utilizing advanced Artificial Intelligence (AI) algorithms have rapidly driven the development of MRS, so the Internet of Things (IoT) in MRS has become a new topic, namely the Internet of Robotic Things (IoRT). This paper presents a comprehensive survey of state-of-the-art technologies for mobile robots, including general architecture, benefits, challenges, practical applications, and future research directions. In addition, notable research on i) multi-robot navigation, ii) network architecture, routing protocols, and communications, and iii) coordination among robots as well as data analysis via external computing (cloud, fog, edge, edge-cloud) is merged with the IoRT architecture according to its applicability. Moreover, security is a long-term challenge for IoRT because of various attack vectors, security flaws, and vulnerabilities; security threats, attacks, and existing solutions based on IoRT architectures are also scrutinized. The identification of environmental situations that are crucial for all types of IoRT applications, such as the detection of objects, humans, and obstacles, is also critically reviewed. Finally, future research directions are given by analyzing the challenges of IoRT in mobile robots.
The world is rapidly changing with the advance of information technology. The expansion of the Internet of Things (IoT) is a huge step in the development of the smart city. The IoT consists of connected devices that transfer information. The IoT architecture permits on-demand access to a public pool of resources. Cloud computing plays a vital role in developing IoT-enabled smart applications. The integration of cloud computing enhances the offering of distributed resources in the smart city. Improper management of the security requirements of cloud-assisted IoT systems can bring about risks to availability, security, performance, confidentiality, and privacy. A key reason for the failure of cloud- and IoT-enabled smart city applications is improper security practices at the early stages of development. This article proposes a framework to collect security requirements during the initial development phase of cloud-assisted IoT-enabled smart city applications. Its three-layered architecture includes privacy-preserved stakeholder analysis (PPSA), security requirement modeling and validation (SRMV), and secure cloud-assistance (SCA). A case study highlights the applicability and effectiveness of the proposed framework. A hybrid survey enables the identification and evaluation of significant challenges.
Due to its decentralized, tamper-proof, and trust-free characteristics, blockchain is used in the Internet of Things (IoT) to guarantee the reliability of data. However, some technical flaws in blockchain itself hinder the development of these applications, such as the linearly growing storage requirement of blockchain systems. On the other hand, sensor devices in the IoT lack storage resources, and numerous sensor devices generate massive data at ultra-high speed, which makes the storage problem of blockchain-enabled IoT more prominent. There are various solutions that reduce the storage burden by modifying the blockchain's storage policy, but most of them do not consider the willingness of peers. In an attempt to make the blockchain more compatible with the IoT, this paper proposes a storage optimization scheme that revisits the system data storage problem from a more practically oriented standpoint: peers only store transactional data that they are directly involved in. In addition, a transaction verification model is developed to enable peers to undertake transaction verification with the aid of cloud computing, and an incentive mechanism built on the storage optimization scheme assures data integrity. The results of simulation experiments demonstrate the proposed scheme's advantages in terms of storage and throughput.
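The storage policy described above, where each peer keeps only the transactions it is directly involved in, can be sketched directly; the transaction fields below are hypothetical:

```python
def prune_ledger(ledger, peer_id):
    """Storage-optimization policy from the abstract: a peer keeps only
    the transactions it participated in (here, as sender or receiver)."""
    return [tx for tx in ledger if peer_id in (tx["from"], tx["to"])]

# A toy shared ledger; each peer would prune its own local copy.
ledger = [
    {"id": 1, "from": "A", "to": "B"},
    {"id": 2, "from": "B", "to": "C"},
    {"id": 3, "from": "C", "to": "A"},
]
kept = prune_ledger(ledger, "A")
```

Peer A stores only transactions 1 and 3; verifying transactions it no longer stores is what the paper's cloud-assisted verification model and incentive mechanism are for.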
Pervasive schemes are significant techniques that allow intelligent communication among devices without any human intervention. Recently, the Internet of Vehicles (IoV) has been introduced as an application of pervasive computing that addresses road safety challenges. Vehicles participating in the IoV are embedded with a wide range of sensors which operate in a real-time environment to improve road safety. Various mechanisms have been proposed which allow automatic actions based on the uncertainty of sensory and managed data. Due to the lack of existing transportation integration schemes, the IoV has not been completely explored by business organizations. In order to tackle this problem, we propose a novel trusted mechanism for IoV communication, sensing, and record storing. Our proposed method uses trust-based analysis and subjective logic functions with the aim of creating a trusted environment for vehicles to communicate. In addition, the subjective logic function is integrated with a multi-attribute Simple Additive Weighting (SAW) scheme to improve the decision metrics for authenticating nodes. The trust analysis depends on a variety of metrics to ensure accurate identification of legitimate vehicles embedded in the IoT device ecosystem. The proposed scheme is determined and verified rigorously through various IoT devices and decision-making metrics against a baseline solution. The simulation results show that the proposed scheme leads to an 88% improvement over the baseline approach in the identification of legitimate nodes, road accidents, and message alteration records during data transmission among vehicles.
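The combination of a subjective-logic trust opinion with Simple Additive Weighting can be sketched as below. The expectation formula E = b + a·u is the standard subjective-logic form; the attribute set, weights, threshold, and vehicle metrics are illustrative assumptions rather than the paper's parameters:

```python
def expected_trust(b, d, u, base_rate=0.5):
    """Subjective-logic expectation E = b + a*u for an opinion
    (belief b, disbelief d, uncertainty u) with b + d + u = 1."""
    assert abs(b + d + u - 1.0) < 1e-9
    return b + base_rate * u

def saw_score(attributes, weights):
    """Simple Additive Weighting over normalized attribute values."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * a for w, a in zip(weights, attributes))

# Hypothetical per-vehicle attributes: trust expectation, signal quality,
# interaction history (all normalized to [0, 1]).
vehicles = {
    "V1": [expected_trust(0.7, 0.1, 0.2), 0.9, 0.8],
    "V2": [expected_trust(0.3, 0.5, 0.2), 0.6, 0.4],
}
weights = [0.5, 0.3, 0.2]
scores = {v: saw_score(attrs, weights) for v, attrs in vehicles.items()}
legitimate = [v for v, s in scores.items() if s > 0.6]  # assumed threshold
```

Folding the subjective-logic expectation in as one SAW attribute is one plausible reading of "integrated with a multi-attribute SAW scheme"; the uncertainty mass u keeps tentative opinions from scoring as high as confident ones.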
The Internet of Medical Things (IoMT) is regarded as a critical technology for intelligent healthcare in the foreseeable 6G era. Nevertheless, due to the limited computing power of edge devices and task-related coupling relationships, IoMT faces unprecedented challenges. Considering the associative connections among tasks, this paper proposes a computation offloading policy for multiple user devices (UDs) that exploits device-to-device (D2D) communication and a multi-access edge computing (MEC) technique under an IoMT scenario. Specifically, to minimize the total delay and energy consumption with respect to the requirements of IoMT, we first analyze and model the detailed local execution, MEC execution, D2D execution, and associated task offloading exchange models. Consequently, the associated task offloading scheme for multiple UDs is formulated as a mixed-integer nonconvex optimization problem. Considering the advantages of deep reinforcement learning (DRL) in processing tasks with coupling relationships, a Double-DQN-based associative tasks computing offloading (DDATO) algorithm is then proposed to obtain the optimal solution, which can make the best offloading decision under the condition that the tasks of UDs are associative. Furthermore, to reduce the complexity of the DDATO algorithm, a cache-aided procedure is intentionally introduced before the data training process. This avoids redundant offloading and computing procedures for tasks that have already been cached by other UDs. In addition, we use a dynamic ε-greedy strategy in the action selection section of the algorithm, thus preventing the algorithm from falling into a locally optimal solution. Simulation results demonstrate that, compared with other existing methods for associative task models with different structures in the IoMT network, the proposed algorithm can lower the total cost more effectively and efficiently while also providing a tradeoff between delay and energy consumption tolerance.
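Two of the DDATO ingredients above, the dynamic ε-greedy schedule and the cache-aided skip of redundant offloading, are easy to sketch in isolation. The exponential decay form and the cache interface are assumptions for illustration, not the paper's exact design:

```python
import math

def dynamic_epsilon(step, eps_start=1.0, eps_end=0.05, decay=500):
    """Dynamic epsilon-greedy schedule: exploration decays exponentially
    from eps_start toward eps_end, helping the agent escape local optima
    early while exploiting late."""
    return eps_end + (eps_start - eps_end) * math.exp(-step / decay)

cache = {}

def offload(task_id, compute):
    """Cache-aided step: skip redundant offloading/computation when this
    task's result has already been cached (e.g., by another UD)."""
    if task_id in cache:
        return cache[task_id], True    # cache hit, no offload needed
    result = compute(task_id)          # stand-in for the offloaded job
    cache[task_id] = result
    return result, False

r1, hit1 = offload("ecg-42", lambda t: len(t))
r2, hit2 = offload("ecg-42", lambda t: len(t))
```

The second request for the same (hypothetical) task is served from the cache, which is exactly the redundancy the abstract says the cache-aided procedure removes before training.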
In order to overcome the high latency of traditional cloud computing and the processing capacity limitations of Internet of Things (IoT) users, Multi-access Edge Computing (MEC) migrates computing and storage capabilities from the remote data center to the edge of the network, providing users with computation services quickly and directly. In this paper, we investigate the impact of the randomness caused by the movement of the IoT user on offloading decisions, where the connection between the IoT user and the MEC servers is uncertain. This uncertainty is the main obstacle to assigning the task accurately. Consequently, if the assigned task does not match the real connection time, a migration (when the connection time is not enough to finish processing) is caused. To address the impact of this uncertainty, we formulate the offloading decision as an optimization problem considering transmission, computation, and migration. With the help of Stochastic Programming (SP), we use a posteriori recourse to compensate for inaccurate predictions. Meanwhile, in heterogeneous networks, considering that multiple candidate MEC servers can be selected simultaneously due to overlapping coverage, we also introduce Multi-Armed Bandit (MAB) theory for MEC selection. Extensive simulations validate the improvement and effectiveness of the proposed SP-based Multi-armed bandit Method (SMM) for offloading in terms of reward, cost, energy consumption, and delay. The results show that SMM achieves about a 20% improvement compared with the traditional offloading method that does not consider randomness, and it also outperforms existing SP/MAB-based methods for offloading.
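The MAB side of the SMM approach can be illustrated with a standard UCB1 selector over candidate MEC servers. The reward model (a fixed success probability per server) and all parameter values are invented for the sketch; the paper couples this with stochastic programming, which is not reproduced here:

```python
import math
import random

random.seed(7)

def ucb1_select(counts, values, t):
    """UCB1: play each arm (MEC server) once, then pick the arm that
    maximizes mean reward plus an exploration bonus."""
    for i, n in enumerate(counts):
        if n == 0:
            return i
    return max(range(len(counts)),
               key=lambda i: values[i] + math.sqrt(2 * math.log(t) / counts[i]))

# Hypothetical per-server mean rewards (e.g., successful-offload rate).
TRUE = [0.3, 0.7, 0.5]
counts, values = [0] * 3, [0.0] * 3
for t in range(1, 3001):
    i = ucb1_select(counts, values, t)
    r = 1.0 if random.random() < TRUE[i] else 0.0   # Bernoulli reward
    counts[i] += 1
    values[i] += (r - values[i]) / counts[i]        # incremental mean

best = max(range(3), key=lambda i: counts[i])
```

Over time the selector concentrates its pulls on the best server while still occasionally probing the others, which is what makes bandit selection suitable for the uncertain, overlapping-coverage setting the abstract describes.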
With the increased emphasis on data security in the Internet of Things (IoT), blockchain has received more and more attention. Due to the computation-intensive nature of blockchain, mobile edge computing (MEC) is integrated into the IoT. However, how to efficiently use edge computing resources to process the blockchain computing tasks of IoT devices has not been fully studied. In this paper, a MEC- and blockchain-enhanced IoT is considered. Transactions recording data or other application information are generated by IoT devices and offloaded to the MEC servers to join the blockchain. The practical Byzantine fault tolerance (PBFT) consensus mechanism is used among all the MEC servers, which are also the blockchain nodes, and the latency of the consensus process is modeled with consideration of the characteristics of the wireless network. The joint optimization problem of serving base station (BS) selection and wireless transmission resource allocation is modeled as a Markov decision process (MDP), and the long-term system utility is defined based on task reward, credit value, the latency of the infrastructure and blockchain layers, and computing cost. A double deep Q-learning (DQN) based transaction offloading algorithm (DDQN-TOA) is proposed, and simulation results show the advantages of the proposed algorithm in comparison to other methods.
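For context on the PBFT consensus run among the MEC servers, the standard sizing relations (n ≥ 3f + 1 replicas tolerate f Byzantine nodes, with a quorum of 2f + 1) and a simplified per-round message count can be sketched as follows. The message-count model covers only the pre-prepare, prepare, and commit phases and ignores view changes and client replies:

```python
def pbft_params(n):
    """PBFT sizing for n replicas: tolerated faults f, vote quorum,
    and an idealized message count for one consensus round."""
    f = (n - 1) // 3          # n >= 3f + 1
    quorum = 2 * f + 1        # matching votes needed per phase
    # pre-prepare: primary -> n-1 backups; prepare and commit:
    # each replica broadcasts to the other n-1 replicas.
    messages = (n - 1) + n * (n - 1) + n * (n - 1)
    return f, quorum, messages

f, quorum, messages = pbft_params(4)
```

The O(n²) broadcast phases are why the paper's latency model matters: every extra MEC server adds roughly 2n messages per round over the wireless network.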
In many Industrial Internet of Things (IIoT) architectures, various devices connect to the edge cloud via gateway systems, and large volumes of data are delivered to the edge cloud for processing. Delivering data to an appropriate edge cloud is critical to improving IIoT service efficiency. There are two types of costs in this kind of IoT network: a communication cost and a computing cost. For service efficiency, the communication cost of data transmission should be minimized, and the computing cost in the edge cloud should also be minimized. Therefore, in this paper, the communication cost of data transmission is defined as a delay factor, and the computing cost in the edge cloud is defined as the waiting time for the computing intensity. The proposed method selects an edge cloud that minimizes the total of the communication and computing costs; that is, a device chooses a routing path to the selected edge cloud based on these costs. The proposed method controls the data flows in a mesh-structured network and appropriately distributes the data processing load. Its performance is validated through extensive computer simulation. When the transition probability from good to bad is 0.3 and the transition probability from bad to good is 0.7 in the wireless and edge cloud states, the proposed method reduces both the average delay and the service pause count to about 25% of those of the existing method.
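The selection rule above, minimizing communication cost (a delay factor) plus computing cost (waiting time for the computing intensity), can be sketched directly. The bandwidth, propagation delay, and queue parameters are illustrative assumptions, not the paper's values:

```python
def select_edge_cloud(clouds, data_size):
    """Pick the edge cloud minimizing communication cost (transmission
    plus propagation delay) plus computing cost (expected wait for the
    queued computing load)."""
    def total_cost(c):
        comm = data_size / c["bandwidth"] + c["prop_delay"]
        comp = c["queue_len"] / c["service_rate"]
        return comm + comp
    return min(clouds, key=total_cost), min(total_cost(c) for c in clouds)

clouds = [
    {"name": "EC1", "bandwidth": 10.0, "prop_delay": 1.0,
     "queue_len": 8, "service_rate": 2.0},
    {"name": "EC2", "bandwidth": 5.0, "prop_delay": 0.5,
     "queue_len": 2, "service_rate": 2.0},
]
best, cost = select_edge_cloud(clouds, data_size=20.0)
```

Note that the nearer, faster-link cloud (EC1) loses here because its computing queue is longer, which is exactly the communication/computing tradeoff the abstract balances.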
Puncturing has been recognized as a promising technology to cope with the coexistence problem of enhanced mobile broadband (eMBB) and ultra-reliable low-latency communications (URLLC) traffic. However, maintaining steady performance for eMBB traffic while meeting the requirements of URLLC traffic under puncturing is a major challenge in some realistic scenarios. In this paper, we pay attention to timely and energy-efficient processing of eMBB traffic in the Industrial Internet of Things (IIoT), where mobile edge computing (MEC) is employed for data processing. Specifically, the performance of eMBB traffic and URLLC traffic in a MEC-based IIoT system is ensured by setting thresholds on tolerable delay and outage probability, respectively. Furthermore, considering the limited energy supply, an energy minimization problem for the eMBB device is formulated under the above constraints by jointly optimizing the resource blocks (RBs) punctured by URLLC traffic, data offloading, and the transmit power of the eMBB device. With Markov's inequality, the problem is reformulated by transforming the probabilistic outage constraint into a deterministic constraint. Meanwhile, an iterative energy minimization algorithm (IEMA) is proposed. Simulation results demonstrate that our algorithm achieves a significant reduction in the energy consumption of the eMBB device and a better overall effect compared to several benchmarks.
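The Markov-inequality step above can be made concrete: since P(D ≥ d_max) ≤ E[D]/d_max for a nonnegative delay D, enforcing the deterministic constraint E[D] ≤ ε·d_max guarantees the probabilistic outage constraint P(D ≥ d_max) ≤ ε. A quick empirical check under an assumed exponential delay model (the distribution and thresholds are illustrative, not the paper's system model):

```python
import random

random.seed(3)

def markov_bound_ok(mean_delay, d_max, eps):
    """Deterministic surrogate for the outage constraint: by Markov's
    inequality, E[D] <= eps * d_max implies P(D >= d_max) <= eps."""
    return mean_delay <= eps * d_max

# Empirical check on an exponential delay model with E[D] = 1.0.
samples = [random.expovariate(1.0) for _ in range(100000)]
d_max, eps = 10.0, 0.2
mean = sum(samples) / len(samples)
outage = sum(s >= d_max for s in samples) / len(samples)
```

The bound is conservative, as the example shows: the empirical outage is far below both the Markov bound E[D]/d_max and the target ε, which is the price paid for turning a probabilistic constraint into a tractable deterministic one.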
The Internet of Things (IoT) is emerging as an attractive paradigm involving physical perceptions, cyber interactions, social correlations and even cognitive thinking through a cyber-physical-social-thinking hyperspace. In this context, energy management with the purposes of energy saving and high efficiency is a challenging issue. In this work, a taxonomy model is established in reference to the IoT layers (i.e., sensor-actuator layer, network layer, and application layer), and IoT energy management is addressed from the perspectives of supply and demand to achieve green perception, communication, and computing. A smart home scenario is presented as a case study involving the main enabling technologies with supply-side, demand-side, and supply-demand balance considerations, and open issues in the field of IoT energy management are also discussed.
The advancement of the Internet of Things (IoT) brings new opportunities for collecting real-time data and deploying machine learning models. Nonetheless, an individual IoT device may not have adequate computing resources to train and deploy an entire learning model. At the same time, transmitting continuous real-time data to a central server with high computing resources incurs enormous communication costs and raises issues of data security and privacy. Federated learning, a distributed machine learning framework, is a promising solution for training machine learning models with resource-limited devices and edge servers. Yet, the majority of existing works assume an impractically synchronous parameter update manner with homogeneous IoT nodes under stable communication connections. In this paper, we develop an asynchronous federated learning scheme to improve training efficiency for heterogeneous IoT devices under unstable communication networks. In particular, we formulate an asynchronous federated learning model and develop a lightweight node selection algorithm to carry out learning tasks effectively. The proposed algorithm iteratively selects heterogeneous IoT nodes to participate in the global learning aggregation while considering their local computing resources and communication conditions. Extensive experimental results demonstrate that our proposed asynchronous federated learning scheme outperforms the state-of-the-art schemes in various settings, on both independent and identically distributed (i.i.d.) and non-i.i.d. data distributions.
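One common way to realize asynchronous aggregation with heterogeneous nodes is to mix each update into the global model as it arrives, down-weighted by its staleness. The polynomial staleness weight below is a frequently used choice and an assumption here, not necessarily the scheme proposed in the paper; models are reduced to toy parameter vectors:

```python
def staleness_weight(staleness, a=0.5):
    """Down-weight stale updates: w = (staleness + 1) ** -a, so an update
    computed against an old global model contributes less."""
    return (staleness + 1) ** -a

def async_aggregate(global_model, update, staleness, base_lr=0.5):
    """Asynchronously mix one node's update into the global model on
    arrival, scaled by the staleness weight (no synchronization round)."""
    lr = base_lr * staleness_weight(staleness)
    return [g + lr * (u - g) for g, u in zip(global_model, update)]

model = [0.0, 0.0]
model = async_aggregate(model, [1.0, 1.0], staleness=0)   # fresh update
model = async_aggregate(model, [4.0, 4.0], staleness=9)   # stale update
```

The fresh update moves the global model at the full mixing rate, while the stale one, despite proposing a much larger change, is heavily discounted; a node selection algorithm like the paper's would additionally decide which nodes get to contribute at all.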
Blockchain technology has become a research hotspot in recent years, with prominent characteristics such as being public, distributed, and decentralized. Blockchain-enabled Internet of Things (BIoT) has the potential to revolutionize the Internet of Things (IoT), which requires distributed, trustless consensus. However, scalability and security issues become particularly important with the dramatically increasing number of IoT devices. Especially with the development of quantum computing, many extant cryptographic algorithms applied in blockchain or BIoT systems are vulnerable to quantum attacks. In this paper, an anti-quantum proxy blind signature scheme based on lattice cryptography is proposed, which can provide user anonymity and untraceability in distributed BIoT applications. A security proof shows that the proposed scheme is secure in the random oracle model, and an efficiency analysis indicates that it is more efficient than similar schemes in the literature.
Funding: funded by Researchers Supporting Project Number (RSPD2025R947), King Saud University, Riyadh, Saudi Arabia.
Abstract: Effective resource management in the Internet of Things and fog computing is essential for efficient and scalable networks. However, existing methods often fail in dynamic and high-demand environments, leading to resource bottlenecks and increased energy consumption. This study aims to address these limitations by proposing the Quantum-Inspired Adaptive Resource Management (QIARM) model, which introduces novel algorithms inspired by quantum principles for enhanced resource allocation. QIARM employs a quantum superposition-inspired technique for multi-state resource representation and an adaptive learning component to dynamically adjust resources in real time. In addition, an energy-aware scheduling module minimizes power consumption by selecting optimal configurations based on energy metrics. The simulation was carried out in a 360-minute environment with eight distinct scenarios. This study introduces a novel quantum-inspired resource management framework that achieves up to 98% task offload success and reduces energy consumption by 20%, addressing critical challenges of scalability and efficiency in dynamic fog computing environments.
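The energy-aware scheduling idea (select the optimal configuration based on energy metrics) can be illustrated with a minimal sketch. This is not the paper's QIARM algorithm; the configuration names, capacities, and energy numbers are invented for illustration:

```python
# Illustrative sketch: among fog-node configurations that can cover the
# current demand, pick the one with the lowest energy cost.
def pick_config(configs, demand):
    """configs: list of (name, capacity, energy_per_min) tuples."""
    feasible = [c for c in configs if c[1] >= demand]
    if not feasible:
        return None  # no single configuration can serve this demand
    return min(feasible, key=lambda c: c[2])[0]

configs = [("low", 10, 2.0), ("mid", 25, 3.5), ("high", 60, 7.0)]
print(pick_config(configs, 20))  # cheapest configuration covering demand 20
```

An adaptive variant would re-run this selection as demand estimates are updated in real time.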
Funding: funded by the Key Research and Development Plan of Jiangsu Province (Social Development) No. BE20217162 and the Jiangsu Modern Agricultural Machinery Equipment and Technology Demonstration and Promotion Project No. NJ2021-19.
Abstract: With the rapid development of the Internet of Things (IoT), there are several challenges pertaining to security in IoT applications. Compared with the traditional Internet, the IoT has many problems, such as large numbers of assets, complex and diverse structures, and a lack of computing resources. Traditional network intrusion detection systems cannot meet the security needs of IoT applications. In view of this situation, this study applies cloud computing and machine learning to the intrusion detection system of the IoT to improve detection performance. Traditional intrusion detection algorithms usually require considerable training time and are not suitable for cloud computing due to the limited computing power and storage capacity of cloud nodes; therefore, it is necessary to study lightweight intrusion detection algorithms with short training times and high detection accuracy for deployment and application on cloud nodes. An appropriate classification algorithm is a primary factor for deploying cloud computing intrusion prevention systems and a prerequisite for the system to respond to intrusions and reduce intrusion threats. This paper discusses the problems related to IoT intrusion prevention in cloud computing environments. Based on an analysis of cloud computing security threats, this study extensively explores IoT intrusion detection, cloud node monitoring, and intrusion response in cloud computing environments by using cloud computing, an improved extreme learning machine, and other methods. We use the Multi-Feature Extraction Extreme Learning Machine (MFE-ELM) algorithm for cloud computing, which adds a multi-feature extraction process on cloud servers, and deploy the MFE-ELM algorithm on cloud nodes to detect and discover network intrusions against cloud nodes. In our simulation experiments, a classical intrusion detection dataset is selected as a test, and steps such as data preprocessing, feature engineering, model training, and result analysis are performed. The experimental results show that the proposed algorithm can effectively detect and identify most network data packets with good model performance and achieve efficient intrusion detection for heterogeneous IoT data from cloud nodes. Furthermore, it enables the cloud server to discover nodes with serious security threats in the cloud cluster in real time, so that further security protection measures can be taken to obtain the optimal intrusion response strategy for the cloud cluster.
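For context, a plain extreme learning machine (the base of the MFE-ELM variant above) trains in closed form: the hidden layer is random and only the output weights are solved by least squares, which is what makes the training time short. The sketch below is a generic textbook ELM on toy data, not the paper's MFE-ELM:

```python
# Minimal extreme learning machine sketch: random hidden layer, output
# weights obtained in one shot via the Moore-Penrose pseudoinverse.
import numpy as np

def elm_train(X, Y, hidden=32, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], hidden))   # random, never trained
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)                      # random hidden features
    beta = np.linalg.pinv(H) @ Y                # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 8))
Y = (X[:, 0] > 0).astype(float).reshape(-1, 1)  # toy binary label
W, b, beta = elm_train(X, Y)
acc = ((elm_predict(X, W, b, beta) > 0.5) == (Y > 0.5)).mean()
print(f"train accuracy: {acc:.2f}")
```

The multi-feature-extraction step of MFE-ELM would sit in front of this, enriching `X` before the random projection.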
Funding: The authors extend their appreciation to the Deputyship for Research and Innovation, Ministry of Education in Saudi Arabia for funding this research work through project number (442/204).
Abstract: In this paper, the Internet of Medical Things (IoMT) is identified as a promising solution, which integrates with the cloud computing environment to provide remote health monitoring solutions and improve the quality of service (QoS) in the healthcare sector. However, problems with present architectural models, such as those related to energy consumption, service latency, execution cost, and resource usage, remain a major concern for adopting IoMT applications. To address these problems, this work presents a four-tier IoMT-edge-fog-cloud architecture along with an optimization model formulated using Mixed Integer Linear Programming (MILP), with the objective of efficiently processing and placing IoMT applications in the edge-fog-cloud computing environment while maintaining certain quality standards (e.g., energy consumption, service latency, network utilization). A modeling environment is used to assess and validate the proposed model by considering different traffic loads and processing requirements. In comparison to other existing models, the performance analysis of the proposed approach shows a maximum saving of 38% in energy consumption and a 73% reduction in service latency. The results also highlight that offloading IoMT applications to the edge and fog nodes rather than the cloud is highly dependent on the tradeoff between the network journey time saved and the extra power consumed by edge or fog resources.
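The paper's actual model is a MILP; the toy heuristic below only illustrates the latency/energy tradeoff it optimizes, with made-up tier numbers: each task goes to the cheapest tier (in energy) whose latency still meets the task's deadline.

```python
# Toy placement heuristic in the spirit of the four-tier idea (NOT the
# paper's MILP): cheapest tier that satisfies each task's latency bound.
def place(tasks, tiers):
    """tiers: {name: (latency, energy)}; tasks: {name: max_latency}."""
    plan = {}
    for task, max_lat in tasks.items():
        ok = {t: e for t, (lat, e) in tiers.items() if lat <= max_lat}
        plan[task] = min(ok, key=ok.get) if ok else None
    return plan

tiers = {"edge": (5, 3.0), "fog": (15, 2.0), "cloud": (60, 1.0)}
tasks = {"ecg-alert": 10, "daily-report": 120}
print(place(tasks, tiers))  # tight deadline -> edge; loose -> cheap cloud
```

Unlike this greedy sketch, the MILP considers all tasks jointly under shared capacity and network-utilization constraints.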
Abstract: The emergence of different computing methods such as cloud-, fog-, and edge-based Internet of Things (IoT) systems has provided the opportunity to develop intelligent systems for disease detection. Compared to other machine learning models, deep learning models have gained more attention from the research community, as they have shown better results with large volumes of data than shallow learning. However, no comprehensive survey has been conducted on integrated IoT- and computing-based systems that deploy deep learning for disease detection. This study evaluated different machine learning and deep learning algorithms, as well as their hybrid and optimized variants, for IoT-based disease detection, using the most recent papers on IoT-based disease detection systems that include computing approaches such as cloud, edge, and fog. The analysis focused on an IoT deep learning architecture suitable for disease detection. It also identifies the different factors that require the attention of researchers to develop better IoT disease detection systems. This study can be helpful to researchers interested in developing better deep learning-based IoT disease detection and prediction systems using hybrid algorithms.
Funding: supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) Information Technology Research Center (ITRC) grant funded by the Korea government (IITP-2025-RS-2021-II211816); supported by the National Research Foundation of Korea (NRF) grant (NRF-2023R1A2C1004453); funded by the European Union's HORIZON-JUSNS-2023 HE research and innovation program (6G-Path project, Grant No. 101139172) and the Horizon 2020 Research and Innovation Program (aerOS project, Grant No. 101069732); supported by the ESTIMED project, conducted by ETSI Specialist Task Force 685 (STF 685) and funded by the European Commission (EC) and the European Free Trade Association (EFTA).
Abstract: Edge computing is swiftly gaining traction and is being standardised by the European Telecommunications Standards Institute (ETSI) as Multi-access Edge Computing (MEC). Simultaneously, oneM2M has been actively developing standards for dynamic data management and IoT services at the edge, particularly for applications that require real-time support and security. Integrating MEC and oneM2M offers a unique opportunity to maximize their individual strengths. Therefore, this article proposes a framework that integrates the MEC and oneM2M standard platforms for IoT applications, demonstrating how the synergy of these architectures can leverage the geographically distributed computing resources at base stations, enabling efficient deployment and added value for time-sensitive IoT applications. In addition, this study offers a concept of potential interworking models between the oneM2M and MEC architectures. The adoption of these standard architectures can enable various IoT edge services, such as smart city mobility and real-time analytics functions, by leveraging the oneM2M common service layer instantiated on the MEC host.
Abstract: The rapid advancement of technology has paved the way for innovative approaches to education. Artificial intelligence (AI), the Internet of Things (IoT), and cloud computing are three transformative technologies reshaping how education is delivered, accessed, and experienced. These technologies enable personalized learning, optimize teaching processes, and make educational resources more accessible to learners worldwide. This paper examines the integration of these technologies into smart education systems, highlighting their applications, benefits, and challenges, and exploring their potential to bridge gaps in educational equity and inclusivity.
Abstract: Smart edge computing (SEC) is a novel computing paradigm that can transfer cloud-based applications to the edge network, supporting computation-intensive services like face detection and natural language processing. A core feature of mobile edge computing, SEC improves user experience and device performance by offloading local activities to edge processors. In this framework, blockchain technology is utilized to ensure secure and trustworthy communication between edge devices and servers, protecting against potential security threats. Additionally, deep learning algorithms are employed to analyze resource availability and dynamically optimize computation offloading decisions. IoT applications that require significant resources can benefit from SEC, which has better coverage. Although access is constantly changing and network devices have heterogeneous resources, it is not easy to create consistent, dependable, and instantaneous communication between edge devices and their processors, specifically in 5G Heterogeneous Network (HN) situations. Thus, an Intelligent Management of Resources for Smart Edge Computing (IMRSEC) framework, which combines blockchain, edge computing, and Artificial Intelligence (AI) in 5G HNs, is proposed in this paper. Accordingly, a unique dual-schedule deep reinforcement learning (DS-DRL) technique is developed, consisting of a rapid schedule learning process and a slow schedule learning process. The primary objective is to minimize overall offloading latency and system resource usage by optimizing computation offloading, resource allocation, and application caching. Simulation results demonstrate that the DS-DRL approach reduces task execution time by 32%, validating the method's effectiveness within the IMRSEC framework.
Funding: The authors thank the Deanship of Graduate Studies and Scientific Research at Qassim University for financial support (QU-APC-2025).
Abstract: The rapid expansion of the Internet of Things (IoT) has introduced significant security challenges due to the scale, complexity, and heterogeneity of interconnected devices. Traditional centralized security models are ill-suited to dealing with these threats, especially in decentralized applications where IoT devices may operate on minimal resources. The emergence of new technologies, including Artificial Intelligence (AI), blockchain, edge computing, and Zero-Trust Architecture (ZTA), offers potential solutions, improving threat detection, data integrity, and system resilience in real time. AI offers sophisticated anomaly detection and predictive analytics, and blockchain delivers decentralized and tamper-proof assurance of device communication and information exchange. Edge computing enables low-latency processing by distributing the computational workload and moving it near the devices. ZTA enhances security by continuously verifying each device and user on the network, adhering to the “never trust, always verify” ideology. This paper reviews these technologies, examining how they are used to secure IoT ecosystems, the issues raised by such integration, and the possibility of developing a multi-layered, adaptive security structure. Major concerns, such as scalability, resource limitations, and interoperability, are identified, and ways to optimize the application of AI, blockchain, and edge computing in zero-trust IoT systems in the future are discussed.
Funding: This research was supported by the Ministry of Higher Education, Malaysia (MoHE) through the Fundamental Research Grant Scheme (FRGS/1/2021/TK0/UTAR/02/9). The work was also supported by the Universiti Tunku Abdul Rahman (UTAR), Malaysia, under the UTAR Research Fund (UTARRF) (IPSR/RMC/UTARRF/2021C1/T05).
Abstract: Nowadays, Multi-Robot Systems (MRS) consisting of robots of different shapes, sizes, and capabilities have received significant attention from researchers and are being deployed in a variety of real-world applications. Advances ranging from sensors and actuators improved by communication technologies to powerful computing systems utilizing advanced Artificial Intelligence (AI) algorithms have rapidly driven the development of MRS, so the Internet of Things (IoT) in MRS has become a new topic, namely the Internet of Robotic Things (IoRT). This paper presents a comprehensive survey of state-of-the-art technologies for mobile robots, including general architecture, benefits, challenges, practical applications, and future research directions. In addition, remarkable research on i) multi-robot navigation, ii) network architecture, routing protocols, and communications, and iii) coordination among robots, as well as data analysis via external computing (cloud, fog, edge, edge-cloud), is merged with the IoRT architecture according to its applicability. Moreover, security is a long-term challenge for IoRT because of various attack vectors, security flaws, and vulnerabilities; security threats, attacks, and existing solutions based on IoRT architectures are also under scrutiny. Furthermore, the identification of environmental situations that are crucial for all types of IoRT applications, such as the detection of objects, humans, and obstacles, is critically reviewed. Finally, future research directions are given by analyzing the challenges of IoRT in mobile robots.
Funding: Taif University Researchers Supporting Project No. (TURSP-2020/126), Taif University, Taif, Saudi Arabia.
Abstract: The world is rapidly changing with the advance of information technology. The expansion of the Internet of Things (IoT) is a huge step in the development of the smart city. The IoT consists of connected devices that transfer information. The IoT architecture permits on-demand services to a public pool of resources. Cloud computing plays a vital role in developing IoT-enabled smart applications. The integration of cloud computing enhances the offering of distributed resources in the smart city. Improper management of the security requirements of cloud-assisted IoT systems can bring about risks to availability, security, performance, confidentiality, and privacy. The key reason for cloud- and IoT-enabled smart city application failure is improper security practices at the early stages of development. This article proposes a framework to collect security requirements during the initial development phase of cloud-assisted IoT-enabled smart city applications. Its three-layered architecture includes privacy-preserved stakeholder analysis (PPSA), security requirement modeling and validation (SRMV), and secure cloud-assistance (SCA). A case study highlights the applicability and effectiveness of the proposed framework. A hybrid survey enables the identification and evaluation of significant challenges.
Funding: supported by the National Natural Science Foundation of China (Nos. 62172182, 62202118, 61962009), the Top Technology Talent Project of the Guizhou Education Department (Qian jiao ji [2022]073), and the Opening Foundation of the Key Laboratory of Intelligent Control Technology for Wuling-Mountain Ecological Agriculture in Hunan Province (Grant No. ZNKZN2021-07).
Abstract: Due to its decentralized, tamper-proof, and trust-free characteristics, blockchain is used in the Internet of Things (IoT) to guarantee the reliability of data. However, some technical flaws in blockchain itself hinder the development of these applications, such as the linearly growing storage requirement of blockchain systems. On the other hand, sensor devices in the IoT lack storage resources, and numerous sensor devices generate massive data at ultra-high speed, which makes the storage problem of blockchain-enabled IoT even more prominent. There are various solutions that reduce the storage burden by modifying the blockchain's storage policy, but most of them do not consider the willingness of peers. In an attempt to make blockchain more compatible with the IoT, this paper proposes a storage optimization scheme that revisits the system's data storage problem from a more practically oriented standpoint: peers only store transactional data that they are directly involved in. In addition, a transaction verification model is developed to enable peers to undertake transaction verification with the aid of cloud computing, and an incentive mechanism premised on the storage optimization scheme assures data integrity. The results of the simulation experiments demonstrate the proposed scheme's advantages in terms of storage and throughput.
Funding: funded by Abu Dhabi University, Faculty Research Incentive Grant (19300483, Adel Khelifi), United Arab Emirates. Link to sponsor website: https://www.adu.ac.ae/research/research-at-adu/overview.
Abstract: Pervasive schemes are important techniques that allow intelligent communication among devices without any human intervention. Recently, the Internet of Vehicles (IoV) has been introduced as an application of pervasive computing that addresses road safety challenges. Vehicles participating in the IoV are embedded with a wide range of sensors which operate in a real-time environment to improve road safety. Various mechanisms have been proposed which allow automatic actions based on the uncertainty of sensory and managed data. Due to the lack of existing transportation integration schemes, the IoV has not been completely explored by business organizations. To tackle this problem, we propose a novel trusted mechanism for IoV communication, sensing, and record storing. Our proposed method uses trust-based analysis and subjective logic functions with the aim of creating a trusted environment for vehicles to communicate. In addition, the subjective logic function is integrated with a multi-attribute SAW scheme to improve the decision metrics for authenticating nodes. The trust analysis depends on a variety of metrics to ensure accurate identification of legitimate vehicles embedded in the IoT device ecosystem. The proposed scheme is verified rigorously through various IoT devices and decision-making metrics against a baseline solution. The simulation results show that the proposed scheme leads to an 88% improvement in the identification of legitimate nodes, road accidents, and message alteration records during data transmission among vehicles compared to the baseline approach.
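The multi-attribute SAW step can be sketched concisely. Assuming SAW here means the standard Simple Additive Weighting rule (normalize each attribute, then take a weighted sum), and with invented vehicle names, attributes, and weights:

```python
# Hedged sketch of Simple Additive Weighting (SAW) for ranking candidate
# vehicles by trust attributes; all values and weights are illustrative.
def saw_rank(candidates, weights):
    """candidates: {name: [attr values]}, all attributes benefit-type."""
    cols = list(zip(*candidates.values()))
    maxima = [max(col) for col in cols]          # per-attribute normalizers
    scores = {
        name: sum(w * v / m for w, v, m in zip(weights, vals, maxima))
        for name, vals in candidates.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

vehicles = {"v1": [0.9, 0.5], "v2": [0.6, 0.9], "v3": [0.3, 0.4]}
print(saw_rank(vehicles, weights=[0.6, 0.4]))  # highest trust score first
```

In the paper's setting, the subjective logic opinions would feed these attribute values; cost-type attributes would be normalized as min/value instead.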
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 62071377, 62101442, 62201456), the Natural Science Foundation of Shaanxi Province (Grant Nos. 2023-YBGY-036, 2022JQ-687), and the Graduate Student Innovation Foundation Project of Xi'an University of Posts and Telecommunications under Grant CXJJDL2022003.
Abstract: The Internet of Medical Things (IoMT) is regarded as a critical technology for intelligent healthcare in the foreseeable 6G era. Nevertheless, due to the limited computing capability of edge devices and task-related coupling relationships, the IoMT faces unprecedented challenges. Considering the associative connections among tasks, this paper proposes a computing offloading policy for multiple user devices (UDs) that exploits device-to-device (D2D) communication and multi-access edge computing (MEC) under an IoMT scenario. Specifically, to minimize the total delay and energy consumption with respect to the requirements of the IoMT, we first analyze and model the detailed local execution, MEC execution, D2D execution, and associated task offloading exchange models. Consequently, the associated task offloading scheme for multiple UDs is formulated as a mixed-integer nonconvex optimization problem. Considering the advantages of deep reinforcement learning (DRL) in processing tasks with coupling relationships, a Double-DQN-based associative tasks computing offloading (DDATO) algorithm is then proposed to obtain the optimal solution, which can make the best offloading decision under the condition that the tasks of UDs are associative. Furthermore, to reduce the complexity of the DDATO algorithm, a cache-aided procedure is intentionally introduced before the data training process. This avoids redundant offloading and computing procedures for tasks that have already been cached by other UDs. In addition, we use a dynamic ε-greedy strategy in the action selection section of the algorithm, thus preventing the algorithm from falling into a locally optimal solution. Simulation results demonstrate that, compared with other existing methods for associative task models with different structures in the IoMT network, the proposed algorithm can lower the total cost more effectively and efficiently while also providing a tradeoff between delay and energy consumption tolerance.
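The Double-DQN idea underlying DDATO can be shown in its textbook form: the online network selects the next action and the separate target network evaluates it, which reduces the overestimation bias of plain DQN. The numbers below are illustrative, not the paper's update:

```python
# Generic Double-DQN target computation (textbook form, not the exact
# DDATO update rule used in the paper).
def double_dqn_target(reward, q_online_next, q_target_next, gamma=0.9):
    # Online network picks the greedy next action...
    a_star = max(range(len(q_online_next)), key=lambda a: q_online_next[a])
    # ...but the target network supplies its value estimate.
    return reward + gamma * q_target_next[a_star]

q_online_next = [1.0, 3.0, 2.0]   # online-net Q(s', a) estimates
q_target_next = [0.5, 2.0, 4.0]   # target-net Q(s', a) estimates
print(double_dqn_target(1.0, q_online_next, q_target_next))
```

Note that plain DQN would have used max(q_target_next) = 4.0 here, giving a larger (over-optimistic) target.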
Funding: This work was supported in part by the Zhejiang Lab under Grant 20210AB02, in part by the Sichuan International Science and Technology Innovation Cooperation/Hong Kong, Macao and Taiwan Science and Technology Innovation Cooperation Project under Grant 2019YFH0163, and in part by the Key Research and Development Project of the Sichuan Provincial Department of Science and Technology under Grant 2018JZ0071.
Abstract: In order to solve the high latency of traditional cloud computing and the limited processing capacity of Internet of Things (IoT) users, Multi-access Edge Computing (MEC) migrates computing and storage capabilities from the remote data center to the edge of the network, providing users with computation services quickly and directly. In this paper, we investigate the impact of the randomness caused by the movement of the IoT user on offloading decisions, where the connection between the IoT user and the MEC servers is uncertain. This uncertainty is the main obstacle to assigning tasks accurately: if the assigned task does not match the real connection time, a migration (the connection time is not enough to process the task) is caused. To address the impact of this uncertainty, we formulate the offloading decision as an optimization problem considering transmission, computation, and migration. With the help of Stochastic Programming (SP), we use a posteriori recourse to compensate for inaccurate predictions. Meanwhile, in heterogeneous networks, considering that multiple candidate MEC servers can be selected simultaneously due to overlapping coverage, we also introduce Multi-Armed Bandit (MAB) theory for MEC selection. Extensive simulations validate the improvement and effectiveness of the proposed SP-based Multi-armed bandit Method (SMM) for offloading in terms of reward, cost, energy consumption, and delay. The results show that SMM achieves about a 20% improvement compared with the traditional offloading method that does not consider randomness, and it also outperforms existing SP/MAB-based offloading methods.
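A standard multi-armed-bandit rule such as UCB1 illustrates the MEC-selection idea: treat each candidate server as an arm and balance trying under-explored servers against exploiting the best one so far. This is a generic bandit sketch with invented reward rates, not the paper's exact SMM policy:

```python
# Illustrative UCB1 bandit for choosing among overlapping MEC servers.
import math, random

def ucb_select(counts, rewards, t):
    for arm, n in enumerate(counts):
        if n == 0:
            return arm                      # try every server once first
    return max(
        range(len(counts)),
        key=lambda a: rewards[a] / counts[a]
        + math.sqrt(2 * math.log(t) / counts[a]),
    )

random.seed(0)
true_means = [0.3, 0.7, 0.5]                # hidden per-server success rates
counts, rewards = [0] * 3, [0.0] * 3
for t in range(1, 2001):
    a = ucb_select(counts, rewards, t)
    counts[a] += 1
    rewards[a] += 1.0 if random.random() < true_means[a] else 0.0
print("pulls per server:", counts)          # the best server should dominate
```

In the paper's scheme, the bandit choice is additionally combined with SP recourse to hedge against connection-time mispredictions.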
Funding: Supported by the National Key Research and Development Program of China (No. 2020YFC1807903) and the Natural Science Foundation of Beijing Municipality (No. L192002).
Abstract: With the increased emphasis on data security in the Internet of Things (IoT), blockchain has received more and more attention. Due to the computation-intensive nature of blockchain, mobile edge computing (MEC) is integrated into the IoT. However, how to efficiently use edge computing resources to process the blockchain computing tasks of IoT devices has not been fully studied. In this paper, MEC- and blockchain-enhanced IoT is considered. Transactions recording data or other application information are generated by IoT devices and offloaded to MEC servers to join the blockchain. The practical Byzantine fault tolerance (PBFT) consensus mechanism is used among the MEC servers, which are also the blockchain nodes, and the latency of the consensus process is modeled with consideration of the characteristics of the wireless network. The joint optimization problem of serving base station (BS) selection and wireless transmission resource allocation is modeled as a Markov decision process (MDP), and the long-term system utility is defined based on task reward, credit value, the latency of the infrastructure layer and blockchain layer, and computing cost. A double deep Q-learning (DQN) based transaction offloading algorithm (DDQN-TOA) is proposed, and simulation results show the advantages of the proposed algorithm in comparison to other methods.
Funding: supported by the National Research Foundation of Korea (NRF) grant funded by the Korea Government (MSIT) (No. 2021R1C1C1013133); supported by the Institute of Information and Communications Technology Planning and Evaluation (IITP) grant funded by the Korea Government (MSIT) (RS-2022-00167197, Development of Intelligent 5G/6G Infrastructure Technology for The Smart City); supported by the Soonchunhyang University Research Fund.
Abstract: In many IIoT architectures, various devices connect to the edge cloud via gateway systems. For data processing, numerous data are delivered to the edge cloud. Delivering data to an appropriate edge cloud is critical for improving IIoT service efficiency. There are two types of costs for this kind of IoT network: a communication cost and a computing cost. For service efficiency, the communication cost of data transmission should be minimized, and the computing cost in the edge cloud should also be minimized. Therefore, in this paper, the communication cost for data transmission is defined as the delay factor, and the computing cost in the edge cloud is defined as the waiting time for the computing intensity. The proposed method selects the edge cloud that minimizes the total of the communication and computing costs; that is, a device chooses a routing path to the selected edge cloud based on these costs. The proposed method controls the data flows in a mesh-structured network and appropriately distributes the data processing load. The performance of the proposed method is validated through extensive computer simulation. When the transition probability from good to bad is 0.3 and the transition probability from bad to good is 0.7 in the wireless and edge cloud states, the proposed method reduces both the average delay and the service pause count to about 25% of those of the existing method.
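The core selection rule above (minimize communication delay plus computing wait time) reduces to a one-line argmin. The edge-cloud names and cost values below are invented for illustration:

```python
# Sketch of the selection idea: pick the edge cloud minimizing the sum of
# a communication-delay cost and a computing (queue-waiting) cost.
def choose_edge(clouds):
    """clouds: {name: (comm_delay, wait_time)} in the same time unit."""
    return min(clouds, key=lambda c: clouds[c][0] + clouds[c][1])

clouds = {"edge-A": (5.0, 12.0), "edge-B": (9.0, 4.0), "edge-C": (3.0, 20.0)}
print(choose_edge(clouds))  # edge-B: lowest total cost (13.0)
```

The full method additionally picks the routing path toward the chosen edge cloud and re-evaluates as the wireless and edge-cloud states change.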
Funding: supported by the Natural Science Foundation of China (No. 62171051).
Abstract: Puncturing has been recognized as a promising technology to cope with the coexistence problem of enhanced mobile broadband (eMBB) and ultra-reliable low-latency communications (URLLC) traffic. However, maintaining steady performance for eMBB traffic while meeting the requirements of URLLC traffic under puncturing is a major challenge in some realistic scenarios. In this paper, we pay attention to timely and energy-efficient processing of eMBB traffic in the industrial Internet of Things (IIoT), where mobile edge computing (MEC) is employed for data processing. Specifically, the performance of eMBB traffic and URLLC traffic in a MEC-based IIoT system is ensured by setting thresholds on tolerable delay and outage probability, respectively. Furthermore, considering the limited energy supply, an energy minimization problem for the eMBB device is formulated under the above constraints by jointly optimizing the resource blocks (RBs) punctured by URLLC traffic, data offloading, and the transmit power of the eMBB device. With Markov's inequality, the problem is reformulated by transforming the probabilistic outage constraint into a deterministic constraint. Meanwhile, an iterative energy minimization algorithm (IEMA) is proposed. Simulation results demonstrate that our algorithm achieves a significant reduction in the energy consumption of the eMBB device and a better overall effect compared to several benchmarks.
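The Markov-inequality step mentioned above is a standard bound; written with generic symbols (D for the random delay, d_max for the threshold, ε for the outage tolerance, all placeholders rather than the paper's notation), it works as follows:

```latex
% Markov's inequality: for a nonnegative random variable $D$ and $d_{\max} > 0$,
\Pr\{D \ge d_{\max}\} \le \frac{\mathbb{E}[D]}{d_{\max}}.
% Hence the probabilistic outage constraint
\Pr\{D \ge d_{\max}\} \le \varepsilon
% is guaranteed whenever the deterministic constraint
\mathbb{E}[D] \le \varepsilon \, d_{\max}
% holds, so the latter can replace the former in the optimization
% (at the price of being conservative, since the bound is not tight).
```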
Abstract: The Internet of Things (IoT) is emerging as an attractive paradigm involving physical perceptions, cyber interactions, social correlations and even cognitive thinking through a cyber-physical-social-thinking hyperspace. In this context, energy management with the purposes of energy saving and high efficiency is a challenging issue. In this work, a taxonomy model is established in reference to the IoT layers (i.e., sensor-actuator layer, network layer, and application layer), and IoT energy management is addressed from the perspectives of supply and demand to achieve green perception, communication, and computing. A smart home scenario is presented as a case study involving the main enabling technologies with supply-side, demand-side, and supply-demand balance considerations, and open issues in the field of IoT energy management are also discussed.