The rapid expansion of the Internet of Things (IoT) has introduced significant security challenges due to the scale, complexity, and heterogeneity of interconnected devices. Traditional centralized security models are inadequate against these threats, especially in decentralized applications where IoT devices may operate on minimal resources. Emerging technologies, including Artificial Intelligence (AI), blockchain, edge computing, and Zero-Trust Architecture (ZTA), offer potential solutions by enabling real-time threat detection, data integrity, and system resilience. AI offers sophisticated anomaly detection and predictive analytics, while blockchain delivers decentralized, tamper-proof assurance over device communication and information exchange. Edge computing enables low-latency processing by moving computational workloads closer to the devices. ZTA enhances security by continuously verifying each device and user on the network, adhering to the "never trust, always verify" principle. This paper reviews these technologies, examining how they are used to secure IoT ecosystems, the challenges of integrating them, and the prospects for a multi-layered, adaptive security architecture. Major concerns, such as scalability, resource limitations, and interoperability, are identified, and future directions for optimizing the application of AI, blockchain, and edge computing in zero-trust IoT systems are discussed.
The advancement of the Internet of Things (IoT) brings new opportunities for collecting real-time data and deploying machine learning models. Nonetheless, an individual IoT device may not have adequate computing resources to train and deploy an entire learning model. At the same time, transmitting continuous real-time data to a central server with high computing resources incurs enormous communication costs and raises data security and privacy issues. Federated learning, a distributed machine learning framework, is a promising solution for training machine learning models with resource-limited devices and edge servers. Yet, the majority of existing works assume an impractically synchronous parameter-update manner with homogeneous IoT nodes under stable communication connections. In this paper, we develop an asynchronous federated learning scheme to improve training efficiency for heterogeneous IoT devices over unstable communication networks. In particular, we formulate an asynchronous federated learning model and develop a lightweight node selection algorithm to carry out learning tasks effectively. The proposed algorithm iteratively selects heterogeneous IoT nodes to participate in the global aggregation while considering their local computing resources and communication conditions. Extensive experimental results demonstrate that our asynchronous federated learning scheme outperforms state-of-the-art schemes in various settings, on both independent and identically distributed (i.i.d.) and non-i.i.d. data.
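The core of such asynchronous schemes can be sketched as follows: late-arriving updates are mixed into the global model with a staleness-dependent weight, and nodes are selected by a lightweight score. The polynomial decay, the compute-times-channel score, and all parameter names here are illustrative assumptions, not the paper's actual algorithm.

```python
def staleness_weight(tau, a=0.5):
    """Down-weight stale updates: the weight decays polynomially with staleness tau."""
    return (tau + 1) ** (-a)

def async_aggregate(global_model, local_model, tau, base_lr=0.5):
    """Mix one late-arriving local model into the global model,
    scaled by a staleness-dependent learning rate."""
    lr = base_lr * staleness_weight(tau)
    return [(1 - lr) * g + lr * l for g, l in zip(global_model, local_model)]

def select_nodes(nodes, k):
    """Lightweight greedy selection: prefer nodes with high compute
    capacity and good channel quality (score = compute * channel)."""
    ranked = sorted(nodes, key=lambda n: n["compute"] * n["channel"], reverse=True)
    return ranked[:k]
```

A fresh update (tau = 0) is applied at the full base rate, while an update delayed by many rounds is damped toward zero, which is one common way to keep asynchronous aggregation stable.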
Pervasive IoT applications enable us to perceive, analyze, control, and optimize traditional physical systems. Recently, security breaches in many IoT applications have indicated that IoT applications may put physical systems at risk. Severe resource constraints and insufficient security design are two major causes of many security problems in IoT applications. As an extension of the cloud, the emerging edge computing paradigm, with its rich resources, provides a new venue for designing and deploying novel security solutions for IoT applications. Although there are some research efforts in this area, edge-based security designs for IoT applications are still in their infancy. This paper presents a comprehensive survey of existing IoT security solutions at the edge layer and aims to inspire more edge-based IoT security designs. We first present an edge-centric IoT architecture. Then, we extensively review edge-based IoT security research efforts in the context of security architecture designs, firewalls, intrusion detection systems, authentication and authorization protocols, and privacy-preserving mechanisms. Finally, we offer our insights into future research directions and open research issues.
Peer-to-peer computation offloading is a promising approach that enables resource-limited Internet of Things (IoT) devices to offload their computation-intensive tasks to idle peer devices in proximity. Unlike dedicated servers, the spare computation resources offered by peer devices are random and intermittent, which affects offloading performance. The mutual interference caused by multiple simultaneous offloading requestors sharing the same wireless channel further complicates the offloading decisions. In this work, we investigate the opportunistic peer-to-peer task offloading problem by jointly considering stochastic task arrivals, dynamic inter-user interference, and the opportunistic availability of peer devices. Each requestor decides both its local computation frequency and its offloading transmission power to minimize its own expected long-term task-completion cost, which accounts for energy consumption, task delay, and task loss due to buffer overflow. The dynamic decision process among multiple requestors is formulated as a stochastic game. By constructing post-decision states, a decentralized online offloading algorithm is proposed in which each requestor, as an independent learning agent, learns to approach the optimal strategy from its local observations. Simulation results under different system parameter configurations demonstrate that the proposed online algorithm achieves better performance than existing algorithms, especially in scenarios with a large task arrival probability or a small helper availability probability.
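The independent-learner idea can be illustrated with a minimal tabular agent that observes a local state and picks a (CPU frequency, transmit power) pair to minimize estimated long-term cost. The state, action set, and cost definitions below are illustrative placeholders, not the paper's post-decision-state formulation.

```python
import random
from collections import defaultdict

class OffloadAgent:
    """Independent learning agent sketch: observes its local state (e.g. queue
    length) and picks a (cpu_frequency, tx_power) pair to minimize long-term
    cost via tabular Q-learning. All definitions here are illustrative."""

    def __init__(self, actions, alpha=0.1, gamma=0.9, eps=0.1):
        self.q = defaultdict(float)  # Q[(state, action)] -> estimated cost
        self.actions = actions
        self.alpha, self.gamma, self.eps = alpha, gamma, eps

    def act(self, state):
        if random.random() < self.eps:           # explore occasionally
            return random.choice(self.actions)
        # exploit: pick the action with the smallest estimated cost
        return min(self.actions, key=lambda a: self.q[(state, a)])

    def learn(self, state, action, cost, next_state):
        # Cost-minimizing Q-learning update (min instead of max).
        best_next = min(self.q[(next_state, a)] for a in self.actions)
        target = cost + self.gamma * best_next
        self.q[(state, action)] += self.alpha * (target - self.q[(state, action)])
```

Each requestor would run its own agent on local observations only, which is what makes the scheme decentralized.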
Recently, the Internet of Things (IoT) has been applied widely and has improved the quality of daily life. However, lightweight IoT devices can hardly implement complicated applications, since they usually have limited computing resources and can execute only simple computation tasks. Moreover, data transmission and interaction are another crucial issue when IoT devices are deployed in remote areas without manual operation. Mobile edge computing (MEC) and unmanned aerial vehicles (UAVs) provide significant solutions to these problems. In addition, to ensure data security and privacy, blockchain has attracted great attention from both academia and industry. Therefore, a UAV-assisted IoT system integrated with MEC and blockchain is proposed. The optimization problem in the proposed architecture is formulated to achieve the optimal trade-off between energy consumption and computation latency by jointly considering the computation offloading decision, spectrum resource allocation, and computing resource allocation. Because this problem is a complicated non-convex mixed-integer problem, it is transformed into a convex problem, and a distributed algorithm based on the alternating direction method of multipliers (ADMM) is proposed. Simulation results demonstrate the validity of this scheme.
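The ADMM splitting idea can be shown on a toy problem: minimize (x-a)^2 + (z-b)^2 subject to x = z, alternating closed-form updates on each block with a scaled dual variable. This two-variable quadratic and the penalty parameter rho are illustrative assumptions, not the paper's actual offloading formulation.

```python
def admm_quadratic(a, b, rho=1.0, iters=100):
    """Toy ADMM: minimize (x - a)^2 + (z - b)^2 subject to x = z.
    Each step solves one convex block in closed form, mirroring how a
    split offloading problem alternates between per-block subproblems."""
    x = z = u = 0.0
    for _ in range(iters):
        x = (2 * a + rho * (z - u)) / (2 + rho)  # minimize over x, holding z and u
        z = (2 * b + rho * (x + u)) / (2 + rho)  # minimize over z, holding x and u
        u += x - z                               # scaled dual (multiplier) update
    return x, z
```

At convergence both blocks agree on the consensus value (a + b) / 2, and the dual variable u absorbs the disagreement between the two objectives.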
The emergence of different computing paradigms, such as cloud-, fog-, and edge-based Internet of Things (IoT) systems, has provided the opportunity to develop intelligent systems for disease detection. Compared to other machine learning models, deep learning models have gained more attention from the research community, as they show better results on large volumes of data than shallow learning. However, no comprehensive survey has been conducted on integrated IoT- and computing-based systems that deploy deep learning for disease detection. This study evaluated different machine learning and deep learning algorithms, along with their hybrid and optimized variants, for IoT-based disease detection, using the most recent papers on IoT-based disease detection systems that include computing approaches such as cloud, edge, and fog. The analysis focused on an IoT deep learning architecture suitable for disease detection. It also identifies the factors that require researchers' attention in order to develop better IoT disease detection systems. This study can be helpful to researchers interested in developing better IoT-based disease detection and prediction systems based on deep learning using hybrid algorithms.
Ubiquitous data monitoring and processing with minimal latency is one of the crucial challenges in real-time and scalable applications. The Internet of Things (IoT), fog computing, edge computing, cloud computing, and the edge of things form the spine of all real-time and scalable applications. Accordingly, this study proposes a novel framework for real-time and scalable applications that change dynamically with time. IoT deployment is recommended for data acquisition, and pre-processing of data with local edge and fog nodes is implemented. A threshold-oriented data classification method is deployed to improve the performance of the intrusion detection mechanism. Machine-learning-empowered intelligent algorithms are employed in a distributed manner to enhance the overall response rate of the layered framework. Placing respondent nodes near the framework's IoT layer minimizes network latency. For economic evaluation of the proposed framework with minimal effort, the EdgeCloudSim and FogNetSim++ simulation environments are deployed. The experimental results confirm the robustness of the proposed system through its improved threshold-oriented data classification and intrusion detection approach, improved response rate, and prediction mechanism. Moreover, the proposed layered framework provides a robust solution for real-time and scalable applications that change dynamically with time.
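A threshold-oriented classifier of the kind described above can be sketched in a few lines: a traffic record is flagged when any monitored feature exceeds its limit. The feature names and threshold values here are illustrative placeholders, not the paper's tuned parameters.

```python
def classify_record(features, thresholds):
    """Threshold-oriented classification sketch for intrusion detection:
    flag a traffic record as suspicious when any monitored feature
    exceeds its configured limit, and report which limits were violated."""
    violations = [name for name, limit in thresholds.items()
                  if features.get(name, 0) > limit]
    return ("suspicious", violations) if violations else ("normal", [])
```

Running such checks on edge or fog nodes keeps the decision close to the data source, which is what gives the layered framework its low response latency.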
With the increased emphasis on data security in the Internet of Things (IoT), blockchain has received more and more attention. Due to the compute-intensive nature of blockchain, mobile edge computing (MEC) is integrated into IoT. However, how to efficiently use edge computing resources to process blockchain computing tasks from IoT devices has not been fully studied. In this paper, a MEC- and blockchain-enhanced IoT system is considered. Transactions recording data or other application information are generated by the IoT devices and offloaded to the MEC servers to join the blockchain. The practical Byzantine fault tolerance (PBFT) consensus mechanism is used among the MEC servers, which also serve as blockchain nodes, and the latency of the consensus process is modeled with consideration of the characteristics of the wireless network. The joint optimization problem of serving base station (BS) selection and wireless transmission resource allocation is modeled as a Markov decision process (MDP), and the long-term system utility is defined based on task reward, credit value, the latency of the infrastructure and blockchain layers, and computing cost. A double deep Q-network (DQN) based transaction offloading algorithm (DDQN-TOA) is proposed, and simulation results show the advantages of the proposed algorithm over other methods.
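The distinguishing step of double DQN is how the learning target is formed: the online network selects the best next action, while the target network evaluates it, which reduces the overestimation bias of vanilla DQN. The sketch below uses plain lookup tables in place of neural networks; the state and action names are illustrative.

```python
def double_dqn_target(reward, next_state, q_online, q_target, actions, gamma=0.99):
    """Double-DQN target sketch: the online estimate picks the best next
    action, the (slower-moving) target estimate evaluates it."""
    best_action = max(actions, key=lambda a: q_online[(next_state, a)])
    return reward + gamma * q_target[(next_state, best_action)]
```

In a full DDQN-style offloading agent this target would drive a gradient step on the online network, with the target network periodically synchronized to it.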
Reliable communication and intensive computing power cannot be provided effectively by temporary hotspots in disaster areas or by ground infrastructure in complex terrain. Mitigating this problem has greatly advanced the application and integration of UAVs and Mobile Edge Computing (MEC) in the Internet of Things (IoT). However, problems such as serving many users and huge data flows over large areas remain, which conflict with the reality that a single UAV has limited computing power. Because UAV collaboration allows complex tasks to be accomplished, cooperative task offloading between multiple UAVs must respect the interdependence of tasks and realize parallel processing, which reduces the computing power consumption and endurance pressure of terminals. Considering the computing requirements of the user terminals, the delay constraint of a computing task, the energy constraint, and the safe distance between UAVs, we construct a UAV-assisted cooperative offloading energy-efficiency system for mobile edge computing that minimizes user terminal energy consumption. However, the resulting optimization problem is nonconvex and thus difficult to solve optimally. To tackle this, we develop an energy-efficiency optimization algorithm using Block Coordinate Descent (BCD) that decomposes the problem into three convex subproblems. Furthermore, we jointly optimize the number of local computing tasks, the number of offloaded computing tasks, the UAV trajectories, and the offloading matching relationship between multiple UAVs and multiple user terminals. Simulation results show that the proposed approach is suitable for different channel conditions and significantly reduces user terminal energy consumption compared with other benchmark schemes.
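Block Coordinate Descent can be shown on a toy convex objective: each step minimizes one block of variables exactly while the others are held fixed, and cycling the blocks converges to the joint minimizer. The two-variable objective below is an illustrative stand-in for the paper's three convex subproblems.

```python
def bcd_minimize(iters=60):
    """Block Coordinate Descent sketch on the toy convex objective
    f(x, y) = (x - 1)^2 + (y - 2)^2 + (x - y)^2. Each update is the
    exact minimizer of its own block with the other block frozen."""
    x = y = 0.0
    for _ in range(iters):
        x = (1 + y) / 2  # argmin over x: set df/dx = 2(x-1) + 2(x-y) = 0
        y = (2 + x) / 2  # argmin over y: set df/dy = 2(y-2) + 2(y-x) = 0
    return x, y
```

For this objective the iterates converge geometrically to the joint minimizer (4/3, 5/3); in the paper's setting each "block" would be one of the convex subproblems (local tasks, offloaded tasks, trajectories/matching).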
The lack of modern technology in healthcare has contributed to the deaths of thousands of people worldwide due to COVID-19 since its outbreak. The Internet of Things (IoT), along with other technologies like machine learning, can revolutionize the traditional healthcare system. Instead of reactive healthcare systems, IoT technology combined with machine learning and edge computing can deliver proactive and preventive healthcare services. In this study, a novel edge-assisted healthcare framework is proposed to detect and prognosticate COVID-19 suspects in the initial phases to stop the transmission of coronavirus infection. The proposed framework is based on edge computing to provide personalized healthcare facilities with minimal latency, short response time, and optimal energy consumption. A primary COVID-19 dataset is used for experimental purposes, employing various classification-based machine learning models. The proposed models were validated using k-fold cross-validation to ensure their consistency. Based on the experimental results, the proposed models recorded good accuracies, with the highest, 97.767%, achieved by a Support Vector Machine. According to the findings, the proposed conceptual model will aid in the early detection and prediction of COVID-19 suspects, as well as continuous monitoring of patients in order to provide emergency care in volatile medical situations.
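The k-fold validation step used above follows a standard recipe: partition the samples into k folds, train on k-1 of them, test on the held-out fold, and average the accuracies. The sketch below is a generic stdlib implementation, not the paper's pipeline; `train_fn` stands in for any classifier trainer (SVM or otherwise).

```python
def k_fold_indices(n, k):
    """Split indices 0..n-1 into k contiguous folds of near-equal size."""
    folds, start = [], 0
    for i in range(k):
        size = n // k + (1 if i < n % k else 0)
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cross_validate(data, labels, train_fn, k=5):
    """Return the mean held-out accuracy across k folds.
    train_fn(train_x, train_y) must return a predict(x) callable."""
    accs = []
    for fold in k_fold_indices(len(data), k):
        held_out = set(fold)
        tr_x = [x for i, x in enumerate(data) if i not in held_out]
        tr_y = [y for i, y in enumerate(labels) if i not in held_out]
        predict = train_fn(tr_x, tr_y)
        correct = sum(predict(data[i]) == labels[i] for i in fold)
        accs.append(correct / len(fold))
    return sum(accs) / k
```

Averaging over folds is what gives the "consistency" check the abstract mentions: a model that only memorizes one split will score poorly on the others.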
Nowadays, with the widespread application of the Internet of Things (IoT), mobile devices are transforming our lives, and the data they generate has reached a massive level. Traditional centralized processing is unsuitable for this data due to limited computing power and transmission load. Mobile Edge Computing (MEC) has been proposed to solve these problems: because of devices' limited computation ability and battery capacity, tasks can be executed on the MEC server. However, how to schedule those tasks becomes a challenge and is the main topic of this paper. We design an efficient intelligent algorithm to jointly optimize energy cost and computing resource allocation in MEC. In view of the advantages of deep learning, we propose a Deep Learning-Based Traffic Scheduling Approach (DLTSA), translating the scheduling problem into a classification problem. Evaluation demonstrates that DLTSA reduces energy cost and performs better than traditional scheduling algorithms.
The Internet of Things (IoT) has revolutionized how we interact with and gather data from our surrounding environment. IoT devices, equipped with various sensors and actuators, continuously generate vast amounts of data that can be harnessed to derive valuable insights. However, the conventional approach of transmitting all this data to centralized cloud infrastructures for processing and analysis can be inefficient and impractical due to bandwidth limitations, network latency, and scalability issues. This paper proposes a Self-Learning Internet Traffic Fuzzy Classifier (SLItFC) for traffic data analysis. The proposed technique effectively combines clustering and classification procedures to improve accuracy in analyzing network traffic data. SLItFC addresses the intricate task of efficiently managing and analyzing IoT data traffic at the edge. It employs a combination of fuzzy clustering and self-learning techniques, allowing it to adapt and improve its classification accuracy over time. This adaptability is a crucial feature, given the dynamic nature of IoT environments, where data patterns and traffic characteristics can evolve rapidly. With the fuzzy classifier, the accuracy of the clustering process is improved while computational time is reduced. This efficiency is paramount in edge computing, where resource constraints demand streamlined data processing, and it makes SLItFC a compelling choice for organizations seeking to harness IoT data for real-time insights and decision-making. Through its self-learning process, the SLItFC model monitors the network traffic data acquired from the IoT devices, and the Sugeno fuzzy model is implemented within the edge computing environment for improved classification accuracy. Simulation analysis shows that the proposed SLItFC achieves 94.5% classification accuracy with reduced classification time.
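A zero-order Sugeno fuzzy model of the kind named above computes its output as the firing-strength-weighted average of constant rule consequents. The sketch below maps a normalized traffic rate to a congestion score; the triangular membership shapes and rule outputs are illustrative assumptions, not SLItFC's actual rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def sugeno_congestion(rate):
    """Zero-order Sugeno inference sketch: the crisp output is the
    firing-strength-weighted average of constant rule outputs."""
    rules = [
        (tri(rate, -0.5, 0.0, 0.5), 0.0),  # IF rate is LOW  THEN score = 0.0
        (tri(rate,  0.0, 0.5, 1.0), 0.5),  # IF rate is MED  THEN score = 0.5
        (tri(rate,  0.5, 1.0, 1.5), 1.0),  # IF rate is HIGH THEN score = 1.0
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

Between rule peaks the output interpolates smoothly, which is why Sugeno models are popular for fast, resource-light inference at the edge.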
As the Internet of Things (IoT) and mobile devices have rapidly proliferated, their computationally intensive applications have developed into complex, concurrent IoT-based workflows involving multiple interdependent tasks. By exploiting its low latency and high bandwidth, mobile edge computing (MEC) has emerged to achieve high-performance computation offloading of these applications, satisfying the quality-of-service requirements of workflows and devices. In this study, we propose an offloading strategy for IoT-based workflows in a high-performance MEC environment. The proposed task-based offloading strategy is formulated as an optimization problem that accounts for task dependency, communication costs, workflow constraints, device energy consumption, and the heterogeneous characteristics of the edge environment. The optimal placement of workflow tasks is then found using a discrete teaching-learning-based optimization (DTLBO) metaheuristic. Extensive experimental evaluations demonstrate that the proposed offloading strategy is effective at minimizing the energy consumption of mobile devices and reducing workflow execution times compared to offloading strategies based on other metaheuristics, including particle swarm optimization and ant colony optimization.
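The TLBO family underlying DTLBO alternates two phases: a teacher phase that pulls each learner toward the best solution and away from the population mean, and a learner phase where pairs of learners exchange information, with greedy acceptance of improving moves. The continuous sketch below minimizes a toy sphere function; the paper's discrete variant would additionally map positions to a task-placement encoding, and all parameters here are illustrative.

```python
import random

def tlbo_minimize(f, dim=3, pop=10, iters=100, lo=-5.0, hi=5.0, seed=1):
    """Teaching-Learning-Based Optimization sketch (continuous variant)."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    for _ in range(iters):
        best = min(X, key=f)
        mean = [sum(x[d] for x in X) / pop for d in range(dim)]
        for i in range(pop):
            # Teacher phase: move toward the teacher (best), away from the mean.
            tf = rng.choice([1, 2])  # teaching factor
            cand = [X[i][d] + rng.random() * (best[d] - tf * mean[d])
                    for d in range(dim)]
            if f(cand) < f(X[i]):    # greedy acceptance
                X[i] = cand
            # Learner phase: learn from a random peer (toward if better, away if worse).
            j = rng.randrange(pop)
            if j != i:
                sign = 1 if f(X[j]) < f(X[i]) else -1
                cand = [X[i][d] + rng.random() * sign * (X[j][d] - X[i][d])
                        for d in range(dim)]
                if f(cand) < f(X[i]):
                    X[i] = cand
    return min(X, key=f)

sphere = lambda x: sum(v * v for v in x)
```

Because every move is accepted only if it improves the objective, the best solution is monotonically non-worsening, which makes the method easy to hybridize with discrete encodings.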
The Internet of Things (IoT) is ubiquitous, comprising objects and devices that communicate through heterogeneous wireless networks. One of the major challenges in mobile IoT is an efficient vertical handover decision (VHD) technique between heterogeneous networks for seamless connectivity with constrained resources. The conventional VHD approach is based mainly on received signal strength (RSS). This approach is inefficient for vertical handover, since it always selects the target network with the strongest signal without considering factors such as quality of service (QoS), cost, and delay. In this paper, we present a hybrid approach that integrates a multi-criteria-based VHD (MCVHD) technique with a fuzzy-logic algorithm for efficient VHD among Wi-Fi, radio, and satellite networks. MCVHD provides a lightweight solution that aims to achieve seamless connectivity for a mobile IoT edge gateway over a set of heterogeneous networks. The proposed solution is evaluated in real time using a testbed containing real IoT devices, integrated with lightweight and efficient software techniques, e.g., microservices, containers, brokers, and edge/cloud techniques. The experimental results show that the proposed approach is suitable for an IoT environment and outperforms conventional RSS-quality-based VHD by minimizing handover failures, unnecessary handovers, handover time, and cost of service.
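The multi-criteria idea can be sketched as a weighted score over normalized criteria, where benefit criteria (signal, QoS) add and cost criteria (price, delay) subtract; the candidate with the highest score wins the handover instead of the one with the strongest raw RSS. The criteria names and weights below are illustrative, not MCVHD's actual fuzzy rule base.

```python
def mcvhd_score(network, weights):
    """Multi-criteria handover score sketch: weighted sum of normalized
    criteria in [0, 1]. Benefit criteria add; cost criteria subtract."""
    benefit = weights["rss"] * network["rss"] + weights["qos"] * network["qos"]
    penalty = weights["cost"] * network["cost"] + weights["delay"] * network["delay"]
    return benefit - penalty

def select_network(candidates, weights):
    """Hand over to the highest-scoring candidate rather than the
    raw strongest-RSS network."""
    return max(candidates, key=lambda n: mcvhd_score(n, weights))
```

In the example below the Wi-Fi candidate has the stronger signal, yet the radio network wins on cost, delay, and QoS, which is exactly the situation where an RSS-only policy triggers a poor handover.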
To address the challenges of large-scale distributed battery management in real-time monitoring, data fusion, and multi-vendor compatibility, a solution based on Narrow Band Internet of Things (NB-IoT) technology is introduced. By building a low-power NB-IoT monitoring network, an edge-intelligent control mechanism, and a standardized access platform, efficient access, state awareness, and unified management of large-scale heterogeneous batteries are achieved. Application results in scenarios such as data centers and smart grids show that the NB-IoT solution significantly improves data transmission efficiency, monitoring response speed, and fault early-warning capability, providing strong support for the intelligent operation and maintenance of distributed energy storage facilities.
The rapid expansion of the Internet of Things (IoT) has driven the need for advanced computational frameworks capable of handling the complex data processing and security challenges that modern IoT applications demand. However, traditional cloud computing frameworks face significant latency, scalability, and security issues. Quantum-Edge Cloud Computing (QECC) offers an innovative solution by integrating the computational power of quantum computing with the low-latency advantages of edge computing and the scalability of cloud computing resources. This study is grounded in an extensive literature review, performance improvements, and metrics data from Bangladesh, focusing on smart city infrastructure, healthcare monitoring, and the industrial IoT sector. The discussion covers vital elements, including integrating quantum cryptography to enhance data security, the critical role of edge computing in reducing response times, and cloud computing’s ability to support large-scale IoT networks with its extensive resources. Through case studies such as the application of quantum sensors in autonomous vehicles, the practical impact of QECC is demonstrated. Additionally, the paper outlines future research opportunities, including developing quantum-resistant encryption techniques and optimizing quantum algorithms for edge computing. The convergence of these technologies in QECC has the potential to overcome the current limitations of IoT frameworks, setting a new standard for future IoT applications.
The Metaverse, envisioned as the next evolution of the Internet, is expected to develop into an innovative medium advancing information civilization. Its core characteristics, including ubiquity, seamlessness, immersion, interoperability, and meta-spatiotemporality, are catalyzing the development of multiple technologies and fostering a convergence between the physical and virtual worlds. Despite its potential, the critical concept of symbiosis, which involves the synchronous generation and management of virtuality from reality and serves as the cornerstone of this convergence, is often overlooked. Additionally, cumbersome service designs, stemming from the intricate interplay of various technologies and inefficient resource utilization, are impeding an ideal Metaverse ecosystem. To address these challenges, we propose a bi-model Parallel Symbiotic Metaverse (PSM) system, engineered with a Cybertwin-enabled 6G framework in which Cybertwins mirror Sensing Devices (SDs) and serve a bridging role as autonomous agents. Based on this framework, the system is structured into two models. In the queue model, SDs capture environmental data that Cybertwins then coordinate and schedule. In the service model, Cybertwins manage service requests and collaborate with SDs to make responsive decisions. We incorporate two algorithms to address resource scheduling and virtual service responses, showcasing the synergistic role of Cybertwins. Moreover, our PSM system advocates the participation of SDs from collaborators, enhancing performance while reducing operational costs for the Virtual Service Operator (VSO). Finally, we comparatively analyze the efficiency and complexity of the proposed algorithms and demonstrate the efficacy of the PSM system across multiple performance indicators. The results indicate that our system can be deployed cost-effectively with Cybertwin-enabled 6G.
Funding: supported by the Deanship of Graduate Studies and Scientific Research at Qassim University (QU-APC-2025).
Funding: supported by the National Science Foundation (grant #1723596) and the National Security Agency (grant #H98230-17-1-0355).
Funding: Supported by the National Natural Science Foundation of China (No. 62101601).
Abstract: Peer-to-peer computation offloading has been a promising approach that enables resource-limited Internet of Things (IoT) devices to offload their computation-intensive tasks to idle peer devices in proximity. Unlike dedicated servers, the spare computation resources offered by peer devices are random and intermittent, which affects offloading performance. The mutual interference caused by multiple simultaneous offloading requestors sharing the same wireless channel further complicates the offloading decisions. In this work, we investigate the opportunistic peer-to-peer task offloading problem by jointly considering stochastic task arrivals, dynamic inter-user interference, and the opportunistic availability of peer devices. Each requestor makes decisions on both local computation frequency and offloading transmission power to minimize its own expected long-term task completion cost, which takes into consideration its energy consumption, task delay, and task loss due to buffer overflow. The dynamic decision process among multiple requestors is formulated as a stochastic game. By constructing post-decision states, a decentralized online offloading algorithm is proposed, in which each requestor, as an independent learning agent, learns to approach the optimal strategies using only its local observations. Simulation results under different system parameter configurations demonstrate that the proposed online algorithm achieves better performance than existing algorithms, especially in scenarios with a large task arrival probability or a small helper availability probability.
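The flavor of the decentralized online learning described here, an agent learning offloading decisions from local observations under opportunistic helper availability, can be illustrated with a toy tabular Q-learning loop. This is a hypothetical miniature, not the paper's post-decision-state construction: the state space (a 3-level backlog), the cost numbers, and the helper-availability probability are all invented for illustration.

```python
import random

# Toy decentralized learner: one requestor learns local-vs-offload decisions
# from its own observations (illustrative sketch only).
random.seed(0)
ACTIONS = ["local", "offload"]
Q = {(s, a): 0.0 for s in range(3) for a in ACTIONS}

def cost(state, action, helper_free):
    # energy + delay penalty (made-up numbers); a failed offload is expensive
    if action == "offload":
        return 1.0 if helper_free else 3.0
    return 2.5 + state  # local compute cost grows with backlog

def step(state, action, helper_free):
    success = action == "local" or helper_free
    nxt = max(0, state - 1) if success else min(2, state + 1)  # buffer dynamics
    return cost(state, action, helper_free), nxt

alpha, gamma, eps = 0.2, 0.9, 0.1
state = 0
for t in range(2000):
    helper_free = random.random() < 0.6  # opportunistic helper availability
    if random.random() < eps:
        a = random.choice(ACTIONS)
    else:
        a = min(ACTIONS, key=lambda x: Q[(state, x)])  # minimize expected cost
    c, nxt = step(state, a, helper_free)
    best_next = min(Q[(nxt, x)] for x in ACTIONS)
    Q[(state, a)] += alpha * (c + gamma * best_next - Q[(state, a)])
    state = nxt

print(min(ACTIONS, key=lambda a: Q[(0, a)]))
```

With these toy numbers, offloading has a lower expected cost (0.6 x 1.0 + 0.4 x 3.0 = 1.8 versus 2.5 for local at an empty buffer), so the learned greedy policy tends toward offloading when the helper is frequently available.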
Funding: Supported by the National Natural Science Foundation of China (Nos. 61901011, 61901067), the Foundation of the Beijing Municipal Commission of Education (Nos. KM202110005021, KM202010005017), and the Beijing Natural Science Foundation (No. L211002).
Abstract: Recently, the Internet of Things (IoT) has been applied widely and has improved the quality of daily life. However, lightweight IoT devices can hardly implement complicated applications, since they usually have limited computing resources and can execute only simple computation tasks. Moreover, data transmission and interaction in the IoT become crucial issues when IoT devices are deployed in remote areas without manual operation. Mobile edge computing (MEC) and unmanned aerial vehicles (UAVs) provide significant solutions to these problems. In addition, to ensure the security and privacy of data, blockchain has attracted great attention from both academia and industry. Therefore, a UAV-assisted IoT system integrated with MEC and blockchain is proposed. The optimization problem in the proposed architecture is formulated to achieve the optimal trade-off between energy consumption and computation latency by jointly considering the computation offloading decision, spectrum resource allocation, and computing resource allocation. For this complicated optimization problem, the non-convex mixed-integer problem is transformed into a convex problem, and a distributed algorithm based on the alternating direction method of multipliers (ADMM) is proposed. Simulation results demonstrate the validity of this scheme.
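The ADMM machinery this abstract invokes alternates between minimizing over each variable block and a dual update. The paper's problem (joint offloading and resource allocation) is far richer, so the following is only a minimal sketch of scaled-form ADMM on a toy consensus splitting, with an objective chosen here purely for illustration.

```python
# Scaled-form ADMM on: minimize (x-a)^2/2 + (z-b)^2/2  subject to  x = z.
# Each update below is the closed-form argmin of the augmented Lagrangian.

def admm(a, b, rho=1.0, iters=100):
    x = z = u = 0.0
    for _ in range(iters):
        x = (a + rho * (z - u)) / (1 + rho)   # x-update (z, u fixed)
        z = (b + rho * (x + u)) / (1 + rho)   # z-update (x, u fixed)
        u += x - z                            # scaled dual update
    return x, z

x, z = admm(a=1.0, b=3.0)
print(x, z)  # both converge to the consensus optimum (a+b)/2 = 2.0
```

The appeal for edge systems is that the x- and z-updates can be computed by different parties (e.g., devices and servers) that only exchange the coupling variables, which is what makes the algorithm "distributed" in the paper's sense.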
Abstract: The emergence of different computing paradigms, such as cloud-, fog-, and edge-based Internet of Things (IoT) systems, has provided the opportunity to develop intelligent systems for disease detection. Compared to other machine learning models, deep learning models have gained more attention from the research community, as they have shown better results with large volumes of data than shallow learning. However, no comprehensive survey has been conducted on integrated IoT- and computing-based systems that deploy deep learning for disease detection. This study evaluated different machine learning and deep learning algorithms, as well as their hybrid and optimized variants, for IoT-based disease detection, using the most recent papers on IoT-based disease detection systems that include computing approaches such as cloud, edge, and fog. The analysis focused on an IoT deep learning architecture suitable for disease detection. It also identifies the different factors that require the attention of researchers to develop better IoT disease detection systems. This study can be helpful to researchers interested in developing better IoT-based disease detection and prediction systems based on deep learning using hybrid algorithms.
Abstract: Ubiquitous data monitoring and processing with minimal latency is one of the crucial challenges for real-time and scalable applications. The Internet of Things (IoT), fog computing, edge computing, cloud computing, and the edge of things form the spine of all real-time and scalable applications. Accordingly, this study proposes a novel framework for real-time and scalable applications that change dynamically with time. IoT deployment is recommended for data acquisition, and data pre-processing with local edge and fog nodes is implemented. A threshold-oriented data classification method is deployed to improve the performance of the intrusion detection mechanism. Machine learning-empowered intelligent algorithms are employed in a distributed manner to enhance the overall response rate of the layered framework. The placement of respondent nodes near the framework's IoT layer minimizes network latency. For economic evaluation of the proposed framework with minimal effort, the EdgeCloudSim and FogNetSim++ simulation environments are used. The experimental results confirm the robustness of the proposed system through its improved threshold-oriented data classification and intrusion detection approach, improved response rate, and prediction mechanism. Moreover, the proposed layered framework provides a robust solution for real-time and scalable applications that change dynamically with time.
Funding: Supported by the National Key Research and Development Program of China (No. 2020YFC1807903) and the Natural Science Foundation of Beijing Municipality (No. L192002).
Abstract: With the increased emphasis on data security in the Internet of Things (IoT), blockchain has received more and more attention. Due to the compute-intensive characteristics of blockchain, mobile edge computing (MEC) has been integrated into the IoT. However, how to efficiently use edge computing resources to process the blockchain computing tasks of IoT devices has not been fully studied. In this paper, a MEC- and blockchain-enhanced IoT is considered. Transactions recording data or other application information are generated by IoT devices and offloaded to MEC servers to join the blockchain. The practical Byzantine fault tolerance (PBFT) consensus mechanism is used among the MEC servers, which also serve as blockchain nodes, and the latency of the consensus process is modeled with consideration of the characteristics of the wireless network. The joint optimization of serving base station (BS) selection and wireless transmission resource allocation is modeled as a Markov decision process (MDP), and the long-term system utility is defined based on the task reward, credit value, latency of the infrastructure and blockchain layers, and computing cost. A double deep Q-learning (DQN) based transaction offloading algorithm (DDQN-TOA) is proposed, and simulation results show the advantages of the proposed algorithm over other methods.
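The "double" in DDQN refers to decoupling action selection from action evaluation to curb overestimation. The paper trains deep networks; what follows is only the update rule in tabular miniature, with a two-station toy MDP whose states, reward numbers, and transition model are all invented here for illustration.

```python
import random

# Tabular double Q-learning sketch of a base-station selection decision.
random.seed(1)
STATES, ACTIONS = [0, 1], ["bs_a", "bs_b"]
QA = {(s, a): 0.0 for s in STATES for a in ACTIONS}
QB = {(s, a): 0.0 for s in STATES for a in ACTIONS}

def reward(s, a):
    # toy utility = task reward minus latency/computing cost, plus noise
    return (1.5 if a == "bs_a" else 1.0) + random.gauss(0, 0.1)

gamma, alpha = 0.9, 0.1
s = 0
for t in range(3000):
    if random.random() < 0.1:                       # light exploration
        a = random.choice(ACTIONS)
    else:                                           # act on the summed estimate
        a = max(ACTIONS, key=lambda x: QA[(s, x)] + QB[(s, x)])
    r, s2 = reward(s, a), random.choice(STATES)
    if random.random() < 0.5:    # update QA: select with QA, evaluate with QB
        a_star = max(ACTIONS, key=lambda x: QA[(s2, x)])
        QA[(s, a)] += alpha * (r + gamma * QB[(s2, a_star)] - QA[(s, a)])
    else:                        # symmetric update for QB
        a_star = max(ACTIONS, key=lambda x: QB[(s2, x)])
        QB[(s, a)] += alpha * (r + gamma * QA[(s2, a_star)] - QB[(s, a)])
    s = s2

best = max(ACTIONS, key=lambda a: QA[(0, a)] + QB[(0, a)])
print(best)
```

Because `bs_a` yields the higher mean utility, the summed estimate should come to prefer it; in DDQN-TOA the two tables are replaced by an online network and a target network over a much richer state.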
Funding: Supported by the Jiangsu Provincial Key Research and Development Program (No. BE2020084-4), the National Natural Science Foundation of China (Nos. 92067201 and 61871446), the Open Research Fund of the Jiangsu Key Laboratory of Wireless Communications (No. 710020017002), and the Natural Science Foundation of Nanjing University of Posts and Telecommunications (No. NY220047).
Abstract: Reliable communication and intensive computing power cannot be provided effectively by temporary hot spots in disaster areas or by ground infrastructure in complex terrain. Mitigating this need has greatly advanced the application and integration of UAVs and mobile edge computing (MEC) into the Internet of Things (IoT). However, problems remain, such as serving many users and huge data flows over large areas, which conflict with the reality that a single UAV is constrained by limited computing power. Because UAV collaboration allows complex tasks to be accomplished, cooperative task offloading between multiple UAVs must respect the interdependence of tasks and realize parallel processing, which reduces the computing power consumption and endurance pressure of terminals. Considering the computing requirements of user terminals, the delay constraints of computing tasks, energy constraints, and the safe separation distance of UAVs, we constructed a UAV-assisted cooperative offloading energy-efficiency system for mobile edge computing that minimizes user terminal energy consumption. The resulting optimization problem is non-convex and thus difficult to solve optimally. To tackle this problem, we developed an energy-efficiency optimization algorithm using block coordinate descent (BCD) that decomposes the problem into three convex subproblems. Furthermore, we jointly optimized the number of local computing tasks, the number of offloaded computing tasks, the trajectories of the UAVs, and the offloading matching relationship between multiple UAVs and multiple user terminals. Simulation results show that the proposed approach is suitable for different channel conditions and significantly reduces user terminal energy consumption compared with other benchmark schemes.
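Block coordinate descent, the decomposition tool named above, fixes all variable blocks but one and minimizes over that block in turn. The paper's three subproblems involve trajectories and matching; the toy two-block quadratic below is an assumption made purely to show the mechanics.

```python
# BCD in miniature on f(x, y) = (x-1)^2 + (y-2)^2 + 0.5*x*y (convex, coupled).
# Each block update is the closed-form argmin with the other block held fixed.

def bcd(iters=50):
    x, y = 0.0, 0.0
    for _ in range(iters):
        x = 1.0 - 0.25 * y   # block 1: df/dx = 2(x-1) + 0.5y = 0
        y = 2.0 - 0.25 * x   # block 2: df/dy = 2(y-2) + 0.5x = 0
    return x, y

x, y = bcd()
print(x, y)  # converges to the joint minimizer (8/15, 28/15)
```

Convergence here is geometric because each sweep contracts the error; in the paper each "block" is an entire convex subproblem (task split, trajectories, matching) solved per iteration.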
Funding: The authors would like to thank SKIMS (Sher-i-Kashmir Institute of Medical Sciences) for permitting us to collect the COVID-19 data from various departments.
Abstract: The lack of modern technology in healthcare has contributed to the loss of thousands of lives worldwide to COVID-19 since its outbreak. The Internet of Things (IoT), along with technologies such as machine learning, can revolutionize the traditional healthcare system. Instead of reactive healthcare, IoT technology combined with machine learning and edge computing can deliver proactive and preventive healthcare services. In this study, a novel edge-assisted healthcare framework is proposed to detect and prognosticate COVID-19 suspects in the initial phases and stop the transmission of coronavirus infection. The proposed framework is based on edge computing to provide personalized healthcare with minimal latency, short response times, and optimal energy consumption. A primary COVID-19 dataset is used for the experiments, employing various classification-based machine learning models. The proposed models were validated using k-fold cross-validation to ensure their consistency. Based on the experimental results, the proposed models achieved good accuracies, the highest being 97.767% for the Support Vector Machine. According to the findings, the proposed conceptual model will aid in the early detection and prediction of COVID-19 suspects, as well as continuous patient monitoring to provide emergency care in volatile medical situations.
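The k-fold validation protocol mentioned above is simple to state in code. This is a generic pure-Python sketch: the trivial nearest-centroid classifier and the synthetic one-feature "readings" below are stand-ins invented here, not the paper's SVM or its COVID-19 dataset.

```python
# Minimal k-fold cross-validation loop around a toy nearest-centroid classifier.

def nearest_centroid_fit(xs, ys):
    c0 = sum(x for x, y in zip(xs, ys) if y == 0) / ys.count(0)
    c1 = sum(x for x, y in zip(xs, ys) if y == 1) / ys.count(1)
    return c0, c1

def nearest_centroid_predict(model, x):
    c0, c1 = model
    return 0 if abs(x - c0) <= abs(x - c1) else 1

def k_fold_accuracy(xs, ys, k=5):
    n, accs = len(xs), []
    for i in range(k):
        test_idx = set(range(i * n // k, (i + 1) * n // k))  # fold i held out
        tr_x = [x for j, x in enumerate(xs) if j not in test_idx]
        tr_y = [y for j, y in enumerate(ys) if j not in test_idx]
        model = nearest_centroid_fit(tr_x, tr_y)
        hits = sum(nearest_centroid_predict(model, xs[j]) == ys[j] for j in test_idx)
        accs.append(hits / len(test_idx))
    return sum(accs) / k  # mean held-out accuracy over all folds

# well-separated synthetic readings; label 1 plays the role of "suspect"
xs = [0.1, 0.2, 0.3, 0.4, 0.5, 5.1, 5.2, 5.3, 5.4, 5.5]
ys = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
print(k_fold_accuracy(xs, ys))
```

Reporting the mean over held-out folds, rather than training accuracy, is what "ensures consistency" in the abstract's sense: every sample is scored by a model that never saw it.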
Funding: Supported in part by the National Natural Science Foundation of China (No. 61902029), the R&D Program of the Beijing Municipal Education Commission (No. KM202011232015), and the Project for Acceleration of University Classification Development (Nos. 5112211036, 5112211037, 5112211038).
Abstract: Nowadays, with the widespread application of the Internet of Things (IoT), mobile devices are transforming our lives, and the data they generate has reached a massive level. Traditional centralized processing is not suitable for handling this data due to limited computing power and transmission load, and mobile edge computing (MEC) has been proposed to solve these problems: given devices' limited computation ability and battery capacity, tasks can be executed on the MEC server. However, how to schedule those tasks becomes a challenge and is the main topic of this paper. We design an efficient intelligent algorithm to jointly optimize energy cost and computing resource allocation in MEC. In view of the advantages of deep learning, we propose a Deep Learning-Based Traffic Scheduling Approach (DLTSA), which translates the scheduling problem into a classification problem. Evaluation demonstrates that DLTSA reduces energy cost and performs better than traditional scheduling algorithms.
Funding: This research is funded by the 2023 Henan Province Science and Technology Research Projects: Key Technology of Rapid Urban Flood Forecasting Based on Water Level Feature Analysis and Spatio-Temporal Deep Learning (No. 232102320015), the Henan Provincial Higher Education Key Research Project Program (No. 23B520024), and a Multi-Sensor-Based Indoor Environmental Parameters Monitoring and Control System.
Abstract: The Internet of Things (IoT) has revolutionized how we interact with and gather data from our surrounding environment. IoT devices, equipped with many sensors and actuators, continuously generate vast volumes of data that can be harnessed to derive valuable insights. However, transmitting all this data to a centralized cloud infrastructure for processing and analysis is inefficient and often impractical due to bandwidth limitations, network latency, and scalability issues. This paper proposes a Self-Learning Internet Traffic Fuzzy Classifier (SLItFC) for traffic data analysis, which combines clustering and classification procedures to improve accuracy in analyzing network traffic data. SLItFC addresses the intricate task of efficiently managing and analyzing IoT data traffic at the edge. It employs a combination of fuzzy clustering and self-learning techniques, allowing it to adapt and improve its classification accuracy over time. This adaptability is crucial, given the dynamic nature of IoT environments, where data patterns and traffic characteristics can evolve rapidly. With the fuzzy classifier, clustering accuracy is improved while computational time is reduced. This efficiency is paramount in edge computing, where resource constraints demand streamlined data processing, and it makes SLItFC a compelling choice for organizations seeking to harness IoT data for real-time insights and decision-making. Through its self-learning process, SLItFC monitors the network traffic data acquired from IoT devices, and a Sugeno fuzzy model is implemented within the edge computing environment for improved classification accuracy. Simulation analysis shows that the proposed SLItFC achieves 94.5% classification accuracy with reduced classification time.
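The fuzzy clustering that SLItFC builds on assigns each sample a graded membership in every cluster rather than a hard label. The sketch below is generic one-dimensional fuzzy c-means, not the paper's Sugeno-model classifier; the data, the two-center initialization, and the fuzzifier m = 2 are assumptions made here for illustration.

```python
# One-dimensional fuzzy c-means: soft memberships u_i(x) drive the center update.

def fcm(data, m=2.0, iters=50):
    centers = [min(data), max(data)]   # simple two-cluster initialisation
    c = len(centers)
    for _ in range(iters):
        U = []
        for x in data:
            # membership of x in cluster i: u_i = 1 / sum_j (d_i/d_j)^(2/(m-1))
            d = [max(abs(x - v), 1e-9) for v in centers]
            U.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0)) for j in range(c))
                      for i in range(c)])
        # center update: membership-weighted mean of all points
        centers = [
            sum(U[k][i] ** m * data[k] for k in range(len(data))) /
            sum(U[k][i] ** m for k in range(len(data)))
            for i in range(c)
        ]
    return sorted(centers)

centers = fcm([0.9, 1.0, 1.1, 4.9, 5.0, 5.1])
print(centers)
```

The soft memberships are what a downstream fuzzy classifier can consume directly: a borderline traffic sample contributes partially to several classes instead of being forced into one.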
Abstract: As the Internet of Things (IoT) and mobile devices have rapidly proliferated, their computationally intensive applications have developed into complex, concurrent IoT-based workflows involving multiple interdependent tasks. By exploiting its low latency and high bandwidth, mobile edge computing (MEC) has emerged to achieve high-performance computation offloading of these applications, satisfying the quality-of-service requirements of workflows and devices. In this study, we propose an offloading strategy for IoT-based workflows in a high-performance MEC environment. The proposed task-based offloading strategy is formulated as an optimization problem that accounts for task dependency, communication costs, workflow constraints, device energy consumption, and the heterogeneous characteristics of the edge environment. The optimal placement of workflow tasks is then found using a discrete teaching-learning-based optimization (DTLBO) metaheuristic. Extensive experimental evaluations demonstrate that the proposed offloading strategy is effective at minimizing the energy consumption of mobile devices and reducing the execution times of workflows compared to offloading strategies based on other metaheuristics, including particle swarm optimization and ant colony optimization.
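TLBO, the metaheuristic family named above, alternates a "teacher phase" (learners move toward the best solution, away from the population mean) and a "learner phase" (learners exchange information pairwise). The paper uses a discrete variant for task placement; the continuous textbook form below, minimizing a toy sphere function, only illustrates the underlying mechanics.

```python
import random

# Continuous TLBO sketch minimizing f(x) = sum(x_i^2); toy problem, not DTLBO.
random.seed(42)

def sphere(x):
    return sum(v * v for v in x)

def tlbo(dim=2, pop=10, iters=100):
    X = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    for _ in range(iters):
        teacher = min(X, key=sphere)                      # best learner so far
        mean = [sum(x[d] for x in X) / pop for d in range(dim)]
        for i in range(pop):
            # teacher phase: pull toward the teacher, away from TF * mean
            tf = random.choice([1, 2])
            cand = [X[i][d] + random.random() * (teacher[d] - tf * mean[d])
                    for d in range(dim)]
            if sphere(cand) < sphere(X[i]):               # greedy acceptance
                X[i] = cand
            # learner phase: move toward a better peer, away from a worse one
            j = random.randrange(pop)
            sign = 1 if sphere(X[j]) < sphere(X[i]) else -1
            cand = [X[i][d] + random.random() * sign * (X[j][d] - X[i][d])
                    for d in range(dim)]
            if sphere(cand) < sphere(X[i]):
                X[i] = cand
    return min(X, key=sphere)

best = tlbo()
print(sphere(best))
```

A discrete variant replaces the arithmetic moves with placement-swap operators over task-to-server assignments while keeping the same teacher/learner structure.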
Abstract: The Internet of Things (IoT) is ubiquitous, comprising objects and devices communicating through heterogeneous wireless networks. One of the major challenges in mobile IoT is an efficient vertical handover decision (VHD) technique between heterogeneous networks for seamless connectivity with constrained resources. The conventional VHD approach is mainly based on received signal strength (RSS). This approach is inefficient for vertical handover, since it always selects the target network with the strongest signal without taking into consideration factors such as quality of service (QoS), cost, and delay. In this paper, we present a hybrid approach integrating a multi-criteria based VHD (MCVHD) technique with a fuzzy-logic algorithm for efficient VHD among Wi-Fi, radio, and satellite networks. MCVHD provides a lightweight solution that aims to achieve seamless connectivity for a mobile IoT edge gateway over a set of heterogeneous networks. The proposed solution is evaluated in real time using a testbed containing real IoT devices. Further, the testbed is integrated with lightweight and efficient software techniques, e.g., microservices, containers, a broker, and edge/cloud techniques. The experimental results show that the proposed approach is suitable for an IoT environment and outperforms conventional RSS-quality-based VHD by minimizing handover failures, unnecessary handovers, handover time, and cost of service.
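The contrast with RSS-only handover is easiest to see in a weighted multi-criteria score. This is a minimal stand-in for the MCVHD ranking described above; the criterion values, weights, and the simple additive scoring (rather than the paper's fuzzy inference) are all assumptions made here for illustration.

```python
# Toy multi-criteria network ranking: benefit criteria add, cost criteria
# (cost, delay: lower is better) enter as (1 - value). Numbers are made up.

NETWORKS = {
    "wifi":      {"rss": 0.9, "qos": 0.8, "cost": 0.2, "delay": 0.1},
    "radio":     {"rss": 0.7, "qos": 0.6, "cost": 0.5, "delay": 0.3},
    "satellite": {"rss": 0.5, "qos": 0.7, "cost": 0.9, "delay": 0.8},
}
WEIGHTS = {"rss": 0.3, "qos": 0.3, "cost": 0.2, "delay": 0.2}  # sum to 1

def score(net):
    s = WEIGHTS["rss"] * net["rss"] + WEIGHTS["qos"] * net["qos"]
    s += WEIGHTS["cost"] * (1 - net["cost"]) + WEIGHTS["delay"] * (1 - net["delay"])
    return s

best = max(NETWORKS, key=lambda n: score(NETWORKS[n]))
print(best, round(score(NETWORKS[best]), 3))
```

An RSS-only rule would also pick Wi-Fi here, but lower the `rss` weight or raise `cost`, and the ranking can flip even while Wi-Fi keeps the strongest signal, which is exactly the failure mode of conventional VHD that the multi-criteria formulation avoids.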
Abstract: To address the challenges of large-scale distributed battery management in real-time monitoring, data fusion, and multi-vendor compatibility, a solution based on Narrow Band Internet of Things (NB-IoT) technology is introduced. By building a low-power NB-IoT monitoring network, an edge-intelligence control mechanism, and a standardized access platform, efficient access, state awareness, and unified management of large-scale heterogeneous battery banks are achieved. Application results in scenarios such as data centers and smart grids show that the NB-IoT solution significantly improves data transmission efficiency, monitoring response speed, and fault early-warning capability, providing strong support for the intelligent operation and maintenance of distributed energy storage facilities.
Abstract: The rapid expansion of the Internet of Things (IoT) has driven the need for advanced computational frameworks capable of handling the complex data processing and security challenges that modern IoT applications demand. However, traditional cloud computing frameworks face significant latency, scalability, and security issues. Quantum-Edge Cloud Computing (QECC) offers an innovative solution by integrating the computational power of quantum computing with the low-latency advantages of edge computing and the scalability of cloud computing resources. This study is grounded in an extensive literature review and performance metrics data from Bangladesh, focusing on smart city infrastructure, healthcare monitoring, and the industrial IoT sector. The discussion covers vital elements, including the integration of quantum cryptography to enhance data security, the critical role of edge computing in reducing response times, and cloud computing's ability to support large-scale IoT networks with its extensive resources. Through case studies such as the application of quantum sensors in autonomous vehicles, the practical impact of QECC is demonstrated. Additionally, the paper outlines future research opportunities, including developing quantum-resistant encryption techniques and optimizing quantum algorithms for edge computing. The convergence of these technologies in QECC has the potential to overcome the current limitations of IoT frameworks, setting a new standard for future IoT applications.
Funding: Partially supported by the Beijing Advanced Innovation Center for Future Blockchain and Privacy Computing and the Fundamental Research Funds for the Central Universities (No. 2024ZCJH07).
Abstract: The Metaverse, envisioned as the next evolution of the Internet, is expected to develop into an innovative medium advancing information civilization. Its core characteristics, including ubiquity, seamlessness, immersion, interoperability, and metaspatiotemporality, are catalyzing the development of multiple technologies and fostering a convergence between the physical and virtual worlds. Despite its potential, the critical concept of symbiosis, which involves the synchronous generation and management of virtuality from reality and serves as the cornerstone of this convergence, is often overlooked. Additionally, cumbersome service designs, stemming from the intricate interplay of various technologies and inefficient resource utilization, are impeding an ideal Metaverse ecosystem. To address these challenges, we propose a bi-model Parallel Symbiotic Metaverse (PSM) system, engineered on a Cybertwin-enabled 6G framework in which Cybertwins mirror Sensing Devices (SDs) and serve as bridging autonomous agents. Based on this framework, the system is structured into two models. In the queue model, SDs capture environmental data that Cybertwins then coordinate and schedule. In the service model, Cybertwins manage service requests and collaborate with SDs to make responsive decisions. We incorporate two algorithms to address resource scheduling and virtual service responses, showcasing the synergistic role of Cybertwins. Moreover, our PSM system advocates the participation of SDs from collaborators, enhancing performance while reducing operational costs for the Virtual Service Operator (VSO). Finally, we comparatively analyze the efficiency and complexity of the proposed algorithms and demonstrate the efficacy of the PSM system across multiple performance indicators. The results indicate that our system can be deployed cost-effectively with Cybertwin-enabled 6G.