Abstract: To address the resource-scheduling problem of time-varying IoT services with multiple Quality of Service (QoS) requirements in highly dynamic low-Earth-orbit (LEO) satellite scenarios, a Lyapunov-optimization-based dynamic management method for Narrow Band Internet of Things (NB-IoT) multi-service resource slicing is proposed. The method jointly considers the QoS requirements of multiple IoT services, the queue states of different QoS classes, and the dynamic partitioning of slice sizes, and formulates a resource-scheduling optimization problem for dynamic slice management. Using Lyapunov optimization theory, the non-convex multi-slot dynamic slice-partitioning problem is transformed into a single-slot multi-QoS slice-configuration problem, achieving dynamic matching between resource slices and multi-QoS service queues under time-varying traffic. Simulation results show that, compared with conventional NB-IoT uplink resource-scheduling methods, the proposed method significantly improves the QoS guarantees and throughput of delay-deterministic services in highly dynamic LEO scenarios.
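As a rough intuition for the queue-aware scheduling above, here is a minimal, penalty-free sketch of the max-weight rule that Lyapunov drift minimization reduces to: each slot, give resource units to the longest QoS queues, then apply the standard queue update Q(t+1) = max(Q(t) − b(t), 0) + a(t). This is illustrative only; the arrival and budget numbers are not from the paper.

```python
import random

def max_weight_allocate(queues, budget):
    """Assign each of `budget` service units to the currently longest queue."""
    alloc = [0] * len(queues)
    work = list(queues)
    for _ in range(budget):
        i = max(range(len(work)), key=lambda k: work[k])
        if work[i] == 0:          # all queues empty: nothing left to serve
            break
        alloc[i] += 1
        work[i] -= 1
    return alloc

random.seed(0)
queues = [0, 0, 0]                # one backlog queue per QoS class
for t in range(1000):
    served = max_weight_allocate(queues, budget=3)
    arrivals = [random.randint(0, 1) for _ in queues]   # a_i(t)
    # queue update: Q_i(t+1) = max(Q_i(t) - b_i(t), 0) + a_i(t)
    queues = [max(q - s, 0) + a for q, s, a in zip(queues, served, arrivals)]

print(queues)   # backlogs stay small: service keeps pace with arrivals
```

Serving the longest queue first minimizes the Lyapunov drift term each slot, which is why backlogs stay bounded whenever the service budget covers the average arrival rate.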
Abstract: To overcome the low resolution of conventional level-crossing (LC) analog-to-digital converters (ADCs) and the high power consumption of noise-shaping successive-approximation-register (NS SAR) ADCs, an LC-NS SAR ADC is proposed for acquiring the random sparse signals found in mobile Internet of Things (IoT) applications. An 8-bit LC ADC is placed in front of the NS SAR ADC as an activity pre-detection circuit for the input signal, and an NS SAR conversion is started only after a level crossing occurs. The integration step of the second-order passive noise-shaping circuit likewise runs only after an event is triggered, so the overall power consumption scales with input-signal activity. Simulated in a 1.8 V, 180 nm CMOS process at a sampling rate of 40 kS/s, an oversampling ratio (OSR) of 20, and a bandwidth of 1 kHz, the ADC achieves a signal-to-noise-and-distortion ratio (SNDR) of 87 dB with a power consumption of 2.70 μW; with an electrocardiogram input the power is only 0.79 μW. Compared with a conventional uniformly sampled Nyquist ADC, the number of samples is reduced by 73%, yielding a data-compression ratio of about 5:1 for biomedical signals. The Schreier and Walden figures of merit (FoMs, FoMw) are 172.6 dB and 67.0 fJ/conv.step, respectively.
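The sample-count reduction above comes from the event-triggered nature of level-crossing sampling. A hedged behavioral sketch (not the paper's circuit): keep a sample only when the input has moved by at least one LSB of an 8-bit quantizer, so quiet stretches of a sparse signal produce almost no conversions. The test signal below is a made-up stand-in for an ECG beat.

```python
import math

def level_crossing_sample(signal, lsb):
    """Return (index, value) pairs where the input crossed a quantizer level."""
    samples, last = [], signal[0]
    for t, x in enumerate(signal):
        if abs(x - last) >= lsb:       # a level was crossed: trigger a conversion
            samples.append((t, x))
            last = x
    return samples

# Mostly flat signal with a short burst of activity (stand-in for a heartbeat).
sig = [0.0] * 400 + [math.sin(2 * math.pi * k / 40) for k in range(200)] + [0.0] * 400
events = level_crossing_sample(sig, lsb=2.0 / 2**8)   # 8-bit LSB over a 2 V range
print(len(events), len(sig))   # far fewer event samples than uniform samples
```

The flat regions generate no events at all, which is the mechanism that lets the NS SAR back end (and its power draw) stay idle between bursts of activity.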
Abstract: For a cognitive Internet of Things (C-IoT) communication system assisted by an intelligent reflecting surface (IRS) in the presence of an eavesdropper, a secrecy-rate optimization scheme based on joint beamforming is proposed. The system model is a multiple-input single-output (MISO) scenario comprising a transmitter, a primary user, a secondary user, an eavesdropper, and an IRS. On this model, a secrecy-rate optimization problem is formulated: jointly optimize the active and passive beamforming to maximize the system secrecy rate (SR), subject to the transmitter's total-power constraint, the interference-power constraint at the primary user, and the unit-modulus constraint on the IRS elements. Because the formulated problem is non-convex, alternating optimization is used to decompose it into two subproblems: optimization of the transmit beamforming matrix and optimization of the IRS phase-shift matrix. The transmit beamforming matrix is optimized with semidefinite relaxation and successive convex approximation, and the IRS phase-shift matrix with the Dinkelbach method combined with successive convex approximation. Simulation results show that, in a MISO system with an eavesdropper, introducing an IRS and jointly optimizing active and passive beamforming effectively improves the system secrecy rate.
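The objective being maximized here is the standard secrecy-rate metric, SR = [log2(1 + SNR_user) − log2(1 + SNR_eavesdropper)]^+, in bits/s/Hz. A small sketch of that formula follows; the SNR values are illustrative numbers, not results from the paper.

```python
import math

def secrecy_rate(snr_user, snr_eve):
    """SR = [log2(1 + SNR_user) - log2(1 + SNR_eve)]^+ in bits/s/Hz."""
    return max(0.0, math.log2(1 + snr_user) - math.log2(1 + snr_eve))

# Better joint beamforming raises the legitimate SNR and lowers the
# eavesdropper's, which is exactly what increases SR:
print(secrecy_rate(snr_user=100, snr_eve=10))  # positive secrecy rate
print(secrecy_rate(snr_user=10, snr_eve=100))  # clipped to 0: no secure rate
```

The clipping at zero reflects that no secure rate exists once the eavesdropper's channel is better than the legitimate user's, which is why steering IRS reflections away from the eavesdropper matters.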
Abstract: To improve the timeliness and accuracy of fault detection in railway power-supply and distribution systems, a fault-data identification technique based on Narrow Band Internet of Things (NB-IoT) communication is proposed, and a complete system model for data acquisition, transmission, and identification is constructed. The results show that the technique performs well in fault-identification accuracy and response speed, providing effective support for the stable operation of railway power-supply and distribution systems in complex environments.
Funding: Supported and funded by the Deanship of Scientific Research at Imam Mohammad Ibn Saud Islamic University (IMSIU), grant number IMSIU-DDRSP2503.
Abstract: In recent years, fog computing has become an important environment for handling the Internet of Things. Fog computing was developed to handle large-scale big data by scheduling tasks via cloud computing. Task scheduling is crucial for efficiently handling IoT user requests, thereby improving system performance, cost, and energy consumption across nodes. With large volumes of data and user requests, finding the optimal solution to the task-scheduling problem is challenging, particularly in terms of cost and energy efficiency. In this paper, we develop novel strategies to save energy across fog nodes while users execute tasks over least-cost paths. Task scheduling uses a modified Artificial Ecosystem Optimization (AEO) combined with operators from the Salp Swarm Algorithm (SSA), so that the two compete during the exploitation phase of the optimal search process. The proposed strategy, Enhancement Artificial Ecosystem Optimization Salp Swarm Algorithm (EAEOSSA), seeks the most suitable solution to the multi-objective task-scheduling problem that combines cost and energy. A knapsack formulation is also added in the iFogSim implementation to improve both cost and energy. The proposed strategy was compared with other strategies in terms of time, cost, energy, and productivity. Simulation results demonstrate that the proposed algorithm improves average cost, average energy consumption, and mean service time in most scenarios, with average reductions of up to 21.15% in cost and 25.8% in energy consumption.
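The multi-objective part of the problem can be pictured as a weighted-sum fitness over a task-to-node assignment, which is the kind of objective a metaheuristic like EAEOSSA would minimize. The sketch below uses a plain random search as a stand-in for the population loop; the cost/energy tables and weights are made-up numbers, not the paper's.

```python
import random

COST = [[4, 2, 7], [3, 5, 1], [6, 2, 3], [5, 4, 2]]    # cost[task][node]
ENERGY = [[2, 3, 1], [4, 1, 2], [1, 3, 2], [2, 2, 3]]  # energy[task][node]

def fitness(assign, w_cost=0.5, w_energy=0.5):
    """Weighted sum of total cost and total energy for a task->node mapping."""
    cost = sum(COST[t][n] for t, n in enumerate(assign))
    energy = sum(ENERGY[t][n] for t, n in enumerate(assign))
    return w_cost * cost + w_energy * energy

# Random-search stand-in for the metaheuristic's candidate-generation loop:
random.seed(1)
best = min((tuple(random.randrange(3) for _ in COST) for _ in range(500)),
           key=fitness)
print(best, fitness(best))
```

A real AEO/SSA hybrid would generate candidates by ecosystem and salp-chain operators instead of uniform sampling, but the fitness being driven down is the same combined cost-energy objective.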
Funding: Funded by Researchers Supporting Project Number (RSPD2025R947), King Saud University, Riyadh, Saudi Arabia.
Abstract: Effective resource management in the Internet of Things and fog computing is essential for efficient and scalable networks. However, existing methods often fail in dynamic and high-demand environments, leading to resource bottlenecks and increased energy consumption. This study addresses these limitations with the Quantum Inspired Adaptive Resource Management (QIARM) model, which introduces novel algorithms inspired by quantum principles for enhanced resource allocation. QIARM employs a quantum-superposition-inspired technique for multi-state resource representation and an adaptive learning component that dynamically adjusts resources in real time. In addition, an energy-aware scheduling module minimizes power consumption by selecting optimal configurations based on energy metrics. The simulation was carried out in a 360-minute environment with eight distinct scenarios. The framework achieves up to 98% task-offload success and reduces energy consumption by 20%, addressing critical challenges of scalability and efficiency in dynamic fog computing environments.
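The energy-aware scheduling step can be sketched as a simple feasibility-then-energy selection: among candidate node configurations that can serve the current load, pick the one with the lowest energy metric. The configuration table and names below are illustrative assumptions, not values from the paper.

```python
CONFIGS = [
    {"name": "low-power", "capacity": 40, "watts": 18},
    {"name": "balanced",  "capacity": 80, "watts": 35},
    {"name": "turbo",     "capacity": 120, "watts": 70},
]

def pick_config(load):
    """Choose the lowest-power configuration whose capacity covers `load`."""
    feasible = [c for c in CONFIGS if c["capacity"] >= load]
    if not feasible:                      # overload: fall back to max capacity
        return max(CONFIGS, key=lambda c: c["capacity"])
    return min(feasible, key=lambda c: c["watts"])

print(pick_config(30)["name"])   # light load -> lowest-power feasible config
print(pick_config(100)["name"])  # heavy load -> only 'turbo' can serve it
```

An adaptive-learning component would additionally update the load forecast feeding `pick_config`, so configurations are chosen against predicted rather than instantaneous demand.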
Abstract: Modern industrial environments require uninterrupted machinery operation to maintain productivity standards while ensuring safety and minimizing costs. Conventional maintenance methods, such as reactive maintenance (i.e., run to failure) or time-based preventive maintenance (i.e., scheduled servicing), prove ineffective for complex systems with many Internet of Things (IoT) devices and sensors because they fall short in detecting faults at the early stages, when detection is most crucial. This paper presents a predictive maintenance framework based on a hybrid deep learning model that integrates the capabilities of Long Short-Term Memory (LSTM) networks and Convolutional Neural Networks (CNNs). The framework combines spatial feature extraction and temporal sequence modeling to accurately classify the health state of industrial equipment into three categories: Normal, Require Maintenance, and Failed. It uses a modular pipeline that includes IoT-enabled data collection along with secure transmission methods to manage cloud storage and provide real-time fault classification. The FD004 subset of the NASA C-MAPSS dataset, containing multivariate sensor readings from aircraft engines, serves as the training and evaluation data. Experimental results show that the LSTM-CNN model outperforms baseline models such as LSTM-SVM and LSTM-RNN, achieving an overall average accuracy of 86.66%, precision of 86.00%, recall of 86.33%, and F1-score of 86.33%. In contrast to previous LSTM-CNN-based predictive maintenance models that either provide a binary classification or rely on synthetically balanced data, our paper provides a three-class maintenance state (i.e., Normal, Require Maintenance, and Failed) along with threshold-based labeling that retains the true nature of the degradation. In addition, our work provides an IoT-to-cloud modular architecture for deployment and offers Computerized Maintenance Management System (CMMS) integration, making the proposed solution not only technically sound but also practical and innovative. The solution achieves real-world industrial deployment readiness through its reliable performance and scalable system design.
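The three-class, threshold-based labeling idea can be sketched as mapping a remaining-useful-life (RUL) style degradation score to Normal / Require Maintenance / Failed. The two thresholds and the trajectory below are hypothetical, not the paper's values.

```python
def health_label(rul_cycles, warn=50, fail=10):
    """Map an RUL-style score to one of three maintenance states."""
    if rul_cycles <= fail:
        return "Failed"
    if rul_cycles <= warn:
        return "Require Maintenance"
    return "Normal"

# The labels retain the shape of the degradation instead of a binary split:
trajectory = [180, 120, 60, 45, 20, 8, 2]   # hypothetical engine RUL over time
print([health_label(r) for r in trajectory])
```

In the framework, labels produced this way supervise the LSTM-CNN classifier, so the intermediate "Require Maintenance" band is learned rather than collapsed into a healthy/failed binary.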
Funding: Supported by the Deanship of Graduate Studies and Scientific Research, Jazan University, Saudi Arabia, through Project Number ISP-2024.
Abstract: Efficient resource management within Internet of Things (IoT) environments remains a pressing challenge due to the increasing number of devices and their diverse functionalities. This study introduces a neural-network-based model that uses Long Short-Term Memory (LSTM) to optimize resource allocation under dynamically changing conditions. Designed to monitor the workload on individual IoT nodes, the model incorporates long-term data dependencies, enabling adaptive resource distribution in real time. The training process utilizes Min-Max normalization and grid search for hyperparameter tuning, ensuring high resource utilization and consistent performance. The simulation results demonstrate the effectiveness of the proposed method, outperforming state-of-the-art approaches including Dynamic and Efficient Enhanced Load-Balancing (DEELB), Optimized Scheduling and Collaborative Active Resource-management (OSCAR), Convolutional Neural Network with Monarch Butterfly Optimization (CNN-MBO), and Autonomic Workload Prediction and Resource Allocation for Fog (AWPR-FOG). For example, in scenarios with low system utilization, the model achieved a resource-utilization efficiency of 95% while maintaining a latency of just 15 ms, significantly exceeding the performance of comparative methods.
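Of the preprocessing steps named above, Min-Max normalization is the one that directly conditions the LSTM inputs: each feature is rescaled to [0, 1]. A pure-Python sketch, with made-up workload numbers:

```python
def min_max(values):
    """Rescale a feature column to [0, 1]; constant columns map to 0.0."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

workload = [220, 140, 300, 180]       # hypothetical per-node workload samples
print(min_max(workload))              # min maps to 0.0, max maps to 1.0
```

Keeping every feature on the same [0, 1] scale prevents high-magnitude counters from dominating the LSTM's gradient updates, which is why it pairs naturally with grid-searched hyperparameters.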
Funding: Supported by Princess Nourah bint Abdulrahman University Researchers Supporting Project Number (PNURSP2025R97), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Abstract: The exponential growth of the Internet of Things (IoT) has introduced significant security challenges, with zero-day attacks emerging as one of the most critical threats. Traditional Machine Learning (ML) and Deep Learning (DL) techniques have demonstrated promising early detection capabilities. However, their effectiveness is limited when handling the vast volumes of IoT-generated data due to scalability constraints, high computational costs, and the time-intensive process of data labeling. To address these challenges, this study proposes a Federated Learning (FL) framework that leverages collaborative and hybrid supervised learning to enhance cyber-threat detection in IoT networks. By employing Deep Neural Networks (DNNs) and decentralized model training, the approach reduces computational complexity while improving detection accuracy. The proposed model demonstrates robust performance, achieving accuracies of 94.34%, 99.95%, and 87.94% on the publicly available Kitsune, Bot-IoT, and UNSW-NB15 datasets, respectively. Furthermore, its ability to detect zero-day attacks is validated through evaluations on two additional benchmark datasets, TON-IoT and IoT-23, using a Deep Federated Learning (DFL) framework, underscoring the generalization and effectiveness of the model in heterogeneous and decentralized IoT environments. Experimental results demonstrate superior performance over existing methods, establishing the proposed framework as an efficient and scalable solution for IoT security.
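The decentralized-training step in FL frameworks of this kind is typically federated averaging: each client trains on its own traffic, and the server averages parameter vectors weighted by client sample counts. A hedged toy sketch (the parameter vectors and sample counts are made up, and the abstract does not specify the exact aggregation rule):

```python
def fed_avg(client_weights, client_sizes):
    """Sample-count-weighted average of per-client parameter vectors."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]

clients = [[0.2, 0.4], [0.6, 0.0], [0.4, 0.8]]   # per-client model parameters
sizes = [100, 100, 200]                          # samples held by each client
print(fed_avg(clients, sizes))
```

Because only parameters (not raw flows) travel to the server, clients keep their traffic local, which is the property that makes FL attractive for IoT threat detection.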
Abstract: Advances in Building Information Modeling (BIM), Geographic Information System (GIS), and Internet of Things (IoT) technologies provide the technical foundation for an integrated operations-and-maintenance (O&M) management platform for smart cities. Centered on smart-city governance needs, this article proposes a platform architecture that fuses BIM, GIS, and IoT in three dimensions, focusing on key technologies including multi-source heterogeneous data integration and model fusion, real-time IoT monitoring, 3D visualization, and facility health diagnosis with predictive maintenance, in order to improve the O&M efficiency, risk early-warning capability, and decision-making quality of urban infrastructure, offering a new technical path and management model for smart-city construction.
Funding: Supported in part by the Hubei Engineering Research Center for BDS-Cloud High-Precision Deformation Monitoring Open Funding (No. HBBDGJ202507Y) and the National Natural Science Foundation of China (No. 62377037).
Abstract: The large-scale deployment of Internet of Things (IoT) technology across various aspects of daily life has significantly propelled the intelligent development of society. In particular, the integration of IoT and named data networks (NDNs) reduces network complexity and provides practical directions for content-oriented network design. However, ensuring data integrity in NDN-IoT applications remains a challenging issue. Very recently, Wang et al. (Entropy, 27(5), 471 (2025)) designed a certificateless aggregate signature (CLAS) scheme for NDN-IoT environments and stated that their construction was provably secure under various types of security attacks. Using theoretical analysis, in this work we reveal that their CLAS design fails to meet unforgeability, a core security requirement for CLAS schemes. In particular, we demonstrate that their scheme is vulnerable to a malicious public-key replacement attack, enabling an adversary to produce authentic signatures for arbitrary fraudulent messages. Therefore, Wang et al.'s design cannot achieve its goal. To address the issue, we systematically examine the root causes behind the vulnerability and propose a security-enhanced CLAS construction for NDN-IoT environments. We prove the security of our improved design under the standard security assumption and also analyze its practical performance by comparing the computational and communication costs with several related works. The comparison results show the practicality of our design.
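To see why a public-key replacement attack breaks unforgeability, consider a toy illustration (emphatically not Wang et al.'s actual scheme, and with no cryptographic strength): if the verifier checks a relation against whatever public key accompanies a message, an adversary can substitute a key pair it generated itself and "validly" sign any fraudulent message.

```python
P, G = 101, 2                 # tiny prime and generator (2 has order 100 mod 101)

def keygen(sk):        return pow(G, sk, P)                  # pk = g^sk mod p
def sign(sk, m):       return (m * sk) % (P - 1)             # toy "signature"
def verify(pk, m, s):  return pow(G, s, P) == pow(pk, m, P)  # g^s == pk^m ?

alice_pk = keygen(77)          # Alice's real key; her secret is never learned
evil_sk = 42
evil_pk = keygen(evil_sk)      # adversary's self-generated replacement key
forged = sign(evil_sk, 9)      # forge a signature on fraudulent message m = 9

print(verify(evil_pk, 9, forged))   # True: accepted under the replaced key
print(verify(alice_pk, 9, forged))  # False: rejected under Alice's real key
```

Certificateless schemes avoid certificates precisely to skip key validation, so their security proofs must bind signatures to the partial private key issued by the KGC; when that binding is missing, the replacement attack above goes through.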
Funding: Supported by the National Key Research and Development Program of China under grant 2022YFF0902701, the National Natural Science Foundation of China under grants U21A20468, 61972043, and 61921003, Zhejiang Lab under grant 2021PD0AB02, and the Fundamental Research Funds for the Central Universities under grant 2020XD-A07-1.
Abstract: Keeping an information fabric safe is critical and mandatory. For its associated Internet of Things (IoT) service system running on the open Internet, existing host-based monitoring methods may fail because they inspect only software, leaving the physical system unprotected. In this paper, a nonintrusive virtual machine (VM)-based runtime protection framework is provided to protect the physical system, using the isolated IoT services as a controlling means. Compared with existing solutions, the framework draws inconsistent and untrusted observation knowledge from multiple observation sources, and it enforces property policies concurrently and incrementally in a competing-game manner to avoid compositional problems. In addition, the monitoring is implemented without any modification to the protected system. Experiments are conducted to validate the proposed techniques.
Funding: This work is part of the "Intelligent and Cyber-Secure Platform for Adaptive Optimization in the Simultaneous Operation of Heterogeneous Autonomous Robots (PICRAH4.0)" project, reference MIG-20232082, funded by MCIN/AEI/10.13039/501100011033. It was also supported by the Universidad Internacional de La Rioja (UNIR) through the Precompetitive Research Project "Nuevos Horizontes en Internet de las Cosas y NewSpace (NEWIOT)", reference PP-2024-13, funded under the 2024 Call for Research Projects.
文摘This work evaluates an architecture for decentralized authentication of Internet of Things (IoT) devices in Low Earth Orbit (LEO) satellite networks using IOTA Identity technology. To the best of our knowledge, it is the first proposal to integrate IOTA's Directed Acyclic Graph (DAG)-based identity framework into satellite IoT environments, enabling lightweight and distributed authentication under intermittent connectivity. The system leverages Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs) over the Tangle, eliminating the need for mining and sequential blocks. An identity management workflow is implemented that supports the creation, validation, deactivation, and reactivation of IoT devices, and is experimentally validated on the Shimmer Testnet. Three metrics are defined and measured: resolution time, deactivation time, and reactivation time. To improve robustness, an algorithmic optimization is introduced that minimizes communication overhead and reduces latency during deactivation. The experimental results are compared with orbital simulations of satellite revisit times to assess operational feasibility. Unlike blockchain-based approaches, which typically suffer from high confirmation delays and scalability constraints, the proposed DAG architecture provides fast, cost-free operations suitable for resource-constrained IoT devices. The results show that authentication can be efficiently performed within satellite connectivity windows, positioning IOTA Identity as a viable solution for secure and scalable IoT authentication in LEO satellite networks.
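The feasibility question above reduces to a simple comparison: does each measured identity operation complete within a satellite contact window? A minimal sketch of that check is given below; the operation times and the 8-minute pass duration are illustrative placeholders, not the paper's measured values.

```python
def fits_in_window(op_times_s: dict, window_s: float) -> dict:
    """Map each operation name to whether its duration fits the contact window."""
    return {op: t <= window_s for op, t in op_times_s.items()}

# Hypothetical per-operation latencies (seconds) for the three measured metrics.
measured = {"resolution": 3.2, "deactivation": 5.1, "reactivation": 4.4}
contact_window = 480.0  # an assumed 8-minute LEO pass

result = fits_in_window(measured, contact_window)
print(result)  # every operation fits comfortably inside the assumed pass
```

Combining such per-pass checks with revisit-time simulations then bounds the worst-case wait for an operation that misses a pass: roughly one revisit interval plus the operation's own latency.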
基金supported by the Institute of Information&Communications Technology Planning&Evaluation(IITP)grant funded by the Korea government(MSIT)(No.RS-2023-00235509Development of security monitoring technology based network behavior against encrypted cyber threats in ICT convergence environment).
文摘With the increasing emphasis on personal information protection, encryption through security protocols has emerged as a critical requirement in data transmission and reception processes. Nevertheless, IoT ecosystems comprise heterogeneous networks where outdated systems coexist with the latest devices, spanning a range from non-encrypted devices to fully encrypted ones. Given the limited visibility into payloads in this context, this study investigates AI-based attack detection methods that leverage encrypted traffic metadata, eliminating the need for decryption and minimizing system performance degradation, especially in light of these heterogeneous devices. Using the UNSW-NB15 and CICIoT-2023 datasets, encrypted and unencrypted traffic were categorized according to security protocol, and AI-based intrusion detection experiments were conducted for each traffic type based on metadata. To mitigate the problem of class imbalance, eight different data sampling techniques were applied. The effectiveness of these sampling techniques was then comparatively analyzed using two ensemble models and three Deep Learning (DL) models from various perspectives. The experimental results confirmed that metadata-based attack detection is feasible using only encrypted traffic. In the UNSW-NB15 dataset, the F1-score of encrypted traffic was approximately 0.98, which is 4.3% higher than that of unencrypted traffic (approximately 0.94). In addition, analysis of the encrypted traffic in the CICIoT-2023 dataset using the same method showed a significantly lower F1-score of roughly 0.43, indicating that the quality of the dataset and the preprocessing approach have a substantial impact on detection performance. Furthermore, when data sampling techniques were applied to encrypted traffic, the recall in the UNSW-NB15 (Encrypted) dataset improved by up to 23.0%, and in the CICIoT-2023 (Encrypted) dataset by 20.26%, showing a similar level of improvement. Notably, in CICIoT-2023, the F1-score and Receiver Operating Characteristic-Area Under the Curve (ROC-AUC) increased by 59.0% and 55.94%, respectively. These results suggest that data sampling can have a positive effect even in encrypted environments. However, the extent of the improvement may vary depending on data quality, model architecture, and sampling strategy.
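Random oversampling, one of the simplest of the sampling techniques compared above, duplicates minority-class samples until classes are balanced. The sketch below is a stdlib-only illustration of that idea together with a plain F1-score computation; a real pipeline would typically use a library such as imbalanced-learn, and the toy data here is not from either dataset.

```python
import random
from collections import Counter

def oversample(features, labels, seed=0):
    """Duplicate minority-class samples at random until all classes match the majority count."""
    rng = random.Random(seed)
    counts = Counter(labels)
    majority = max(counts.values())
    out_x, out_y = list(features), list(labels)
    for cls, n in counts.items():
        idx = [i for i, y in enumerate(labels) if y == cls]
        for _ in range(majority - n):       # only minority classes enter this loop
            i = rng.choice(idx)
            out_x.append(features[i])
            out_y.append(labels[i])
    return out_x, out_y

def f1(y_true, y_pred, positive=1):
    """F1 = 2*TP / (2*TP + FP + FN) for the positive class."""
    tp = sum(t == positive == p for t, p in zip(y_true, y_pred))
    fp = sum(p == positive != t for t, p in zip(y_true, y_pred))
    fn = sum(t == positive != p for t, p in zip(y_true, y_pred))
    return 2 * tp / (2 * tp + fp + fn) if tp else 0.0

# Toy imbalanced set: 3 benign flows vs 1 attack flow.
x = [[0.1], [0.2], [0.3], [0.9]]
y = [0, 0, 0, 1]
bx, by = oversample(x, y)
assert Counter(by) == Counter({0: 3, 1: 3})  # classes now balanced
```

Oversampling mainly lifts recall on the rare attack class, which matches the recall gains reported above; whether F1 also improves depends on how many false positives the rebalanced model introduces.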