Funding: This work is part of the ‘Intelligent and Cyber-Secure Platform for Adaptive Optimization in the Simultaneous Operation of Heterogeneous Autonomous Robots (PICRAH4.0)’ project, with reference MIG-20232082, funded by MCIN/AEI/10.13039/501100011033, and supported by the Universidad Internacional de La Rioja (UNIR) through the Precompetitive Research Project entitled “Nuevos Horizontes en Internet de las Cosas y NewSpace (NEWIOT)”, reference PP-2024-13, funded under the 2024 Call for Research Projects.
Abstract: This work evaluates an architecture for decentralized authentication of Internet of Things (IoT) devices in Low Earth Orbit (LEO) satellite networks using IOTA Identity technology. To the best of our knowledge, it is the first proposal to integrate IOTA’s Directed Acyclic Graph (DAG)-based identity framework into satellite IoT environments, enabling lightweight and distributed authentication under intermittent connectivity. The system leverages Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs) over the Tangle, eliminating the need for mining and sequential blocks. An identity management workflow is implemented that supports the creation, validation, deactivation, and reactivation of IoT devices, and is experimentally validated on the Shimmer Testnet. Three metrics are defined and measured: resolution time, deactivation time, and reactivation time. To improve robustness, an algorithmic optimization is introduced that minimizes communication overhead and reduces latency during deactivation. The experimental results are compared with orbital simulations of satellite revisit times to assess operational feasibility. Unlike blockchain-based approaches, which typically suffer from high confirmation delays and scalability constraints, the proposed DAG architecture provides fast, fee-free operations suitable for resource-constrained IoT devices. The results show that authentication can be efficiently performed within satellite connectivity windows, positioning IOTA Identity as a viable solution for secure and scalable IoT authentication in LEO satellite networks.
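The feasibility criterion above, that each identity operation must complete within a satellite connectivity window, can be sketched as a simple comparison. All latency and window figures below are hypothetical placeholders, not the paper's measured results, and the IOTA Identity SDK calls themselves are not reproduced:

```python
# Sketch: checking whether identity operations fit inside one satellite pass.
# All timing figures are illustrative assumptions, not experimental values.

def fits_in_window(op_latencies_s, contact_window_s):
    """Return, per operation, whether it completes within a single window."""
    return {op: t <= contact_window_s for op, t in op_latencies_s.items()}

# Hypothetical latencies for the paper's three metrics (seconds).
latencies = {"resolution": 2.1, "deactivation": 3.4, "reactivation": 2.8}

# A LEO pass over a ground IoT node might offer a few minutes of contact.
window = 300.0  # seconds, assumed

feasible = fits_in_window(latencies, window)
print(feasible)  # each operation fits well inside a 5-minute pass
```

In practice the window duration would come from the orbital revisit simulation rather than a constant.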
Funding: Supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. RS-2023-00235509, Development of security monitoring technology based network behavior against encrypted cyber threats in ICT convergence environment).
Abstract: With the increasing emphasis on personal information protection, encryption through security protocols has emerged as a critical requirement in data transmission and reception processes. Nevertheless, IoT ecosystems comprise heterogeneous networks where outdated systems coexist with the latest devices, spanning a range from non-encrypted to fully encrypted devices. Given the limited visibility into payloads in this context, this study investigates AI-based attack detection methods that leverage encrypted traffic metadata, eliminating the need for decryption and minimizing system performance degradation, especially in light of these heterogeneous devices. Using the UNSW-NB15 and CICIoT-2023 datasets, encrypted and unencrypted traffic were categorized according to security protocol, and AI-based intrusion detection experiments were conducted for each traffic type based on metadata. To mitigate the problem of class imbalance, eight different data sampling techniques were applied. The effectiveness of these sampling techniques was then comparatively analyzed using two ensemble models and three Deep Learning (DL) models from various perspectives. The experimental results confirmed that metadata-based attack detection is feasible using only encrypted traffic. In the UNSW-NB15 dataset, the F1-score of encrypted traffic was approximately 0.98, which is 4.3% higher than that of unencrypted traffic (approximately 0.94). In addition, analysis of the encrypted traffic in the CICIoT-2023 dataset using the same method showed a significantly lower F1-score of roughly 0.43, indicating that the quality of the dataset and the preprocessing approach have a substantial impact on detection performance. Furthermore, when data sampling techniques were applied to encrypted traffic, the recall in the UNSW-NB15 (Encrypted) dataset improved by up to 23.0%, and in the CICIoT-2023 (Encrypted) dataset by 20.26%, a similar level of improvement. Notably, in CICIoT-2023, the F1-score and Receiver Operating Characteristic-Area Under the Curve (ROC-AUC) increased by 59.0% and 55.94%, respectively. These results suggest that data sampling can have a positive effect even in encrypted environments. However, the extent of the improvement may vary depending on data quality, model architecture, and sampling strategy.
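As one illustration of the class-imbalance mitigation step, the following sketch shows random oversampling of the minority (attack) class, one of the simpler members of the sampling-technique family; the study's exact eight techniques are not reproduced, and the data here is synthetic flow metadata, not UNSW-NB15:

```python
# Sketch: random oversampling of the minority class before training.
# Synthetic stand-in data; not the datasets or techniques from the study.
import numpy as np

def random_oversample(X, y, seed=0):
    """Duplicate minority-class rows until both classes are the same size."""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y, return_counts=True)
    minority = classes[np.argmin(counts)]
    deficit = counts.max() - counts.min()
    idx = np.flatnonzero(y == minority)
    extra = rng.choice(idx, size=deficit, replace=True)
    X_bal = np.vstack([X, X[extra]])
    y_bal = np.concatenate([y, y[extra]])
    return X_bal, y_bal

# 95% benign / 5% attack, as flow-metadata feature vectors (synthetic).
X = np.random.default_rng(1).normal(size=(1000, 8))
y = (np.arange(1000) < 50).astype(int)

X_bal, y_bal = random_oversample(X, y)
print(np.bincount(y_bal))  # both classes now equally represented: [950 950]
```

Oversampling inflates minority recall at the possible cost of precision, which is consistent with the recall-driven gains reported above.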
Abstract: The integration of machine learning (ML) technology with Internet of Things (IoT) systems produces essential changes in healthcare operations. Healthcare personnel can track patients around the clock thanks to healthcare IoT (H-IoT) technology, which also provides proactive statistical findings and precise medical diagnoses that enhance healthcare performance. This study examines how ML might support IoT-based healthcare systems, namely in the areas of prognostic systems, disease detection, patient tracking, and healthcare operations control. The study looks at the benefits and drawbacks of several machine learning techniques for H-IoT applications. It also examines the fundamental problems these systems face, such as data security and cyber threats, as well as their high processing demands. Alongside this, the study discusses the advantages of all the technologies involved, including machine learning, deep learning, and the Internet of Things, as well as the significant difficulties and problems that arise when integrating these technologies into healthcare forecasting.
Funding: Supported by Princess Nourah bint Abdulrahman University Researchers Supporting Project Number (PNURSP2025R97), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Abstract: The exponential growth of the Internet of Things (IoT) has introduced significant security challenges, with zero-day attacks emerging as one of the most critical and challenging threats. Traditional Machine Learning (ML) and Deep Learning (DL) techniques have demonstrated promising early detection capabilities. However, their effectiveness is limited when handling the vast volumes of IoT-generated data due to scalability constraints, high computational costs, and the costly, time-intensive process of data labeling. To address these challenges, this study proposes a Federated Learning (FL) framework that leverages collaborative and hybrid supervised learning to enhance cyber threat detection in IoT networks. By employing Deep Neural Networks (DNNs) and decentralized model training, the approach reduces computational complexity while improving detection accuracy. The proposed model demonstrates robust performance, achieving accuracies of 94.34%, 99.95%, and 87.94% on the publicly available Kitsune, Bot-IoT, and UNSW-NB15 datasets, respectively. Furthermore, its ability to detect zero-day attacks is validated through evaluations on two additional benchmark datasets, TON-IoT and IoT-23, using a Deep Federated Learning (DFL) framework, underscoring the generalization and effectiveness of the model in heterogeneous and decentralized IoT environments. Experimental results demonstrate superior performance over existing methods, establishing the proposed framework as an efficient and scalable solution for IoT security.
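The decentralized aggregation at the heart of such a framework is typically federated averaging (FedAvg); the abstract does not specify the exact aggregation rule, so the sketch below shows the generic size-weighted average over client parameters, with plain NumPy arrays standing in for DNN weights:

```python
# Sketch: federated averaging (FedAvg) of per-client model parameters.
# Generic illustration; not the paper's specific training procedure.
import numpy as np

def fed_avg(client_weights, client_sizes):
    """Average per-client parameter lists, weighted by local dataset size."""
    total = sum(client_sizes)
    n_layers = len(client_weights[0])
    return [
        sum(w[k] * (n / total) for w, n in zip(client_weights, client_sizes))
        for k in range(n_layers)
    ]

# Two clients, one-parameter "model": weights [2.0] and [4.0], 100 vs 300 samples.
global_w = fed_avg([[np.array([2.0])], [np.array([4.0])]], [100, 300])
print(global_w[0])  # prints [3.5]: the larger client dominates
```

Each training round, clients fit locally and only these parameters travel to the aggregator, which is what keeps raw IoT data decentralized.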
Abstract: Internet of Things (IoT) refers to the infrastructures that connect smart devices to the Internet, operating autonomously. This connectivity makes it possible to harvest vast quantities of data, creating new opportunities for the emergence of unprecedented knowledge. To ensure IoT security, various approaches have been implemented, such as authentication, encryption, and mechanisms to guarantee data integrity and availability. Among these approaches, Intrusion Detection Systems (IDS) are an effective security solution, whose performance can be enhanced by integrating various algorithms, including Machine Learning (ML) and Deep Learning (DL), enabling proactive and accurate detection of threats. This study proposes to optimize the performance of network IDS using an ensemble learning method based on a voting classification algorithm, combining the strengths of three powerful algorithms, Random Forest (RF), K-Nearest Neighbors (KNN), and Support Vector Machine (SVM), to detect both normal behavior and different categories of attack. Our analysis focuses primarily on the NSL-KDD dataset, while also integrating the recent Edge-IIoT dataset, tailored to industrial IoT environments. Experimental results show significant enhancements on the Edge-IIoT and NSL-KDD datasets, reaching accuracy levels between 72% and 99%, with precision between 87% and 99%, while recall values and F1-scores also range between 72% and 99%, for both normal and attack detection. Despite the promising results of this study, it suffers from certain limitations, notably the use of specific datasets and the lack of evaluations in a variety of environments. Future work could include applying this model to various datasets and evaluating more advanced ensemble strategies, with the aim of further enhancing the effectiveness of IDS.
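A voting ensemble over RF, KNN, and SVM can be sketched with scikit-learn's `VotingClassifier`; the synthetic data below merely stands in for NSL-KDD features, and the hyperparameters are illustrative assumptions rather than the study's tuned values:

```python
# Sketch: hard-voting ensemble of RF, KNN and SVM, as in a voting-based IDS.
# Synthetic stand-in data; hyperparameters are assumptions, not the paper's.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

clf = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
        ("svm", SVC(kernel="rbf", random_state=0)),
    ],
    voting="hard",  # majority vote over the three class predictions
)
clf.fit(X, y)
print(clf.score(X, y))  # training accuracy of the combined vote
```

With `voting="hard"` each base learner casts one class vote per flow; `voting="soft"` would instead average predicted probabilities, which requires probability-capable estimators.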
Abstract: The rapid advancement of the Internet of Things (IoT) has led to the proliferation of connected devices across various domains, including smart cities, industrial automation, and healthcare. However, interoperability challenges arising from heterogeneous communication protocols, diverse data formats, and fragmented standardization efforts hinder the seamless integration of IoT systems. This paper explores the current state of IoT interoperability, analyzing key challenges, existing standardization initiatives, and emerging technological solutions. We examine the role of middleware, gateway solutions, artificial intelligence (AI), blockchain, and edge computing in facilitating interoperability. Furthermore, we provide a comparative analysis of major IoT standards and discuss the potential for greater convergence among standardization efforts. The findings highlight that while significant progress has been made, a unified and widely accepted interoperability framework remains elusive. Addressing these challenges requires collaborative efforts among industry stakeholders, researchers, and policymakers to establish robust and scalable interoperability solutions, ensuring the continued growth and efficiency of IoT ecosystems.
Abstract: Due to their resource constraints, Internet of Things (IoT) devices require authentication mechanisms that are both secure and efficient. Elliptic curve cryptography (ECC) meets these needs by providing strong security with shorter key lengths, which significantly reduces the computational overhead required for authentication algorithms. This paper introduces a novel ECC-based IoT authentication system utilizing our previously proposed efficient mapping and reverse mapping operations on elliptic curves over prime fields. By reducing reliance on costly point multiplication, the proposed algorithm significantly improves execution time, storage requirements, and communication cost across varying security levels. The proposed authentication protocol demonstrates superior performance when benchmarked against relevant ECC-based schemes, achieving reductions of up to 35.83% in communication overhead, 62.51% in device-side storage consumption, and 71.96% in computational cost. The security robustness of the scheme is substantiated through formal analysis using the Automated Validation of Internet Security Protocols and Applications (AVISPA) tool and Burrows-Abadi-Needham (BAN) logic, complemented by a comprehensive informal analysis that confirms its resilience against various attack models, including impersonation, replay, and man-in-the-middle attacks. Empirical evaluation under simulated conditions demonstrates notable gains in efficiency and security. While these results indicate the protocol’s strong potential for scalable IoT deployments, further validation on real-world embedded platforms is required to confirm its applicability and robustness at scale.
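The "costly point multiplication" mentioned above is scalar multiplication by repeated doubling and addition. The paper's mapping/reverse-mapping scheme is not reproduced here; this is only a textbook double-and-add sketch on a deliberately tiny curve (y² = x³ + 2x + 2 mod 17, generator P = (5, 1)) to show the operation being economized:

```python
# Sketch: double-and-add scalar multiplication on a toy prime-field curve.
# Toy parameters for illustration only; real deployments use standard curves.

P_MOD = 17  # field prime
A = 2       # curve coefficient a in y^2 = x^3 + a*x + b

def ec_add(P, Q):
    """Add two affine points on the curve (None is the point at infinity)."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None  # P + (-P) = identity
    if P == Q:
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD)  # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, P_MOD)         # chord slope
    s %= P_MOD
    x3 = (s * s - x1 - x2) % P_MOD
    return (x3, (s * (x1 - x3) - y1) % P_MOD)

def scalar_mult(k, P):
    """Compute k*P via double-and-add, one doubling per bit of k."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

print(scalar_mult(2, (5, 1)))  # doubling the generator gives (6, 3)
```

Each authentication exchange typically needs several such multiplications, which is why replacing some of them with cheaper mapping operations pays off on constrained devices.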
Funding: Funded by the Ongoing Research Funding Program (ORF-2025-890), King Saud University, Riyadh, Saudi Arabia, and supported by the Competitive Research Fund of the University of Aizu, Japan.
Abstract: The exponential expansion of the Internet of Things (IoT), Industrial Internet of Things (IIoT), and Transportation Management of Things (TMoT) produces vast amounts of real-time streaming data. Ensuring system dependability, operational efficiency, and security depends on the identification of anomalies in these dynamic and resource-constrained systems. Due to their high computational requirements and inability to efficiently process continuous data streams, traditional anomaly detection techniques often fail in IoT systems. This work presents a resource-efficient adaptive anomaly detection model for real-time streaming data in IoT systems. Extensive experiments were carried out on multiple real-world datasets, achieving an average accuracy score of 96.06% with an execution time close to 7.5 milliseconds per streaming data point, demonstrating its potential for real-time, resource-constrained applications. The model uses Principal Component Analysis (PCA) for dimensionality reduction and a Z-score technique for anomaly detection. It maintains a low computational footprint with a sliding window mechanism, enabling incremental data processing and identification of both transient and sustained anomalies without storing historical data. The system uses a Multivariate Linear Regression (MLR)-based imputation technique that estimates missing or corrupted sensor values, preserving data integrity prior to anomaly detection. The suggested solution is appropriate for many uses in smart cities, industrial automation, environmental monitoring, IoT security, and intelligent transportation systems, and is particularly well-suited for resource-constrained edge devices.
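The sliding-window Z-score step can be sketched as follows. A synthetic 1-D signal stands in for a PCA-reduced component, and the window size and threshold are illustrative assumptions, not the model's tuned values:

```python
# Sketch: sliding-window Z-score anomaly detection on a reduced stream.
# Only the window is kept in memory, mirroring the low-footprint design.
import numpy as np
from collections import deque

def detect(stream, window=50, z_thresh=3.0):
    """Flag points whose Z-score against the current window exceeds z_thresh."""
    buf = deque(maxlen=window)  # bounded buffer: no full history stored
    flags = []
    for x in stream:
        if len(buf) >= 10:  # small warm-up before scoring
            mu, sigma = np.mean(buf), np.std(buf) + 1e-9
            flags.append(abs(x - mu) / sigma > z_thresh)
        else:
            flags.append(False)
        buf.append(x)
    return flags

rng = np.random.default_rng(0)
# Synthetic signal standing in for the first principal component.
signal = rng.normal(0.0, 1.0, 200)
signal[120] += 12.0  # inject a transient anomaly

flags = detect(signal)
print(flags[120])  # the injected spike is flagged
```

Because the deque evicts old points, a sustained shift eventually re-centers the window, so distinguishing transient from sustained anomalies would additionally track how long flags persist.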
Funding: This research was funded by the Science and Technology Project of State Grid Corporation of China under grant number 5200-202319382A-2-3-XG.
Abstract: Iced transmission line galloping poses a significant threat to the safety and reliability of power systems, leading directly to line tripping, disconnections, and power outages. Existing early warning methods for iced transmission line galloping suffer from issues such as reliance on a single data source, neglect of irregular time series, and lack of attention-based closed-loop feedback, resulting in high rates of missed and false alarms. To address these challenges, we propose an Internet of Things (IoT) empowered early warning method for transmission line galloping that integrates time series data from optical fiber sensing and weather forecasts. Initially, the method applies a primary adaptive weighted fusion to the IoT-empowered optical fiber real-time sensing data and weather forecast data, followed by a secondary fusion based on a Back Propagation (BP) neural network, and uses the K-medoids algorithm for clustering the fused data. Furthermore, an adaptive irregular time series perception adjustment module is introduced into the traditional Gated Recurrent Unit (GRU) network, and closed-loop feedback based on an attention mechanism is employed to update network parameters through gradient feedback of the loss function, enabling closed-loop training and time series data prediction of the GRU network model. Subsequently, considering various types of prediction data and the duration of icing, an iced transmission line galloping risk coefficient is established, and warnings are categorized based on this coefficient. Finally, using an IoT-driven realistic dataset of iced transmission line galloping, the effectiveness of the proposed method is validated through multi-dimensional simulation scenarios.
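One common form an "adaptive weighted fusion" of two sources can take is inverse-variance weighting, where the noisier source counts for less. The paper's exact weighting rule is not specified in the abstract, so the rule below is an illustrative assumption:

```python
# Sketch: adaptive weighted fusion of fiber-sensing and forecast readings.
# Inverse-variance weighting is an assumed rule, not the paper's exact one.
import numpy as np

def fuse(sensor, forecast, var_sensor, var_forecast):
    """Inverse-variance weighted average: lower-variance source weighs more."""
    w_s = 1.0 / var_sensor
    w_f = 1.0 / var_forecast
    return (w_s * sensor + w_f * forecast) / (w_s + w_f)

# Fiber sensing is trusted more (variance 1.0) than the forecast (variance 4.0).
fused = fuse(np.array([10.0, 12.0]), np.array([14.0, 16.0]), 1.0, 4.0)
print(fused)  # pulled toward the sensing values: [10.8 12.8]
```

In the proposed pipeline this primary fusion would feed the BP-network secondary fusion, with the variances themselves adapted from recent residuals rather than fixed.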
Abstract: To address the high cost, low accuracy, and poor timeliness of non-contact infrared temperature measurement for primary equipment in converter stations, a temperature monitoring scheme for high-voltage scenarios is proposed. The scheme combines 5G Passive Internet of Things (P-IoT) technology with a Transformer model. Passive temperature sensors are deployed at key locations on high-voltage equipment, backscatter communication enables low-power data transmission, and the data are carried over a 5G network to an edge server for processing. A Transformer-based anomaly detection model then uses a multi-head attention mechanism to effectively capture the temporal features of the temperature data, combined with max pooling to accurately identify and warn of abnormal temperatures. Experimental results show that the scheme achieves a transmission success rate of 99.0% in high-electromagnetic-interference environments, and precision, recall, and F1 scores of 98.7%, 97.5%, and 96.9% respectively in temperature anomaly detection, significantly outperforming traditional sequence models such as LSTM and GRU. The findings validate the applicability and stability of the proposed method in complex high-voltage scenarios and lay a technical foundation for extending it to ultra-high-voltage equipment at higher voltage levels.
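The attention-then-max-pooling path of such a detector can be approximated in plain NumPy with single-head scaled dot-product self-attention over a temperature window; the shapes, single head, and absence of learned projections are all simplifying assumptions relative to a real multi-head Transformer:

```python
# Sketch: scaled dot-product self-attention over a temperature window,
# followed by max pooling over time; a simplified stand-in for the
# multi-head Transformer detector (no learned projections, one head).
import numpy as np

def attention_pool(seq):
    """Self-attention over a (T, d) sequence, then max-pool over time."""
    d = seq.shape[-1]
    scores = seq @ seq.T / np.sqrt(d)              # (T, T) pairwise similarity
    scores -= scores.max(axis=-1, keepdims=True)   # stabilize softmax
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)       # rows sum to 1
    context = attn @ seq                           # (T, d) attended features
    return context.max(axis=0)                     # max pooling over time

rng = np.random.default_rng(0)
window = rng.normal(25.0, 0.5, size=(32, 4))  # normal operating temperatures
pooled = attention_pool(window)
print(pooled.shape)  # (4,): one pooled feature vector per window
```

In the full model a classification head on the pooled vector would emit the normal/abnormal decision, and multiple heads with learned query/key/value projections replace the raw dot products used here.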