Funding: Supported in part by the National Research Foundation of Korea (NRF) (No. RS-2025-00554650) and by the Chung-Ang University research grant in 2024.
Abstract: With the accelerated growth of the Internet of Things (IoT), real-time data processing on edge devices is increasingly important for reducing overhead and enhancing security by keeping sensitive data local. Since these devices often handle personal information under limited resources, cryptographic algorithms must be executed efficiently. Their computational characteristics strongly affect system performance, making it necessary to analyze resource impact and predict usage under diverse configurations. In this paper, we analyze the phase-level resource usage of AES variants, ChaCha20, ECC, and RSA on an edge device and develop a prediction model. We apply these algorithms under varying parallelism levels and execution strategies across the key-generation, encryption, and decryption phases. Based on this analysis, we train a unified Random Forest model using execution-context and temporal features, achieving R² values of up to 0.994 for power and 0.988 for temperature. Furthermore, the model maintains practical predictive performance even for cryptographic algorithms not included during training, demonstrating its ability to generalize across distinct computational characteristics. Our proposed approach reveals how execution characteristics and resource usage interact, supporting proactive resource planning and efficient deployment of cryptographic workloads on edge devices. Because our approach is grounded in phase-level computational characteristics rather than in any single algorithm, it provides generalizable insights that extend to a broader range of cryptographic algorithms exhibiting comparable phase-level execution patterns, and to heterogeneous edge architectures.
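The modeling setup described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature encoding (algorithm ID, phase, thread count, elapsed time, previous power sample) and the synthetic data are assumptions made for the sketch; the paper's model is trained on measurements collected from a real edge device.

```python
# Hedged sketch: a unified Random Forest regressor that predicts power and
# temperature from execution-context and temporal features, in the spirit of
# the abstract. All feature names and data here are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000

# Execution-context features (hypothetical encoding):
#   algo:    0=AES-128, 1=AES-256, 2=ChaCha20, 3=ECC, 4=RSA
#   phase:   0=key generation, 1=encryption, 2=decryption
#   threads: parallelism level
algo = rng.integers(0, 5, n)
phase = rng.integers(0, 3, n)
threads = rng.integers(1, 5, n)

# Temporal features: elapsed time within the phase, previous power sample.
elapsed = rng.uniform(0.0, 10.0, n)
prev_power = 2.0 + 0.5 * threads + rng.normal(0.0, 0.1, n)

# Synthetic targets loosely coupled to the features (stand-ins for real
# power/temperature telemetry from an edge device).
power = 1.5 + 0.6 * threads + 0.1 * algo + 0.05 * elapsed + rng.normal(0.0, 0.05, n)
temp = 40.0 + 2.0 * threads + 0.8 * power + rng.normal(0.0, 0.2, n)

X = np.column_stack([algo, phase, threads, elapsed, prev_power])
y = np.column_stack([power, temp])  # one multi-output model for both targets
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
r2_power = r2_score(y_te[:, 0], pred[:, 0])
r2_temp = r2_score(y_te[:, 1], pred[:, 1])
print(f"R^2 power: {r2_power:.3f}, R^2 temperature: {r2_temp:.3f}")
```

A single multi-output forest, as used here, mirrors the "unified" model of the abstract: one estimator serves all algorithms and phases, so generalization to an unseen algorithm can be probed by simply holding one `algo` value out of the training split.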