Accurate prediction of cloud resource utilization is critical: it helps improve service quality while avoiding resource waste and shortages. However, the time series of resource usage in cloud computing systems often exhibit multidimensionality, nonlinearity, and high volatility, making high-precision prediction of resource utilization a complex and challenging task. Existing cloud computing resource prediction methods include traditional statistical models, hybrid approaches combining machine learning with classical models, and deep learning techniques. Traditional statistical methods struggle with nonlinear prediction, hybrid methods face challenges in feature extraction and modeling long-term dependencies, and deep learning methods incur high computational costs; none of these achieves high-precision resource prediction in cloud computing systems. We therefore propose a new time series prediction model, called SDVformer, which extends the Informer model by integrating a Savitzky-Golay (SG) filter, a novel Discrete-Variation Self-Attention (DVSA) mechanism, and a type-aware mixture of experts (T-MOE) framework. The SG filter reduces noise and enhances the feature representation of the input data. The DVSA mechanism optimizes the selection of critical features to reduce computational complexity. The T-MOE framework adapts the model structure to different resource characteristics, thereby improving prediction accuracy and adaptability. Experimental results on both the Alibaba public dataset and a dataset collected by Beijing Jiaotong University (BJTU) show that SDVformer significantly outperforms baseline models, including Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM), and Informer, in terms of prediction precision. In particular, compared with the Informer model, the average Mean Squared Error (MSE) of SDVformer decreases by about 80%, demonstrating its advantages in complex time series prediction tasks in cloud computing systems.
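The abstract's denoising step can be illustrated with a standard Savitzky-Golay filter, which fits a low-order polynomial over a sliding window. This is a minimal sketch using SciPy's `savgol_filter` on a synthetic utilization trace; the window length (11), polynomial order (3), and the data itself are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic CPU-utilization trace: a slow daily cycle plus measurement noise.
# (Illustrative data only; window_length=11 and polyorder=3 below are
# assumptions, since the abstract does not state the paper's settings.)
rng = np.random.default_rng(0)
t = np.arange(288)                                  # e.g., 5-minute samples over one day
clean = 50 + 20 * np.sin(2 * np.pi * t / 288)       # underlying utilization pattern (%)
noisy = clean + rng.normal(0.0, 3.0, size=t.shape)  # observed, noisy series

# Savitzky-Golay smoothing: least-squares polynomial fit in each window.
smoothed = savgol_filter(noisy, window_length=11, polyorder=3)

# The smoothed series should track the underlying pattern more closely
# than the raw observations do.
mse_noisy = np.mean((noisy - clean) ** 2)
mse_smoothed = np.mean((smoothed - clean) ** 2)
```

Unlike a plain moving average, the polynomial fit preserves peaks and local curvature while suppressing high-frequency noise, which is why SG filtering is a common preprocessing choice for volatile resource-usage series.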