Funding: Supported by the DEDALUS project (grant number 101103998), funded by the European Commission as part of the Horizon Europe Framework Programme, and by the Ministry of Research, Innovation and Digitization, CNCS/CCCDI-UEFISCDI, project number PN-IV-P8-8.1-PRE-HE-ORG-2023-0111, within PNCDI IV.
Abstract: Accurate forecasting of buildings' energy demand is essential for building operators to manage loads and resources efficiently, and for grid operators to balance local production with demand. However, current models still struggle to capture nonlinear relationships influenced by external factors such as weather and consumer behavior, often assume constant variance in energy data over time, and frequently fail to model sequential dependencies. To address these limitations, we propose a hybrid Transformer-based model with Liquid Neural Networks and learnable encodings for building energy forecasting. The model leverages dense layers to learn non-linear mappings that create embeddings capturing underlying patterns in time-series energy data. Additionally, a Convolutional Neural Network encoder is integrated to enhance the model's ability to understand temporal dynamics through spatial mappings. To address the limitations of classic attention mechanisms, we implement a reservoir processing module using Liquid Neural Networks, which introduces a controlled non-linearity through dynamic reservoir computing, enabling the model to capture complex patterns in the data. For model evaluation, we used both pilot data and state-of-the-art datasets to assess the model's performance across various building contexts, including large apartment and commercial buildings and small households, with and without on-site energy production. The proposed Transformer model demonstrates good predictive accuracy and training-time efficiency across various types of buildings and testing configurations. Specifically, SMAPE scores indicate a reduction in prediction error, with improvements ranging from 1.5% to 50% over basic Transformer, LSTM, and ANN models, while higher R² values further confirm the model's reliability in capturing energy time-series variance. The 8% improvement in training time over the basic Transformer model highlights the hybrid model's computational efficiency without compromising accuracy.
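The evaluation relies on SMAPE and R². As a point of reference, a minimal sketch of how these metrics are typically computed is shown below; note that several SMAPE variants exist and the exact formulation used in the paper is not specified here, so this uses the common symmetric-denominator form.

```python
import numpy as np

def smape(y_true, y_pred):
    """Symmetric mean absolute percentage error, in percent.

    Common variant: denominator is the mean of |y_true| and |y_pred|.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    denom = (np.abs(y_true) + np.abs(y_pred)) / 2.0
    denom = np.where(denom == 0.0, 1.0, denom)  # guard against 0/0
    return 100.0 * np.mean(np.abs(y_pred - y_true) / denom)

def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot
```

Lower SMAPE means smaller relative error (0% is a perfect fit), while R² closer to 1 indicates that more of the variance in the energy time series is captured by the forecast.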