Funding: Supported by the Natural Sciences and Engineering Research Council (NSERC) of Canada, grant number RGPIN-2017-05866.
Abstract: Deep reinforcement learning (DRL) is a suitable approach to handle uncertainty in managing the energy consumption of buildings with energy storage systems. Conventionally, DRL agents are trained by randomly selecting samples from a data set, which can result in overexposure to some data categories and under- or no exposure to others. Thus, the trained model may be biased towards some data groups and underperform (provide suboptimal results) for data groups to which it was less exposed. To address this issue, a diversity-in-experience-based DRL agent training framework is proposed in this study. This approach ensures the exposure of agents to all types of data. The proposed framework is implemented in two steps. In the first step, raw data are grouped into different clusters using the K-means clustering method. The clustered data are then arranged by stacking the data of one cluster on top of another. In the second step, a selection algorithm is proposed to select data from each cluster to train the DRL agent. The frequency of selection from each cluster is proportional to the number of data points in that cluster; the method is therefore named the proportional selection method. To analyze the performance of the proposed approach and compare the results with the conventional random selection method, two indices are proposed in this study: the flatness index and the divergence index. The model is trained using different data sets (1-year, 3-year, and 5-year) and also with the inclusion of solar photovoltaics. The simulation results confirmed the superior performance of the proposed approach in flattening the building's load curve by optimally operating the energy storage system.
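The two-step pipeline described in this abstract lends itself to a compact sketch. The following Python fragment illustrates one plausible reading of it, K-means clustering followed by size-proportional sampling; the function and parameter names are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch of the proportional selection method: cluster the raw data
# with K-means, then draw training batches in which each cluster contributes
# samples in proportion to its size. Names here are assumptions for illustration.
import numpy as np
from sklearn.cluster import KMeans

def proportional_selection(data, n_clusters=4, batch_size=64, rng=None):
    """Return a training batch drawn proportionally from K-means clusters."""
    rng = rng if rng is not None else np.random.default_rng()

    # Step 1: group the raw data into clusters.
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(data)

    # Step 2: draw from each cluster in proportion to its population,
    # so every data category is represented in the batch.
    picks = []
    for k in range(n_clusters):
        members = np.flatnonzero(labels == k)
        n_draw = max(1, round(batch_size * len(members) / len(data)))
        picks.append(rng.choice(members, size=n_draw, replace=True))
    return data[np.concatenate(picks)]
```

Compared with uniform random sampling, this keeps rare clusters (e.g., extreme-load days) in every batch instead of leaving their representation to chance.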
Funding: Supported by the Natural Sciences and Engineering Research Council (NSERC) of Canada grant RGPIN-2024-04565 and by the NSERC/Alberta Innovates grant ALLRP 561116-20. Part of this work has taken place in the Intelligent Robot Learning (IRL) Lab at the University of Alberta, which is supported in part by research grants from the Alberta Machine Intelligence Institute (Amii), Canada; a Canada CIFAR AI Chair, Amii; the Digital Research Alliance of Canada; Huawei; Mitacs, Canada; and NSERC, Canada.
Abstract: As the energy landscape evolves towards sustainability, the accelerating integration of distributed energy resources poses challenges to the operability and reliability of the electricity grid. One significant aspect of this issue is the notable increase in net load variability at the grid edge. Transactive energy, implemented through local energy markets, has recently garnered attention as a promising solution to these grid challenges in the form of decentralized, indirect demand response on a community level. Model-free control approaches, such as deep reinforcement learning (DRL), show promise for the decentralized automation of participation within this context. Existing studies at the intersection of transactive energy and model-free control primarily focus on socioeconomic and self-consumption metrics, overlooking the crucial goal of reducing community-level net load variability. This study addresses this gap by training a set of deep reinforcement learning agents to automate end-user participation in an economy-driven, autonomous local energy market (ALEX). In this setting, agents do not share information and only prioritize individual bill optimization. The study unveils a clear correlation between bill reduction and reduced net load variability. The impact on net load variability is assessed over various time horizons using metrics such as ramping rate, daily and monthly load factor, and daily average and total peak export and import, on an open-source dataset. To examine the performance of the proposed DRL method, its agents are benchmarked against a near-optimal dynamic programming method, using a no-control scenario as the baseline. The dynamic programming benchmark reduces average daily import, export, and peak demand by 22.05%, 83.92%, and 24.09%, respectively. The RL agents demonstrate comparable or superior performance, with improvements of 21.93%, 84.46%, and 27.02% on these metrics. This demonstrates that DRL can be effectively employed for such tasks, as it is inherently scalable and achieves near-optimal performance in decentralized grid management.
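The variability metrics named in this abstract are standard power-systems quantities. Below is a minimal Python sketch of how ramping rate, daily load factor, and daily peak import/export could be computed from an hourly net-load series; the exact definitions used in the paper may differ.

```python
# Hedged sketch of common net-load variability metrics (assumed definitions,
# not the paper's exact formulas). net_load is a 1-D series in kW, one value
# per time step; positive = import, negative = export.
import numpy as np

def variability_metrics(net_load, steps_per_day=24):
    """Compute simple variability metrics over whole days of net-load data."""
    net_load = np.asarray(net_load, dtype=float)
    days = net_load.reshape(-1, steps_per_day)  # assumes whole days of data
    return {
        # Mean absolute step-to-step change in net load.
        "ramping_rate": float(np.mean(np.abs(np.diff(net_load)))),
        # Daily load factor: mean demand over peak demand, averaged over days.
        "daily_load_factor": float(np.mean(days.mean(axis=1) / days.max(axis=1))),
        # Average daily peak import and export.
        "avg_peak_import": float(np.mean(days.max(axis=1))),
        "avg_peak_export": float(np.mean(np.maximum(-days, 0).max(axis=1))),
    }
```

A lower ramping rate and a load factor closer to 1 both indicate a flatter community net-load profile, which is the variability-reduction goal the study targets.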
Abstract: Local energy markets are emerging as a tool for coordinating generation, storage, and consumption of energy from distributed resources. In combination with automation, they promise to provide an effective energy management framework that is fair and brings system-level savings. The cooperative–competitive nature of energy markets calls for multi-agent-based automation with learning energy trading agents. However, depending on the dynamics of the agent–environment interaction, this approach may yield unintended behavior of market participants. Thus, the design of market mechanisms suitable for reinforcement learning agents must take this interplay into account. This article introduces autonomous local energy exchange (ALEX) as an experimental framework that combines multi-agent learning and a double auction mechanism. Participants determine their internal price signals and make energy management decisions through market interactions, rather than relying on predetermined external price signals. The main contribution of this article is an examination of the compatibility between specific market elements and independent learning agents. The effects of different market properties are evaluated through simulation experiments, and the results are used to determine a suitable market design. The results show that market truthfulness maintains demand-response functionality, while weak budget balancing provides a strong reinforcement signal for the learning agents. The resulting agent behavior is compared with two baselines: net billing and time-of-use rates. The ALEX-based pricing is more responsive to fluctuations in the community net load than the time-of-use rates. The more accurate accounting of renewable energy usage reduced bills by a median 38.8% compared to net billing, confirming the ability to better facilitate demand response.
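A double auction of the kind ALEX builds on can be sketched in a few lines. The Python fragment below is a minimal illustration that assumes midpoint pricing between matched bids and asks; ALEX's actual clearing and settlement rules are specified in the article, not here.

```python
# Minimal sketch of double-auction clearing (midpoint pricing is an assumed
# rule for illustration; the paper evaluates specific market designs).
from typing import List, Tuple

def clear_double_auction(bids: List[Tuple[float, float]],
                         asks: List[Tuple[float, float]]):
    """bids/asks: (price in $/kWh, quantity in kWh).
    Returns matched trades as (quantity, clearing_price) pairs."""
    bids = sorted(bids, key=lambda b: -b[0])  # highest willingness-to-pay first
    asks = sorted(asks, key=lambda a: a[0])   # cheapest offers first
    trades, i, j = [], 0, 0
    while i < len(bids) and j < len(asks) and bids[i][0] >= asks[j][0]:
        qty = min(bids[i][1], asks[j][1])
        price = (bids[i][0] + asks[j][0]) / 2  # midpoint pricing (assumed)
        trades.append((qty, price))
        # Decrement remaining quantities; advance whichever side is exhausted.
        bids[i] = (bids[i][0], bids[i][1] - qty)
        asks[j] = (asks[j][0], asks[j][1] - qty)
        if bids[i][1] == 0:
            i += 1
        if asks[j][1] == 0:
            j += 1
    return trades
```

In such a mechanism the clearing prices emerge from participants' own bids and offers, which is what lets learning agents derive internal price signals from market interactions instead of a predetermined external tariff.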