Electrified heating systems with thermal storage, such as electric boilers and heat pumps, represent a major source of demand-side flexibility. Under current electricity market designs, balance responsible parties (BRPs) operating such assets are required to submit binding day-ahead electricity consumption schedules, which they typically do based on forecasts of heat demand and electricity prices. Common scheduling approaches implicitly assume that forecast uncertainty can be well characterized using historical forecast errors. In practice, however, the cumulative effect of uncertainty creates significant exposure to imbalance-price risk when the committed schedule cannot be followed. To address this, we propose a distributionally robust chance-constrained optimization framework for the day-ahead scheduling of a multi-MW electric boiler using only a limited set of residual forecast samples. We derive a tractable convex reformulation of the problem and calibrate the ambiguity set directly from historical forecast-error data through an a priori tunable risk parameter. Numerical results show that enforcing performance guarantees on the heat-demand balance constraint reduces demand violations by 40% compared to a deterministic forecast-based scheduler and by up to 10% relative to a nominal chance-constrained model with a fixed error distribution. Further, we show that modeling the real-time rebound cost of demand violations as a second-stage term can reduce the overall daily operating cost by up to 34% by hedging against highly volatile day-ahead electricity prices.
The ongoing green electrification is shifting energy production from centralized, fossil fuel-based generation to dispersed renewable electricity sources. Variability in renewable generation increases electricity price volatility, which incentivizes industrial assets to use their flexibility to reduce operating costs while contributing to grid stability. In this context, electrified heating systems with thermal storage, such as electric boilers and heat pumps, hold valuable demand-side flexibility potential because they can shift electricity consumption over time while still meeting heat demand [1], [2], [3].
In current electricity market designs, the balance responsible party (BRP) associated with an industrial load, such as an electric boiler, a heat pump, or a portfolio of loads and generators, submits a day-ahead energy schedule to the relevant system operator. The BRP may subsequently rebalance this position through intraday trading or balancing and ancillary services as forecasts and operating conditions change. Any remaining deviations between the nominated schedule and metered consumption or production are settled at imbalance prices, which can be highly volatile [4], [5].
Day-ahead scheduling is therefore highly dependent on the quality of heat-demand and electricity-price forecasts, which has fueled extensive research on short-term heat-demand prediction, ranging from autoregressive methods to multi-step machine-learning models [6], [7], [8]. Although these models often achieve strong average accuracy, forecast uncertainty persists, particularly during demand peaks and atypical operating conditions. Consequently, relying on point forecasts alone can still lead to unacceptably high exposure to unmet heat demand and the associated imbalance costs.
Imbalance-cost risk is commonly mitigated by purchasing, in the day-ahead market, a baseline electricity consumption that fully covers the expected heat demand. Any residual mismatch between the scheduled and realized heat demand, which is not balanced out in intraday or ancillary markets, is then covered by activating backup generation units such as gas-fired boilers. While effective for ensuring coverage of heat demand, this strategy often increases operating costs and carbon emissions, and it reduces the ability to capitalize on periods of low-cost, low-carbon renewable electricity.
An alternative approach is to incorporate forecast uncertainty explicitly during the planning optimization stage. In the literature, these problems are commonly framed as stochastic [9], [10], [11] or robust [12], [13] optimization problems. Robust optimization enforces feasibility for all realizations within a prescribed uncertainty set, which can lead to conservative bidding strategies, particularly in volatile markets. Alternatively, stochastic optimization enforces feasibility through suitable risk measures, as in chance-constrained problems (CCPs) [14], [15], [16]. In CCPs, hard constraints are replaced with probabilistic counterparts that must hold with probability at least 1 − α, where α represents the tolerated risk across a historical set of scenarios.
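As a generic illustration only (the notation here is not the specific model of this paper), an individual chance constraint on a decision x, with uncertain parameter ξ distributed according to P and constraint function g, can be written as
\[
\mathbb{P}_{\xi \sim P}\!\left[ g(x,\xi) \le 0 \right] \;\ge\; 1 - \alpha ,
\]
so that the hard requirement g(x, ξ) ≤ 0 for every realization is relaxed to hold with probability at least 1 − α, i.e., violations are tolerated on at most an α-fraction of outcomes.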
Exemplifying its application in the literature, [17] proposes a chance-constrained stochastic model predictive controller to lower operational expenses under uncertain thermal and electricity demands using Gaussian processes. Similarly, [18] develops a robust chance-constrained scheduling model that addresses uncertainties in both generation and consumption, reformulating the CCP as a mixed-integer linear program (MILP). Other applications of risk management in CCPs include [19], which proposes a CVaR-based, linearized, chance-constrained two-stage optimization, and [15], which studies the dynamic dispatch of electric boilers and the curtailment of renewable energy sources (RES) to address load and renewable uncertainty.
Finally, applications where constraint violations result in direct economic penalties, such as the activation of backup generators, are formulated within a two-stage stochastic programming framework. The feasibility of the balance constraint is ensured through a chance constraint, while the second-stage objective accounts for the expected cost arising from the realized forecast error.
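A generic template of such a two-stage chance-constrained formulation, written here purely as an illustrative sketch with hypothetical notation (first-stage schedule x, day-ahead price vector λ, forecast error ξ, and recourse cost Q), reads
\[
\min_{x \in \mathcal{X}} \;\; \lambda^{\top} x \;+\; \mathbb{E}_{\xi}\!\left[ Q(x,\xi) \right]
\quad \text{s.t.} \quad \mathbb{P}\!\left[ g(x,\xi) \le 0 \right] \ge 1 - \alpha ,
\]
where the second-stage cost Q(x, ξ) would capture, for instance, the expense of activating backup units to restore the heat balance once the error ξ is realized.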
However, the scheduling approaches above typically assume that the underlying data-generating distribution can be accurately inferred from historical data, thereby neglecting the risk induced by model misspecification and distribution shifts. In contrast, the recently proposed distributionally robust optimization (DRO) framework provides a principled approach to handling distributional uncertainty [20]. DRO hedges against such uncertainty by optimizing the worst-case expectation over an ambiguity set of probability distributions, defined as a so-called Wasserstein ball of prescribed radius centered at the empirical distribution.
DRO thus guarantees that the decision remains reliable under model misspecification or distribution shifts.
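In its standard form, and reusing the illustrative notation introduced above (with f denoting a generic cost function), the Wasserstein DRO problem can be sketched as
\[
\min_{x \in \mathcal{X}} \; \sup_{Q \in \mathcal{B}_{\varepsilon}(\hat{P}_N)} \mathbb{E}_{\xi \sim Q}\!\left[ f(x,\xi) \right],
\qquad
\mathcal{B}_{\varepsilon}(\hat{P}_N) = \left\{ Q \,:\, W\!\left(Q, \hat{P}_N\right) \le \varepsilon \right\},
\]
where the ambiguity set is centered at the empirical distribution of the N historical forecast-error samples, W(·, ·) denotes the Wasserstein distance, and the radius ε controls how much deviation from the empirical distribution the decision is hedged against.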