DUST: A Framework for Data-Driven Density Steering
We consider the problem of data-driven stochastic optimal control of an unknown LTI dynamical system. Assuming the process noise is normally distributed, we pose the problem of steering the state’s mean and covariance to a target normal distribution, under noisy data collected from the underlying system, a problem commonly referred to as covariance steering (CS). A novel framework for Data-driven Uncertainty quantification and density STeering (DUST) is presented that simultaneously characterizes the noise affecting the measured data and designs an optimal affine-feedback controller to steer the density of the state to a prescribed terminal value. We use both indirect and direct data-driven design approaches based on the notions of persistency of excitation and subspace identification to exactly represent the mean and covariance dynamics of the state in terms of the data and noise realizations. Since both the mean and the covariance steering sub-problems are plagued with stochastic uncertainty arising from noisy data collection, we first estimate the noise realization from this dataset and subsequently compute tractable upper bounds on the estimation errors. The first and second moment steering problems are then solved to optimality using techniques from robust control and robust optimization. Lastly, we present an alternative control design approach based on the certainty equivalence principle and interpret the problem as one of CS under multiplicative uncertainty. We analyze the performance and efficacy of each of these data-driven approaches using a case study and compare them with their model-based counterparts.
💡 Research Summary
The paper introduces a comprehensive data‑driven framework called DUST (Data‑driven Uncertainty quantification and density STeering) for steering the probability distribution of an unknown stochastic linear time‑invariant (LTI) system to a prescribed Gaussian target using only offline noisy input‑output data. The authors consider a discrete‑time system xₖ₊₁ = A xₖ + B uₖ + D wₖ, where wₖ is i.i.d. Gaussian and the matrices A, B, D are unknown. The goal is to drive the state from an initial Gaussian distribution N(μ_i, Σ_i) to a terminal Gaussian N(μ_f, Σ_f) over a finite horizon while minimizing a quadratic cost that separates into a mean‑related part and a covariance‑related part.
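The moment dynamics underlying this setup follow directly from the linear structure: the mean obeys μₖ₊₁ = Aμₖ + Buₖ and the covariance obeys Σₖ₊₁ = AΣₖAᵀ + DDᵀ (in open loop). A minimal sketch, using a hypothetical second‑order system instance (the true A, B, D are of course unknown to the controller in the paper's setting):

```python
import numpy as np

# Hypothetical toy instance of x_{k+1} = A x_k + B u_k + D w_k, w_k ~ N(0, I).
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
D = 0.05 * np.eye(2)

def propagate_moments(mu, Sigma, u):
    """Exact one-step recursion for the state's mean and covariance."""
    mu_next = A @ mu + B @ u
    Sigma_next = A @ Sigma @ A.T + D @ D.T
    return mu_next, Sigma_next

mu, Sigma = np.array([1.0, 0.0]), np.eye(2)
mu, Sigma = propagate_moments(mu, Sigma, np.array([0.5]))
```

Feedback terms (as in the affine state‑feedback policies the paper designs) would enter both recursions, coupling the control gains to the covariance evolution.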
To achieve this, the paper builds on two pillars of behavioral system theory: persistency of excitation (the Fundamental Lemma) and subspace identification. First, using persistently exciting data, the authors parameterize the unknown noise realizations that generated the dataset, thereby obtaining an exact representation of the mean dynamics in terms of data and unknown noise. This leads to an uncertain quadratic program for the mean steering sub‑problem (DD‑MS). Second, they employ subspace identification to directly express the feedback gains that affect the covariance dynamics, allowing the use of existing model‑based covariance steering (CS) theory to formulate a robust semidefinite program (SDP) for the covariance steering sub‑problem (DD‑CS).
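The persistency‑of‑excitation condition at the heart of the Fundamental Lemma is a rank condition on a block‑Hankel matrix built from the input data. A minimal sketch of that check (the depth L + n, with n the state dimension, follows the standard statement of the lemma; the trajectory here is synthetic):

```python
import numpy as np

def block_hankel(u, L):
    """Block-Hankel matrix with L block rows from a (T, m) input trajectory."""
    T, m = u.shape
    cols = T - L + 1
    H = np.zeros((L * m, cols))
    for i in range(L):
        H[i * m:(i + 1) * m, :] = u[i:i + cols].T
    return H

rng = np.random.default_rng(0)
T, m, L, n = 40, 1, 5, 2
u = rng.standard_normal((T, m))          # random inputs are generically exciting
H = block_hankel(u, L + n)               # depth L + n per the Fundamental Lemma
is_pe = np.linalg.matrix_rank(H) == (L + n) * m
```

If `is_pe` holds, every length‑L input/state trajectory of the system can be written as a linear combination of the Hankel matrix's columns, which is what lets the paper parameterize the unknown dynamics directly from data.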
A crucial contribution is the systematic estimation of the process noise from the same dataset. The authors combine maximum‑likelihood estimation (MLE) with neural‑network regression to obtain a point estimate of the noise trajectory and its covariance. High‑confidence error bounds are derived using chi‑square quantiles, yielding two uncertainty sets: Δ_model for the mean problem and Δ_noise for the covariance problem. These sets are incorporated as robust constraints, guaranteeing that the designed controller satisfies the terminal distribution constraints with a prescribed probability.
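The chi‑square quantile construction behind such high‑confidence sets can be illustrated as follows. This is only the generic ingredient, not the paper's actual bound (which additionally accounts for the data‑dependent structure of the noise estimator); the function name and parameters are illustrative:

```python
import numpy as np
from scipy.stats import chi2

def noise_error_bound(sigma2, dim, delta):
    """Radius r such that a chi-square distributed squared error
    ||w_hat - w||^2 (with per-component variance sigma2, `dim` degrees
    of freedom) exceeds r^2 with probability at most delta."""
    return np.sqrt(sigma2 * chi2.ppf(1.0 - delta, df=dim))

# e.g. 2 states over 20 steps, noise std 0.05, 95% confidence
r = noise_error_bound(sigma2=0.05**2, dim=2 * 20, delta=0.05)
```

Shrinking δ enlarges the radius, which is the usual trade‑off between confidence level and conservatism of the resulting robust constraints.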
Beyond the indirect (certainty‑equivalence) approach, the paper proposes a parametric‑uncertainty formulation (PU‑DD‑DS) that interprets the problem as control under multiplicative uncertainty. By applying convex relaxations and polynomial approximations, the authors obtain a tractable convex program that solves the original stochastic density‑steering task without the need for explicit noise reconstruction.
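The certainty‑equivalence baseline that PU‑DD‑DS goes beyond amounts to estimating the system matrices by least squares and then designing as if the estimate were exact. A minimal sketch on a hypothetical toy system (matrices and seed chosen here for illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
D = 0.05 * np.eye(2)

# Collect one noisy offline trajectory under random excitation.
T = 50
X = np.zeros((2, T + 1))
U = rng.standard_normal((1, T))
for k in range(T):
    X[:, k + 1] = A @ X[:, k] + (B @ U[:, k:k + 1]).ravel() + D @ rng.standard_normal(2)

# Certainty equivalence: least-squares estimate of [A B] from the data.
Z = np.vstack([X[:, :T], U])             # stacked regressors [x_k; u_k]
AB_hat = X[:, 1:] @ np.linalg.pinv(Z)    # minimizes ||X_+ - [A B] Z||_F
A_hat, B_hat = AB_hat[:, :2], AB_hat[:, 2:]
```

The multiplicative‑uncertainty view arises because the estimation error in (A_hat, B_hat) multiplies the state and input inside the closed‑loop dynamics, rather than entering additively like the process noise.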
The framework is validated on a second‑order system under varying noise levels, data lengths, and excitation conditions. Numerical experiments compare the three data‑driven designs (DD‑MS, DD‑CS, PU‑DD‑DS) against model‑based optimal controllers (LQR/MPC). Results show that DUST achieves performance comparable to model‑based methods while exhibiting superior robustness when model identification errors are large. The paper also discusses computational aspects, showing that all sub‑problems can be solved with off‑the‑shelf solvers (e.g., SDP solvers) in polynomial time.
In summary, DUST provides a unified pipeline: (1) collect persistently exciting noisy data, (2) estimate the underlying noise and construct high‑confidence uncertainty sets, (3) formulate robust mean and covariance steering problems, and (4) solve them via convex optimization. This pipeline bridges the gap between data‑driven control and stochastic optimal control, enabling reliable distribution steering in practical settings where system models are unavailable and measurements are corrupted by noise. Future work suggested includes extensions to non‑Gaussian disturbances, nonlinear dynamics, and online adaptive implementations.
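As a final illustration of the mean‑steering half of the pipeline, once a model (exact or estimated) is in hand, the minimum‑energy input sequence that moves the mean from μ₀ to μ_f is a pseudoinverse solve against a controllability‑style matrix. A sketch on a hypothetical toy model, without the robustness machinery the paper adds:

```python
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
N = 10
mu0 = np.array([1.0, 0.0])
muf = np.array([0.0, 0.0])

# Matrix mapping the input sequence [u_0, ..., u_{N-1}] to the terminal mean.
C = np.hstack([np.linalg.matrix_power(A, N - 1 - k) @ B for k in range(N)])

# Minimum-norm inputs steering the mean exactly: mu_N = A^N mu0 + C u = muf.
u = np.linalg.pinv(C) @ (muf - np.linalg.matrix_power(A, N) @ mu0)

# Verify by rolling the mean dynamics forward under the computed inputs.
mu = mu0.copy()
for k in range(N):
    mu = A @ mu + (B * u[k]).ravel()
```

In DUST this solve is replaced by the robust data‑driven programs described above, which hedge against the noise‑induced uncertainty in the data‑based representation rather than trusting a single nominal model.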