Balanced Reduction of Nonlinear Control Systems in Reproducing Kernel Hilbert Space
We introduce a novel data-driven order reduction method for nonlinear control systems, drawing on recent progress in machine learning and statistical dimensionality reduction. The method rests on the assumption that the nonlinear system behaves linearly when lifted into a high (or infinite) dimensional feature space where balanced truncation may be carried out implicitly. This leads to a nonlinear reduction map which can be combined with a representation of the system belonging to a reproducing kernel Hilbert space to give a closed, reduced order dynamical system which captures the essential input-output characteristics of the original model. Empirical simulations illustrating the approach are also provided.
💡 Research Summary
The paper presents a data‑driven methodology for reducing the order of nonlinear control systems by exploiting recent advances in machine learning, particularly kernel methods and statistical dimensionality reduction. The central premise is that a nonlinear system, when lifted into a high‑ or infinite‑dimensional feature space defined by a reproducing kernel Hilbert space (RKHS), exhibits approximately linear dynamics. In that lifted space, classic linear model reduction techniques—most notably balanced truncation—can be applied implicitly without ever constructing the explicit high‑dimensional state vectors.
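The linear technique being lifted here is classical balanced truncation. As a point of reference, a minimal sketch of that baseline on a toy stable LTI system (the system matrices below are illustrative, not from the paper) looks like:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

# A toy stable LTI system dx/dt = A x + B u, y = C x (illustrative values).
A = np.array([[-1.0, 0.5],
              [0.0, -2.0]])
B = np.array([[1.0],
              [1.0]])
C = np.array([[1.0, 0.0]])

# Gramians from the Lyapunov equations
#   A Wc + Wc A^T + B B^T = 0,   A^T Wo + Wo A + C^T C = 0.
Wc = solve_continuous_lyapunov(A, -B @ B.T)
Wo = solve_continuous_lyapunov(A.T, -C.T @ C)

# Square-root method: with Wc = Lc Lc^T, the eigenvalues of
# Lc^T Wo Lc are the squared Hankel singular values.
Lc = cholesky(Wc, lower=True)
_, s, _ = svd(Lc.T @ Wo @ Lc)
hankel = np.sqrt(s)
print("Hankel singular values:", hankel)
```

Truncating the states associated with small Hankel values gives the reduced model; the paper's contribution is to perform this ranking implicitly in the kernel-induced feature space rather than in the original state coordinates.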
The authors begin by formalizing the nonlinear system as ẋ = f(x) + g(x)u, y = h(x) with state x∈ℝⁿ, input u∈ℝᵐ, and output y∈ℝᵖ. A feature map φ:ℝⁿ→ℋ associated with a chosen kernel k(·,·) is introduced, allowing inner products in ℋ to be computed via kernel evaluations. The key assumption is that the lifted dynamics can be approximated by a linear relation ẋ̃ = Ãφ(x) + B̃u. Under this assumption, the input‑energy Gramian W_c and output‑energy Gramian W_o of the linearized lifted system are estimated directly from data using kernel‑based covariance matrices K_c and K_o. Solving the generalized eigenvalue problem K_c v = σ² K_o v yields a set of singular values σ_i (the “Hankel” or “balancing” values) and corresponding eigenvectors v_i that rank the importance of each mode in the feature space.
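The balancing step above reduces to a generalized symmetric eigenproblem. The sketch below shows that step in isolation, with K_c and K_o replaced by stand-in positive-definite matrices of the right shape (the paper builds them from simulated trajectories via kernel evaluations; the construction here is purely illustrative):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
N = 40  # number of state snapshots

# Stand-ins for the kernel-based Gramian estimates K_c and K_o.
# We only need two symmetric positive-definite N x N matrices here.
Fc = rng.standard_normal((N, N))
Fo = rng.standard_normal((N, N))
Kc = Fc @ Fc.T + N * np.eye(N)
Ko = Fo @ Fo.T + N * np.eye(N)

# Generalized eigenproblem  Kc v = sigma^2 Ko v; scipy's eigh
# solves a v = w b v for symmetric a and positive-definite b.
sigma2, V = eigh(Kc, Ko)
sigma = np.sqrt(sigma2[::-1])  # eigh returns ascending order
V = V[:, ::-1]                 # dominant eigenvectors first

r = 5
V_r = V[:, :r]  # coefficients spanning the rank-r balancing subspace
print("leading balancing values:", sigma[:3])
```

The decaying sequence σ₁ ≥ σ₂ ≥ … plays the role of the Hankel singular values: a sharp drop after some index r suggests a natural truncation order.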
A reduction map ψ:ℝⁿ→ℝʳ (with r≪n) is then defined by projecting the lifted state onto the subspace spanned by the r dominant eigenvectors: ψ(x) = (⟨v₁, φ(x)⟩_ℋ, …, ⟨v_r, φ(x)⟩_ℋ)ᵀ. Since each eigenvector v_i lies in the span of the lifted training samples, every coordinate of ψ reduces to a weighted sum of kernel evaluations, and combining ψ with an RKHS representation of the system dynamics yields a closed reduced-order model that retains the essential input–output behavior of the original system.
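Because the eigenvectors live in the span of the lifted snapshots, ψ can be evaluated without ever forming φ(x) explicitly. A minimal sketch, assuming a Gaussian RBF kernel and hypothetical snapshot data X and eigenvector coefficients alpha (neither taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def k(x, y, gamma=0.5):
    """Gaussian RBF kernel (an assumed choice; the method is kernel-agnostic)."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

# Hypothetical training snapshots x_1..x_N and expansion coefficients:
# each dominant eigendirection is v_i = sum_j alpha[i, j] * phi(x_j).
N, n, r = 30, 3, 4
X = rng.standard_normal((N, n))
alpha = rng.standard_normal((r, N))

def psi(x):
    """Reduction map psi: R^n -> R^r with
    psi_i(x) = <v_i, phi(x)> = sum_j alpha[i, j] * k(x_j, x)."""
    kx = np.array([k(xj, x) for xj in X])
    return alpha @ kx

z = psi(rng.standard_normal(n))
print(z.shape)  # (4,)
```

Each coordinate of ψ costs N kernel evaluations, so the map stays tractable even when the feature space ℋ is infinite-dimensional.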