Kernel Methods for the Approximation of Nonlinear Systems
We introduce a data-driven order reduction method for nonlinear control systems, drawing on recent progress in machine learning and statistical dimensionality reduction. The method rests on the assumption that the nonlinear system behaves linearly when lifted into a high (or infinite) dimensional feature space where balanced truncation may be carried out implicitly. This leads to a nonlinear reduction map which can be combined with a representation of the system belonging to a reproducing kernel Hilbert space to give a closed, reduced order dynamical system which captures the essential input-output characteristics of the original model. Empirical simulations illustrating the approach are also provided.
💡 Research Summary
The paper proposes a novel data‑driven model reduction framework for nonlinear control systems by exploiting recent advances in machine learning, particularly kernel methods and statistical dimensionality reduction. The central premise is that a nonlinear system, when lifted into a high‑ or infinite‑dimensional reproducing kernel Hilbert space (RKHS), behaves approximately linearly, allowing the implicit application of linear balanced truncation techniques.
The authors first define controllability and observability “energies” for the nonlinear system as Gramian operators in the RKHS. These Gramians are estimated empirically from simulated or measured trajectories using impulse responses or white‑noise excitation. By simultaneously diagonalizing the empirical controllability and observability Gramians, they identify directions in the feature space that are both highly controllable and observable. This simultaneous diagonalization is mathematically equivalent to kernel principal component analysis (KPCA); the resulting eigenvalues are the Hankel singular values that rank the importance of each mode.
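The Gramian-based step above can be sketched in finite dimensions. The snippet below is a simplified illustration, not the paper's method: it uses synthetic snapshot matrices as stand-ins for impulse-response and output trajectories, forms empirical Gramians, and recovers Hankel singular values by diagonalizing their product (the paper carries this out implicitly in the RKHS via kernel Gram matrices).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical snapshot data: columns are state samples collected from
# impulse-response (controllability) and output (observability) experiments.
n, N = 5, 200
Xc = rng.standard_normal((n, N))   # controllability snapshots
Xo = rng.standard_normal((n, N))   # observability snapshots

# Empirical Gramians: finite-dimensional stand-ins for the RKHS operators.
Wc = Xc @ Xc.T / N
Wo = Xo @ Xo.T / N

# Simultaneous diagonalization via the product W_c W_o: the square roots of
# its eigenvalues are the empirical Hankel singular values, which rank the
# modes by joint controllability/observability.
eigvals = np.linalg.eigvals(Wc @ Wo)
hankel_sv = np.sqrt(np.sort(eigvals.real)[::-1])
print(hankel_sv)  # decreasing; small values mark modes that can be truncated
```

Truncating the modes with the smallest Hankel singular values is what yields the reduced-order model in balanced truncation.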
A reduction map is then constructed: the state is projected onto the leading eigenvectors, yielding a low‑dimensional coordinate $z = T^{\top}\Phi(x)$, where $\Phi$ is the implicit feature map associated with the chosen kernel (Gaussian, polynomial, etc.) and $T$ contains the dominant eigenvectors. To obtain a closed reduced‑order dynamical system, the authors approximate the lifted dynamics and output map in the RKHS using kernel‑based regression (e.g., regularized least squares or Gaussian process regression). The resulting reduced model has the form
$$\dot{z} = \hat{f}(z, u), \qquad \hat{y} = \hat{h}(z),$$
where $\hat{f}$ and $\hat{h}$ are kernel‑regression estimates of the reduced dynamics and output map, fitted from trajectory data in the reduced coordinates.
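The reduction-map construction can be illustrated end to end on a toy system. This is a minimal sketch under simplifying assumptions: the dynamics below are an invented three-state map, the kernel is Gaussian, and the closed reduced model is fitted by plain least squares in the reduced coordinates (a linear stand-in for the paper's kernel regression step).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical nonlinear dynamics (a toy 3-state map, not from the paper).
def step(x):
    return np.tanh(np.array([0.9 * x[0] + 0.1 * x[1],
                             -0.2 * x[0] + 0.8 * x[1] + 0.1 * x[2],
                             0.7 * x[2] + 0.1 * x[0]]))

# Collect a trajectory of state snapshots.
X = np.empty((300, 3))
X[0] = rng.standard_normal(3)
for t in range(299):
    X[t + 1] = step(X[t])

# Gaussian kernel Gram matrix; the feature map Phi is never formed explicitly.
def gram(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

K = gram(X, X)
Kc = K - K.mean(0) - K.mean(1)[:, None] + K.mean()    # centered Gram matrix

# Kernel PCA: leading eigenvectors define the reduction map z = T^T Phi(x).
w, V = np.linalg.eigh(Kc)
idx = np.argsort(w)[::-1][:2]                          # keep 2 dominant modes
T = V[:, idx] / np.sqrt(np.maximum(w[idx], 1e-12))     # normalized coefficients
Z = Kc @ T                                             # reduced coordinates

# Closed reduced model: regularized least squares z_{t+1} ~ A z_t.
A, *_ = np.linalg.lstsq(Z[:-1], Z[1:], rcond=None)
pred = Z[:-1] @ A
err = np.linalg.norm(pred - Z[1:]) / np.linalg.norm(Z[1:])
print("relative one-step error:", err)
```

The key point the sketch mirrors is that every quantity is computed from Gram-matrix entries alone, so the same pipeline applies when the feature space is infinite-dimensional.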