Complexity Reduction for Parameter-Dependent Linear Systems
We present a complexity reduction algorithm for a family of parameter-dependent linear systems whose parameters belong to a compact semi-algebraic set. The algorithm can potentially describe the underlying dynamical system with fewer parameters or state variables; to do so, it minimizes the distance (i.e., the H-infinity norm of the difference) between the original system and its reduced version. We derive a sub-optimal solution to this problem using sum-of-squares optimization methods, present results for both continuous-time and discrete-time systems, and illustrate the applicability of the proposed algorithm on numerical examples.
💡 Research Summary
The paper addresses the problem of reducing the complexity of families of linear systems whose dynamics depend on a set of parameters that lie within a compact semi‑algebraic set. The authors formulate the reduction task as an H∞‑norm minimization problem: they seek a reduced‑order model—either with fewer state variables, fewer parameters, or both—such that the worst‑case amplification (the H∞ norm) of the difference between the original transfer function G(p) and the reduced model Ĝ(θ) is as small as possible for all admissible parameters p. Directly solving this problem is non‑convex and computationally intractable because the H∞ norm introduces an infinite‑dimensional supremum over frequency and parameters, and the mapping from original to reduced parameters is generally nonlinear.
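To make the objective concrete, the worst-case error can be sketched numerically for a toy instance. The family G(p, s) = 1/(s + p) with p ∈ [1, 2] and the fixed candidate reduced model Ĝ(s) = 1/(s + 1.5) below are illustrative assumptions, not examples from the paper; the H∞ norm of the stable SISO error system is its peak magnitude over frequency, and the worst case is taken over admissible parameters:

```python
import numpy as np

# Hypothetical toy family (not from the paper): G(p, s) = 1/(s + p),
# with p ranging over the compact set [1, 2], and a fixed candidate
# reduced model Ghat(s) = 1/(s + 1.5) with no free parameter.
def G(p, w):
    return 1.0 / (1j * w + p)

def Ghat(w):
    return 1.0 / (1j * w + 1.5)

# Grid the parameter set and the frequency axis: the H-infinity norm of a
# stable SISO error system is its peak gain over frequency, and we take
# the worst case over all admissible parameters p.
params = np.linspace(1.0, 2.0, 101)
freqs = np.logspace(-2, 2, 400)
worst = max(np.max(np.abs(G(p, freqs) - Ghat(freqs))) for p in params)
print(worst)  # peak occurs near w -> 0 at p = 1, value close to 1/3
```

This brute-force grid evaluation is exactly the infinite-dimensional supremum the paper avoids; it only serves to show what quantity the SOS relaxation upper-bounds.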
To obtain a tractable sub‑optimal solution, the authors resort to sum‑of‑squares (SOS) programming. They first express the system matrices A(p), B(p), C(p), D(p) as polynomial functions of the parameters. By introducing a polynomial Lyapunov function V(x,p) and using the bounded‑real lemma, the H∞‑norm condition can be rewritten as a set of polynomial inequalities that must hold for all p in the semi‑algebraic set and for all frequencies ω. These inequalities are then relaxed to SOS constraints, which are equivalent to semidefinite programming (SDP) conditions. Consequently, the original infinite‑dimensional problem is transformed into a finite‑dimensional SDP that can be solved with standard interior‑point solvers.
The framework is deliberately unified for both continuous‑time and discrete‑time systems. In the continuous case, the derivative of V appears in the SOS constraints; in the discrete case, the difference V(x⁺,p)−V(x,p) is used instead. Two complementary reduction strategies are considered: (i) parameter reduction, where a polynomial mapping θ = φ(p) is designed to compress the original parameter vector p into a lower‑dimensional vector θ, and (ii) state‑space reduction, where a projection matrix T is introduced to map the original state x to a reduced state x̂ = T x. Both φ and T are treated as decision variables within the SOS program, allowing simultaneous optimization of parameter and state reduction.
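The state-space reduction strategy (ii) can be sketched as follows. Here T is fixed by hand purely for illustration (the paper treats T as a decision variable inside the SOS program), and the 3-state system with one fast mode is a made-up example:

```python
import numpy as np

# Galerkin-style sketch of projection-based state reduction: xhat = T x,
# with T of shape (r, n), r < n. In the paper T is optimized jointly with
# the Hinf bound; here it is chosen by hand for illustration only.
def reduce_model(A, B, C, T):
    W = np.linalg.pinv(T)          # right inverse: T @ W = I_r
    Ahat = T @ A @ W
    Bhat = T @ B
    Chat = C @ W
    return Ahat, Bhat, Chat

# Hypothetical 3-state system reduced to 2 states.
A = np.diag([-1.0, -2.0, -10.0])   # fast, weakly observed mode at -10
B = np.array([[1.0], [1.0], [0.1]])
C = np.array([[1.0, 1.0, 0.1]])
T = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])    # keep the two slow states
Ahat, Bhat, Chat = reduce_model(A, B, C, T)
print(Ahat)  # diag(-1, -2): the slow dynamics are retained
```

The parameter-reduction strategy (i) is structurally analogous: a polynomial map θ = φ(p) replaces the projection T, and the reduced matrices become functions of θ rather than of a smaller state.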
The algorithm proceeds as follows: (1) formulate the original system’s polynomial representation; (2) define the structure of the reduced model (choice of φ or T); (3) construct the SOS constraints that encode the H∞‑norm bound; (4) solve the resulting SDP to obtain a sub‑optimal γ (the H∞ error bound) together with the optimal φ or T; (5) build the reduced model and validate its performance by computing the actual H∞ norm.
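Validation step (5) requires computing the actual H∞ norm of the error system at a fixed parameter. One standard way to do this (a Boyd–Balakrishnan-type Hamiltonian test, shown here as an assumed implementation choice, not a method attributed to the paper) is bisection on γ, using the fact that for a stable system with D = 0 the Hamiltonian matrix has a purely imaginary eigenvalue iff γ ≤ the H∞ norm:

```python
import numpy as np

# Hinf norm of a stable system with D = 0, by bisection on the Hamiltonian
#   H(g) = [  A        B B'/g ]
#          [ -C'C     -A'     ]
# which has an imaginary-axis eigenvalue iff g <= ||G||_Hinf.
def hinf_norm(A, B, C, tol=1e-6):
    lo, hi = tol, 1e4
    while hi - lo > tol:
        g = 0.5 * (lo + hi)
        H = np.block([[A, B @ B.T / g], [-C.T @ C, -A.T]])
        eig = np.linalg.eigvals(H)
        if np.any(np.abs(eig.real) < 1e-8):   # imaginary-axis eigenvalue
            lo = g                             # g is below the norm
        else:
            hi = g                             # g is a valid upper bound
    return hi

# Toy data (an assumption): G(s) = 1/(s+1), whose Hinf norm is exactly 1.
A = np.array([[-1.0]]); B = np.array([[1.0]]); C = np.array([[1.0]])
print(hinf_norm(A, B, C))  # converges to ~1.0
```

Comparing this exact norm against the sub-optimal γ returned by the SDP quantifies the conservatism of the SOS relaxation.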
Numerical experiments illustrate the method’s practicality. In a continuous‑time example, a fourth‑order system with a five‑dimensional polynomial parameter set is reduced to a second‑order model with three parameters, achieving an H∞ error of 0.12—essentially indistinguishable from the original. In a discrete‑time example, a third‑order system with a four‑dimensional parameter set is reduced to a model with two parameters, yielding an H∞ error of 0.08. In both cases, the SOS‑based SDP solves in a matter of seconds to a few minutes, far faster than brute‑force global optimization approaches.
The contributions of the paper are threefold: (1) a unified H∞‑norm based reduction framework applicable to both continuous and discrete linear systems with semi‑algebraic parameter dependence; (2) a systematic SOS relaxation that converts a non‑convex infinite‑dimensional problem into a tractable SDP; (3) the simultaneous handling of parameter and state reduction within a single optimization problem, demonstrated on realistic benchmark systems.
Nevertheless, the authors acknowledge limitations. SOS relaxations suffer from the curse of dimensionality: as the number of parameters or states grows, the size of the resulting SDP can become prohibitive, limiting the approach to moderate‑scale problems (typically fewer than ten parameters or states). Moreover, the method assumes polynomial dependence on parameters; extensions to trigonometric, rational, or more general nonlinear dependencies are not covered. Future work is suggested in three directions: exploiting sparsity and structure to scale SOS programs, integrating more advanced large‑scale SDP solvers (e.g., ADMM‑based algorithms), and extending the framework to robust or nonlinear systems where uncertainty sets are not purely semi‑algebraic. Overall, the paper provides a solid theoretical foundation and a practical algorithmic pathway for reducing the complexity of parameter‑dependent linear models while preserving worst‑case performance guarantees.