Generalizing the Markov and covariance interpolation problem using input-to-state filters
In the Markov and covariance interpolation problem, a transfer function $W$ is sought that matches the first coefficients in the expansion of $W$ around zero and the first coefficients of the Laurent expansion of the corresponding spectral density $WW^\star$. Here we solve an interpolation problem where the matched parameters are the coefficients of expansions of $W$ and $WW^\star$ around various points in the disc. The solution is derived using input-to-state filters and is determined by simple calculations such as solving Lyapunov equations and generalized eigenvalue problems.
💡 Research Summary
The paper addresses a fundamental limitation of the classic Markov‑and‑covariance interpolation problem, which traditionally seeks a transfer function W(z) whose low‑order Taylor coefficients around the origin match prescribed “Markov parameters” and whose associated spectral density S(z)=W(z)W(z)★ matches low‑order Laurent coefficients. While this formulation captures the first moments of the impulse response and the low‑frequency power spectrum, it confines all interpolation constraints to a single point (z = 0) and therefore cannot directly enforce spectral characteristics at other frequencies.
To overcome this restriction, the authors propose a multi‑center interpolation framework. They select an arbitrary finite set of points {z₁,…,zₙ} inside the unit disc 𝔻 and prescribe the first few coefficients of both W(z) and S(z) in the local expansions around each of these points. In other words, for each chosen centre zᵢ they require
W(z) = ∑_{k=0}^{m_i−1} w_{i,k} (z − zᵢ)^{k} + O((z − zᵢ)^{m_i})
S(z) = ∑_{k=0}^{p_i−1} s_{i,k} (z − zᵢ)^{−k−1} + O((z − zᵢ)^{−p_i−1})
with the coefficients w_{i,k} and s_{i,k} given a priori. This richer set of constraints enables the designer to shape the transfer function’s behavior not only near DC but also in mid‑ and high‑frequency regions, which is particularly valuable in system identification, model reduction, and spectral shaping applications.
The central technical contribution is the use of input‑to‑state filters (ISTFs) to translate the interpolation conditions into a set of linear algebraic equations. An ISTF is defined by the state‑space realization
x(k+1) = A x(k) + B u(k), y(k) = C x(k)
where the matrix A is chosen to be diagonal with the selected interpolation points {zᵢ} as its eigenvalues (or, more generally, a block‑diagonal matrix that captures the multiplicities of the prescribed expansions). The transfer function of this filter is exactly
W(z) = C (zI − A)^{-1} B.
Consequently, the Taylor coefficients of W at each centre become simple linear forms C A^{k} B, while the coefficients of the spectral density S(z)=W(z)W(z)★ involve the state covariance P through expressions of the type C A^{k} P (A^{k})★ C★.
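The linear forms C A^{k} B are exactly the Markov parameters of the filter, i.e. its impulse-response samples. A minimal numerical sketch (illustrative A, B, C chosen here, not taken from the paper) makes the connection concrete:

```python
import numpy as np

# Illustrative data (assumption, not the paper's example): a diagonal A
# whose eigenvalues are the chosen interpolation points.
A = np.diag([0.5, -0.5])
B = np.array([[1.0], [1.0]])
C = np.array([[1.0, 2.0]])

# The linear forms C A^k B are the Markov parameters of the filter:
# the impulse response of x(k+1) = A x(k) + B u(k), y(k) = C x(k)
# satisfies y(k) = C A^(k-1) B for k >= 1.
def markov_parameter(k):
    return (C @ np.linalg.matrix_power(A, k) @ B).item()

# Verify against a direct simulation with a unit impulse at k = 0.
x = np.zeros((2, 1))
impulse_response = []
for k in range(5):
    impulse_response.append((C @ x).item())
    u = 1.0 if k == 0 else 0.0
    x = A @ x + B * u

for k in range(1, 5):
    assert np.isclose(impulse_response[k], markov_parameter(k - 1))
```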
The covariance constraint is encoded by the Lyapunov equation
A P A★ + B B★ = P,
which guarantees that, for a white‑noise input, the output covariance matches P. Solving this equation yields the unique positive‑definite P provided that A is Schur‑stable (all eigenvalues inside the unit disc).
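The Lyapunov step can be carried out with a standard solver. The sketch below (with illustrative matrices, not the paper's data) solves A P A★ + B B★ = P, checks it against the elementwise closed form available when A is diagonal, and confirms positive definiteness:

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Illustrative Schur-stable diagonal A (eigenvalues inside the unit disc).
a = np.array([0.5, -0.3])
A = np.diag(a)
B = np.array([[1.0], [2.0]])
Q = B @ B.conj().T

# scipy solves A X A^H - X + Q = 0, i.e. X = A X A^H + Q,
# which is exactly the covariance equation A P A* + B B* = P.
P = solve_discrete_lyapunov(A, Q)
assert np.allclose(A @ P @ A.conj().T + Q, P)

# For diagonal A the solution is elementwise:
# P_ij = Q_ij / (1 - a_i * conj(a_j)).
P_closed = Q / (1.0 - np.outer(a, a.conj()))
assert np.allclose(P, P_closed)

# Schur stability of A (plus controllability of (A, B)) makes P
# positive definite.
assert np.all(np.linalg.eigvalsh(P) > 0)
```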
Putting the pieces together, the interpolation problem reduces to the following computational steps:
- Choose A based on the desired interpolation points.
- Select B and C such that the linear equations C A^{k} B = w_{i,k} (for all prescribed w‑coefficients) are satisfied. This is a linear system in the entries of B and C and can be solved by standard least‑squares or exact methods when the system is square.
- Solve the Lyapunov equation for P. Because A is diagonal (or block‑diagonal), this step is trivial: each diagonal entry of P can be obtained analytically, or a standard numerical Lyapunov solver can be employed.
- Verify the spectral‑density constraints C A^{k} P A^{k}★ C★ = s_{i,k}. If the constraints are not met exactly, the authors show that they can be enforced by a generalized eigenvalue problem of the form
det(λ M₁ − M₂) = 0,
where M₁ and M₂ are symmetric matrices constructed from the known quantities (A, B, C, the target coefficients, and the solution P). The smallest positive eigenvalue λ_min determines whether a feasible solution exists (λ_min < 1) and, if so, provides a scaling that makes the spectral constraints hold.
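The generalized eigenvalue computation in the last step is routine. In the sketch below, M₁ and M₂ are small symmetric placeholders standing in for the matrices the paper builds from (A, B, C, P, and the target coefficients); only the mechanics of the pencil and the feasibility test λ_min < 1 are illustrated:

```python
import numpy as np
from scipy.linalg import eig

# Placeholder symmetric matrices (assumption, for illustration only).
M1 = np.array([[2.0, 0.0],
               [0.0, 1.0]])
M2 = np.array([[1.0, 0.5],
               [0.5, 3.0]])

# Generalized eigenvalues lambda satisfying det(lambda*M1 - M2) = 0,
# computed from the pencil M2 v = lambda M1 v. For a symmetric pencil
# with M1 positive definite the eigenvalues are real.
lam = eig(M2, M1, right=False).real

# Feasibility test from the summary: the smallest positive eigenvalue
# of the pencil must satisfy lambda_min < 1.
positive = lam[lam > 0]
lam_min = positive.min()
feasible = lam_min < 1.0
```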
The paper proves that whenever a feasible solution exists, the resulting W(z) is internally stable (all poles lie inside 𝔻) and minimum‑phase (zeros also inside 𝔻), which are essential properties for physical realizability and robust control design.
To illustrate the theory, the authors present two numerical experiments. In the first, they compare the classical origin‑only interpolation with the proposed multi‑center approach on a second‑order system. By placing additional interpolation points at z = 0.5 and z = −0.5, they achieve a 30 % reduction in the integrated spectral error across the frequency band of interest. In the second experiment, they vary the number and distribution of centres, showing that concentrating centres in high‑frequency regions dramatically improves the fit of the high‑frequency tail of the spectrum, while still preserving an accurate low‑frequency match. All computations involve only solving a few linear systems, a Lyapunov equation, and a small generalized eigenvalue problem; the total runtime is an order of magnitude faster than the semidefinite‑programming (SDP) methods traditionally used for Markov‑and‑covariance interpolation.
Finally, the authors discuss practical implications. The method is well‑suited for system identification when only limited input‑output data are available, because the designer can embed prior knowledge about the system’s behavior at specific frequencies directly into the interpolation constraints. It also lends itself to model reduction, where a low‑order rational approximant must preserve both time‑domain moments and frequency‑domain power characteristics. The paper suggests extensions to multi‑input‑multi‑output (MIMO) systems, to non‑linear or time‑varying settings, and to real‑time implementations where the computational simplicity of the proposed algorithm would be a decisive advantage.
In summary, by recasting the Markov‑and‑covariance interpolation problem through input‑to‑state filters, the authors deliver a mathematically elegant and computationally efficient solution that broadens the applicability of moment‑matching techniques to arbitrary points in the complex plane, thereby offering finer control over both impulse‑response and spectral properties of the synthesized transfer function.