Time and spectral domain relative entropy: A new approach to multivariate spectral estimation


The concept of spectral relative entropy rate is introduced for jointly stationary Gaussian processes. Using classical information-theoretic results, we establish a remarkable connection between time and spectral domain relative entropy rates. This naturally leads to a new spectral estimation technique where a multivariate version of the Itakura-Saito distance is employed. It may be viewed as an extension of the approach, called THREE, introduced by Byrnes, Georgiou and Lindquist in 2000, which, in turn, followed in the footsteps of the Burg-Jaynes Maximum Entropy Method. Spectral estimation is here recast in the form of a constrained spectrum approximation problem where the distance is equal to the processes' relative entropy rate. The corresponding solution entails a complexity upper bound which improves on the one so far available in the multichannel framework. Indeed, it is equal to the one featured by THREE in the scalar case. The solution is computed via a globally convergent matricial Newton-type algorithm. Simulations suggest the effectiveness of the new technique in tackling multivariate spectral estimation tasks, especially in the case of short data records.


💡 Research Summary

The paper introduces a novel multivariate spectral estimation technique grounded in information theory. It begins by recalling the definitions of differential entropy and relative entropy (Kullback‑Leibler divergence) for Gaussian random vectors, and then presents Kolmogorov‑Szegő’s formula that expresses the entropy rate of a stationary Gaussian process as an integral of the logarithm of its spectral density. Building on classical results by Pinsker, Van den Bos, and Stoorvogel‑Van Schuppen, the authors prove that the time‑domain relative entropy rate between two jointly Gaussian processes equals the spectral‑domain expression involving the log‑determinant and trace terms. This “time‑spectral symmetry” provides a rigorous justification for using the relative entropy rate as a distance measure between spectra.
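For reference, the Kolmogorov‑Szegő formula invoked above can be stated as follows for an m‑dimensional, full‑rank stationary Gaussian process y with spectral density Φ (the constant term below is the standard one from the classical result, not taken from this summary):

h̄(y) = (m/2) log(2πe) + (1/4π) ∫_{−π}^{π} log det Φ(e^{jθ}) dθ.

The second term is the spectral‑domain ingredient that makes the time‑spectral correspondence for relative entropy rates possible.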

In the multivariate setting, the problem is to estimate a positive‑definite spectral density Φ ∈ S_+^{m×m} from a finite data record {y_i}_{i=1}^N. A bank of rational filters with transfer function G(z) = (zI − A)^{−1}B (A stable, (A, B) reachable) processes the data, yielding a steady‑state covariance Σ of the filter state. The estimated spectrum must satisfy the linear interpolation constraint

(1/2π) ∫_{−π}^{π} G(e^{jθ}) Φ(e^{jθ}) G(e^{jθ})* dθ = Σ.
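As a rough numerical illustration of this constraint (my own sketch, not code from the paper; the matrices A, B and the flat spectrum below are example choices), one can approximate the integral by a Riemann sum over the unit circle and compare it with the state covariance predicted by the Lyapunov equation:

```python
import numpy as np

def G(z, A, B):
    """Filter-bank transfer function G(z) = (zI - A)^{-1} B."""
    n = A.shape[0]
    return np.linalg.solve(z * np.eye(n) - A, B)

def state_covariance(A, B, Phi, n_grid=2048):
    """Approximate (1/2pi) * integral of G(e^{j theta}) Phi(e^{j theta}) G(e^{j theta})*
    over [-pi, pi) by an equispaced Riemann sum."""
    Sigma = np.zeros((A.shape[0], A.shape[0]), dtype=complex)
    for k in range(n_grid):
        z = np.exp(1j * 2 * np.pi * k / n_grid)
        Gz = G(z, A, B)
        Sigma += Gz @ Phi(z) @ Gz.conj().T
    return (Sigma / n_grid).real  # the 1/2pi cancels the grid-cell width

# Example: white noise (Phi == I) through a first-order filter.
A = np.array([[0.5]])
B = np.array([[1.0]])
Sigma = state_covariance(A, B, lambda z: np.eye(1))
# For white noise, Sigma also solves the Lyapunov equation
# Sigma = A Sigma A' + B B', giving Sigma = 1/(1 - 0.25) = 4/3 here.
print(Sigma)
```

For smooth periodic integrands the equispaced sum converges very fast, so even a modest grid reproduces the Lyapunov-equation value closely.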

A prior spectral density Ψ is also incorporated to embed a priori knowledge. Existing multivariate extensions of the THREE method either rely on a Kullback‑Leibler pseudo‑distance (which yields rational solutions only when Ψ is scalar) or on a Hellinger‑type distance (which leads to solutions of higher McMillan degree).

The authors propose a new pseudo‑distance derived directly from the relative entropy rate:

d_RER(Φ, Ψ) = (1/4π) ∫_{−π}^{π} { log det(Ψ(e^{jθ}) Φ(e^{jθ})^{−1}) + tr(Ψ(e^{jθ})^{−1} Φ(e^{jθ}) − I) } dθ,

which, in the scalar case, reduces to the classical Itakura‑Saito distance.
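A grid-based sketch of this pseudo-distance may help fix ideas. The snippet below is my own illustration (not the paper's code) and assumes the standard Gaussian relative-entropy-rate integrand, log det(ΨΦ^{−1}) + tr(Ψ^{−1}Φ − I), evaluated on equispaced frequency samples:

```python
import numpy as np

def d_rer(Phi_samples, Psi_samples):
    """Approximate d_RER(Phi, Psi) = (1/4pi) * integral over [-pi, pi) of
    log det(Psi Phi^{-1}) + tr(Psi^{-1} Phi - I), given lists of m x m
    positive-definite spectral samples on an equispaced grid."""
    n_grid = len(Phi_samples)
    m = Phi_samples[0].shape[0]
    acc = 0.0
    for Phi, Psi in zip(Phi_samples, Psi_samples):
        _, logdet = np.linalg.slogdet(Psi @ np.linalg.inv(Phi))
        acc += logdet + np.trace(np.linalg.solve(Psi, Phi)).real - m
    # grid-cell width is 2*pi/n_grid; together with 1/(4*pi) this gives 1/(2*n_grid)
    return acc / (2 * n_grid)

# Example: constant scalar spectra phi == 2, psi == 1.
Phi_c = [2.0 * np.eye(1) for _ in range(512)]
Psi_c = [np.eye(1) for _ in range(512)]
print(d_rer(Phi_c, Psi_c))  # (1/2)(1 - log 2) ≈ 0.1534
```

Note that d_rer(Φ, Φ) = 0 and the scalar integrand φ/ψ − log(φ/ψ) − 1 is exactly the Itakura‑Saito form, consistent with the reduction mentioned in the abstract.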

