Dispersion of Gaussian Sources with Memory and an Extension to Abstract Sources
We consider finite blocklength lossy compression of information sources whose components are independent but non-identically distributed. Crucially, Gaussian sources with memory and quadratic distortion can be cast in this form. We show that under the operational constraint of exceeding distortion $d$ with probability at most $\epsilon$, the minimum achievable rate at blocklength $n$ satisfies $R(n, d, \epsilon)=\mathbb{R}_n(d)+\sqrt{\frac{\mathbb{V}_n(d)}{n}}Q^{-1}(\epsilon)+O \left(\frac{\log n}{n}\right)$, where $Q^{-1}(\cdot)$ is the inverse $Q$-function, while $\mathbb{R}_n(d)$ and $\mathbb{V}_n(d)$ are fundamental characteristics of the source computed using its $n$-letter joint distribution and the distortion measure, called the $n$th-order informational rate-distortion function and the source dispersion, respectively. Our result generalizes the existing dispersion result for abstract sources with i.i.d. components. It also sharpens and extends the only known dispersion result for a source with memory, namely, the scalar Gauss-Markov source. The key novel technical tool in our analysis is the point-mass product proxy measure, which enables the construction of typical sets. This proxy generalizes the empirical distribution beyond the i.i.d. setting by preserving additivity across coordinates and facilitating a typicality analysis for sums of independent, non-identical terms.
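As a quick numerical illustration of the second-order formula (not part of the paper's results), consider the special case of an i.i.d. Gaussian source with mean-square distortion, for which $R(d)=\tfrac{1}{2}\ln(\sigma^2/d)$ and the dispersion is $V(d)=1/2$ nats$^2$ by the known memoryless Gaussian result. The helper name below is ours, and the $O(\log n / n)$ term is omitted:

```python
import math
from statistics import NormalDist


def rate_gaussian_iid(n, d, eps, sigma2=1.0):
    """Second-order (dispersion) approximation to the minimum rate, in nats
    per sample, for an i.i.d. Gaussian source under MSE distortion.
    Uses R(d) = 0.5*ln(sigma2/d) and V(d) = 1/2 nats^2; the O(log n / n)
    remainder term is dropped."""
    R = 0.5 * math.log(sigma2 / d)       # rate-distortion function
    V = 0.5                              # dispersion, in nats^2
    q_inv = NormalDist().inv_cdf(1 - eps)  # Q^{-1}(eps)
    return R + math.sqrt(V / n) * q_inv


# Example: blocklength 1000, distortion 0.25, excess-distortion prob. 0.01.
print(rate_gaussian_iid(1000, 0.25, 0.01))
```

The backoff from the asymptotic rate $R(d)$ shrinks like $1/\sqrt{n}$, so halving the gap requires roughly quadrupling the blocklength.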
💡 Research Summary
The paper studies finite‑blocklength lossy compression for sources whose components are independent but not identically distributed (i.n.i.d.). This model captures Gaussian sources with memory, because any Gaussian vector can be orthogonally transformed into independent scalar components while preserving Euclidean distortion.
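The decorrelating transform mentioned above can be checked numerically. The sketch below (our illustration, assuming a Gauss-Markov covariance with correlation 0.8) eigendecomposes the covariance, rotates samples into the eigenbasis, and verifies that the resulting components are uncorrelated Gaussians while Euclidean distortion is unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)

# Covariance of a Gaussian source with memory (Gauss-Markov, rho = 0.8).
n, rho = 6, 0.8
Sigma = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

# Orthogonal eigendecomposition: Sigma = U diag(lam) U^T.
lam, U = np.linalg.eigh(Sigma)

# Rotate samples: y = U^T x has independent components with variances lam.
x = rng.multivariate_normal(np.zeros(n), Sigma, size=100_000)
y = x @ U

# Empirical covariance of y is (approximately) diagonal with entries lam ...
emp = np.cov(y, rowvar=False)
print("diagonalized:", np.allclose(emp, np.diag(lam), atol=0.1))

# ... and the orthogonal map preserves Euclidean (squared-error) distortion.
print("norms preserved:",
      np.allclose(np.linalg.norm(x, axis=1), np.linalg.norm(y, axis=1)))
```

Because jointly Gaussian and uncorrelated implies independent, the rotated coordinates are i.n.i.d. scalar Gaussians, which is exactly the source model the paper analyzes.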
The authors define the $n$-th-order informational rate-distortion function (RDF)
$$\mathbb{R}_n(d) = \frac{1}{n}\, \inf_{P_{Y^n \mid X^n}\colon\, \mathbb{E}[\mathsf{d}(X^n, Y^n)] \le d} I(X^n; Y^n),$$
the normalized mutual information minimized over all conditional distributions satisfying the average-distortion constraint.