A family of statistical symmetric divergences based on Jensen's inequality

We introduce a novel parametric family of symmetric information-theoretic distances based on Jensen’s inequality for a convex functional generator. In particular, this family unifies the celebrated Jeffreys divergence with the Jensen-Shannon divergence when the Shannon entropy generator is chosen. We then design a generic algorithm to compute the unique centroid, defined as the minimizer of the average divergence. This yields a smooth family of centroids linking the Jeffreys centroid to the Jensen-Shannon centroid. Finally, we report on our experimental results.
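
To make the centroid notion concrete, here is a minimal numerical sketch, not the paper's generic algorithm: it picks the Jensen-Shannon divergence as a representative member of the family, parameterizes a candidate centroid on the probability simplex via a softmax, and minimizes the average divergence with SciPy's general-purpose Nelder-Mead optimizer. The helper names (jsd, jsd_centroid) and the toy distributions are invented for this illustration.

```python
# Illustrative sketch only (not the paper's algorithm): centroid of a set of
# discrete distributions, defined as the minimizer of the average
# Jensen-Shannon divergence, found by a generic numerical optimizer.
import numpy as np
from scipy.optimize import minimize

def jsd(p, q):
    """Jensen-Shannon divergence between two discrete distributions (nats)."""
    m = 0.5 * (p + q)
    def h(x):  # Shannon entropy
        return -float(np.sum(np.where(x > 0, x * np.log(x), 0.0)))
    return h(m) - 0.5 * (h(p) + h(q))

def softmax(z):
    """Map unconstrained logits to the probability simplex."""
    e = np.exp(z - z.max())
    return e / e.sum()

def jsd_centroid(points):
    """Centroid as the minimizer of the average JSD to the given points."""
    points = np.asarray(points, dtype=float)
    objective = lambda z: np.mean([jsd(p, softmax(z)) for p in points])
    z0 = np.log(points.mean(axis=0))  # start from the arithmetic mean
    res = minimize(objective, z0, method="Nelder-Mead")
    return softmax(res.x)

if __name__ == "__main__":
    points = [np.array([0.7, 0.2, 0.1]),
              np.array([0.1, 0.6, 0.3]),
              np.array([0.3, 0.3, 0.4])]
    print("JSD centroid:", np.round(jsd_centroid(points), 4))
```

The same brute-force scheme would apply to any other symmetric member of the family by swapping the divergence passed to the objective; the paper's dedicated algorithm is what makes this computation efficient and yields the smooth family of centroids mentioned in the abstract.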


💡 Research Summary

The paper introduces a novel parametric family of symmetric information‑theoretic divergences derived from Jensen’s inequality applied to a convex generator function φ. The basic construction is the symmetric distance Dφ(p,q) = (φ(p)+φ(q))/2 − φ((p+q)/2), which is non‑negative by Jensen’s inequality and encompasses several known divergences: when φ is the negative Shannon entropy −H, Dφ coincides with the Jensen‑Shannon divergence (JSD). The authors then extend this construction into a family indexed by a parameter α ∈ [0,1] that, for the Shannon entropy generator, also recovers the Jeffreys divergence (JD), the symmetrized Kullback‑Leibler divergence, thereby bridging these two classic measures. A generic algorithm computes the associated centroids, defined as minimizers of the average divergence, yielding a smooth family of centroids that links the Jeffreys centroid to the Jensen‑Shannon centroid.
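
As a concrete check of the construction just described, the sketch below (an illustration, not code from the paper) evaluates Dφ on discrete distributions with the negative Shannon entropy as the convex generator, which reproduces the Jensen-Shannon divergence, and prints the Jeffreys divergence (symmetrized Kullback-Leibler) for comparison; all function names are made up for the example.

```python
# Illustration only (not from the paper): the Jensen divergence
# D_phi(p, q) = (phi(p) + phi(q))/2 - phi((p + q)/2) for a convex generator phi.
import numpy as np

def neg_entropy(p):
    """Convex generator: negative Shannon entropy, phi(p) = sum_i p_i log p_i (nats)."""
    p = np.asarray(p, dtype=float)
    return float(np.sum(np.where(p > 0, p * np.log(p), 0.0)))

def jensen_divergence(phi, p, q):
    """Symmetric Jensen divergence induced by the convex generator phi."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return 0.5 * (phi(p) + phi(q)) - phi(0.5 * (p + q))

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) for strictly positive q."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(np.where(p > 0, p * np.log(p / q), 0.0)))

p = np.array([0.6, 0.3, 0.1])
q = np.array([0.2, 0.5, 0.3])
print("D_phi with phi = -H (Jensen-Shannon):", jensen_divergence(neg_entropy, p, q))
print("Jeffreys (symmetrized KL) for comparison:", kl(p, q) + kl(q, p))
```
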

