On Rényi and Tsallis entropies and divergences for exponential families


Many common probability distributions in statistics, such as the Gaussian, multinomial, Beta, or Gamma distributions, can be studied under the unified framework of exponential families. In this paper, we prove that both Rényi and Tsallis divergences of distributions belonging to the same exponential family admit a generic closed-form expression. Furthermore, we show that Rényi and Tsallis entropies can also be calculated in closed form for sub-families including the Gaussian or exponential distributions, among others.
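As a concrete instance of the closed-form entropy claim, consider an exponential distribution \(\mathrm{Exp}(\lambda)\) with density \(\lambda e^{-\lambda x}\). Since \(\int p^{\alpha}\,dx = \lambda^{\alpha-1}/\alpha\), the Rényi entropy works out to \(H_{\alpha} = \log(1/\lambda) - \log(\alpha)/(1-\alpha)\); this is a short derivation for illustration, not a formula quoted from the paper. The sketch below checks the closed form against direct numerical integration:

```python
import math

def renyi_entropy_exponential(lam, alpha):
    """Closed-form Renyi entropy of Exp(lam), derived from
    integral p^alpha dx = lam^(alpha-1) / alpha  (alpha > 0, alpha != 1)."""
    return math.log(1.0 / lam) - math.log(alpha) / (1.0 - alpha)

def renyi_entropy_numeric(lam, alpha, upper=60.0, n=200_000):
    """Midpoint-rule approximation of
    H_alpha = log( integral p(x)^alpha dx ) / (1 - alpha)."""
    dx = upper / n
    acc = 0.0
    for i in range(n):
        x = (i + 0.5) * dx
        acc += (lam * math.exp(-lam * x)) ** alpha * dx
    return math.log(acc) / (1.0 - alpha)
```

For \(\lambda = 2\) and \(\alpha = 1/2\) the closed form gives exactly \(\log 2\), and the quadrature agrees to several decimal places; letting \(\alpha \to 1\) recovers the Shannon entropy \(1 - \log\lambda\).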


💡 Research Summary

The paper investigates Rényi and Tsallis information measures, both entropies and divergences, within the broad class of exponential families. After recalling that an exponential family can be written as
\[
p_{\theta}(x)=\exp\bigl(\langle t(x),\theta\rangle-F(\theta)+k(x)\bigr),
\]
where \(t(x)\) is the sufficient statistic, \(\theta\) the natural parameter, \(F(\theta)\) the log-normalizer (cumulant generating function), and \(k(x)\) the base measure term, the authors exploit the convexity of \(F\) to obtain closed-form expressions for the Rényi divergence \(D_{\alpha}^{R}\) and the Tsallis divergence \(D_{\alpha}^{T}\) between any two members \(p_{\theta_{1}}\) and \(p_{\theta_{2}}\) of the same family. The key result is

\[
D_{\alpha}^{R}(p_{\theta_{1}}:p_{\theta_{2}})=\frac{1}{1-\alpha}\,J_{F,\alpha}(\theta_{1}:\theta_{2}),
\qquad
D_{\alpha}^{T}(p_{\theta_{1}}:p_{\theta_{2}})=\frac{1}{1-\alpha}\Bigl(1-e^{-J_{F,\alpha}(\theta_{1}:\theta_{2})}\Bigr),
\]
where \(J_{F,\alpha}(\theta_{1}:\theta_{2})=\alpha F(\theta_{1})+(1-\alpha)F(\theta_{2})-F(\alpha\theta_{1}+(1-\alpha)\theta_{2})\) is the skew Jensen divergence induced by the log-normalizer \(F\), valid for \(\alpha\in(0,1)\) and, more generally, whenever \(\alpha\theta_{1}+(1-\alpha)\theta_{2}\) stays in the natural parameter space.
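In practice the key result reduces a divergence computation to three evaluations of the log-normalizer \(F\). Here is a minimal sketch for univariate Gaussians, using a standard natural parametrization (assumed here for illustration, not quoted from the paper):

```python
import math

# Univariate Gaussian N(mu, sigma^2) as an exponential family
# (standard parametrization, assumed for this sketch):
#   theta = (mu / sigma^2, -1 / (2 sigma^2)),  t(x) = (x, x^2)
#   F(theta) = -theta1^2 / (4 theta2) + 0.5 * log(-pi / theta2)

def natural_params(mu, sigma2):
    return (mu / sigma2, -1.0 / (2.0 * sigma2))

def log_normalizer(theta):
    t1, t2 = theta
    return -t1 * t1 / (4.0 * t2) + 0.5 * math.log(-math.pi / t2)

def skew_jensen(theta1, theta2, alpha):
    """J_{F,alpha}(theta1 : theta2); for 0 < alpha < 1 the mixed
    parameter stays in the natural parameter space by convexity."""
    mix = tuple(alpha * a + (1.0 - alpha) * b
                for a, b in zip(theta1, theta2))
    return (alpha * log_normalizer(theta1)
            + (1.0 - alpha) * log_normalizer(theta2)
            - log_normalizer(mix))

def renyi_divergence(theta1, theta2, alpha):
    return skew_jensen(theta1, theta2, alpha) / (1.0 - alpha)

def tsallis_divergence(theta1, theta2, alpha):
    return (1.0 - math.exp(-skew_jensen(theta1, theta2, alpha))) / (1.0 - alpha)
```

As a sanity check, for two Gaussians with equal variance \(\sigma^{2}\) the sketch reproduces the known identity \(D_{\alpha}^{R} = \alpha(\mu_{1}-\mu_{2})^{2}/(2\sigma^{2})\), and as \(\alpha \to 1\) both divergences tend to the Kullback-Leibler divergence.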

