Estimating the Shannon Entropy Using the Pitman–Yor Process
The Shannon entropy is a fundamental measure for quantifying diversity and model complexity in fields such as information theory, ecology, and genetics. However, many existing studies assume that the number of species is known, an assumption that is often unrealistic in practice. In recent years, efforts have been made to relax this restriction. Motivated by these developments, this study proposes an entropy estimation method based on the Pitman–Yor process, a representative approach in Bayesian nonparametrics. By approximating the true distribution as an infinite-dimensional process, the proposed method enables stable estimation even when the number of observed species is smaller than the true number of species. This approach provides a principled way to deal with the uncertainty in species diversity and enhances the reliability and robustness of entropy-based diversity assessment. In addition, we investigate the convergence property of the Shannon entropy for regularly varying distributions and use this result to establish the consistency of the proposed estimator. Finally, we demonstrate the effectiveness of the proposed method through numerical experiments.
💡 Research Summary
The paper tackles the long‑standing problem of estimating the Shannon entropy H(p)=−∑p_i log p_i when both the true number of categories K and the underlying probability vector p are unknown, and especially when the sample size N is smaller than K. Classical estimators such as the plug‑in maximum‑likelihood, Miller–Madow, and Chao–Shen assume either that K is known or that N≫K, and they perform poorly in the “small‑sample, large‑alphabet” regime because they cannot account for unseen species.
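To make the classical baselines concrete, here is a minimal sketch (not the paper's method) of the three estimators named above: the plug-in maximum-likelihood estimate, the Miller–Madow bias correction, and the coverage-adjusted Chao–Shen estimator. The small-sample fallback in the Chao–Shen coverage term is an assumption of this sketch, not part of the original formulation.

```python
import math
from collections import Counter

def plugin_entropy(sample):
    """Plug-in (maximum-likelihood) estimate of Shannon entropy in nats."""
    n = len(sample)
    counts = Counter(sample)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def miller_madow_entropy(sample):
    """Plug-in estimate plus the first-order Miller-Madow bias
    correction (K_obs - 1) / (2N), with K_obs the observed species count."""
    n = len(sample)
    k_obs = len(set(sample))
    return plugin_entropy(sample) + (k_obs - 1) / (2 * n)

def chao_shen_entropy(sample):
    """Chao-Shen estimator: rescale empirical frequencies by the
    Good-Turing coverage C = 1 - f1/n (f1 = number of singletons) and
    reweight each term by its Horvitz-Thompson inclusion probability."""
    n = len(sample)
    counts = Counter(sample)
    f1 = sum(1 for c in counts.values() if c == 1)  # singleton species
    coverage = 1.0 - f1 / n
    if coverage == 0.0:
        # All species observed exactly once; fallback chosen here
        # to avoid log(0) (an assumption of this sketch).
        coverage = 1.0 - (f1 - 1) / n
    h = 0.0
    for c in counts.values():
        p = coverage * c / n
        h -= p * math.log(p) / (1.0 - (1.0 - p) ** n)
    return h
```

On a sample where every species has been seen many times the three estimates nearly coincide; the gap between them widens exactly in the small-sample, large-alphabet regime the paper targets, where the plug-in estimate is badly biased downward.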
To overcome this limitation, the authors adopt a Bayesian non‑parametric framework based on the Pitman–Yor process (PYP). The PYP is parameterized by a discount parameter d ∈ [0, 1) and a concentration parameter θ > −d, which together govern the power‑law tail behavior of the induced random distribution and thereby the probability mass assigned to as‑yet‑unseen species.
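The predictive (Chinese restaurant) representation makes the role of the two PYP parameters concrete: the next observation joins an existing cluster j with probability proportional to (n_j − d), or starts a new cluster with probability proportional to (θ + dK), so the number of distinct species grows without bound, roughly like n^d for d > 0. A minimal sampler under this standard representation (this is an illustration of the PYP itself, not the paper's entropy estimator):

```python
import random

def pyp_crp_sample(n, d, theta, rng=None):
    """Sample a partition of n items from the two-parameter (Pitman-Yor)
    Chinese restaurant process with discount d in [0, 1) and
    concentration theta > -d. Returns the list of cluster sizes."""
    rng = rng or random.Random(0)
    tables = []  # tables[j] = number of items in cluster j
    for i in range(n):  # item i+1 arrives; i items already seated
        # Total unnormalized mass is theta + i:
        #   existing cluster j has mass (n_j - d), summing to i - d*K;
        #   a new cluster has mass theta + d*K.
        u = rng.random() * (theta + i)
        acc = 0.0
        seated = False
        for j, nj in enumerate(tables):
            acc += nj - d
            if u < acc:
                tables[j] += 1
                seated = True
                break
        if not seated:
            tables.append(1)  # open a new cluster (unseen species)
    return tables
```

Running the sampler with a larger discount d produces many more distinct clusters for the same n, which is precisely the mechanism that lets the PYP place prior mass on species not yet observed when N is smaller than the true K.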