Lower Bounds on Mutual Information

We correct claims about lower bounds on mutual information (MI) between real-valued random variables made in A. Kraskov et al., Phys. Rev. E 69, 066138 (2004). We show that non-trivial lower bounds on MI in terms of linear correlations depend on the marginal (single variable) distributions. This is so in spite of the invariance of MI under reparametrizations, because linear correlations are not invariant under them. The simplest bounds are obtained for Gaussians, but the most interesting ones for practical purposes are obtained for uniform marginal distributions. The latter can be enforced in general by using the ranks of the individual variables instead of their actual values, in which case one obtains bounds on MI in terms of Spearman correlation coefficients. We show with gene expression data that these bounds are in general non-trivial, and the degree of their (non-)saturation yields valuable insight.
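The abstract's key mechanism, that MI is invariant under monotone reparametrizations while Pearson correlations are not, and that rank transforms restore this invariance, can be illustrated with a minimal sketch. This is not code from the paper: the sample size, the correlation value, the exponential warp, and the use of numpy/scipy are all our own assumptions; the quantity −½ log(1 − ρ²) is the MI of a bivariate Gaussian with correlation ρ and, per the abstract, lower-bounds MI only for suitable marginal distributions.

```python
# Sketch (not from the paper): why Pearson-based MI bounds depend on the
# marginals, while rank (Spearman) correlations do not.
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(0)
rho, n = 0.8, 100_000  # arbitrary choices for illustration

# Bivariate Gaussian sample with Pearson correlation rho.
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

def gauss_mi_bound(r):
    """MI (in nats) of a bivariate Gaussian with Pearson correlation r;
    it lower-bounds MI only under conditions on the marginals."""
    return -0.5 * np.log(1.0 - r**2)

# A monotone reparametrization of one variable leaves MI unchanged ...
y_warp = np.exp(y)

# ... but changes the Pearson correlation, and with it the naive bound:
r_raw = pearsonr(x, y)[0]
r_warp = pearsonr(x, y_warp)[0]
print(f"Pearson rho raw:    {r_raw:+.3f} -> bound {gauss_mi_bound(r_raw):.3f} nats")
print(f"Pearson rho warped: {r_warp:+.3f} -> bound {gauss_mi_bound(r_warp):.3f} nats")

# Spearman's coefficient is a Pearson correlation of the ranks, i.e. of
# variables forced to have uniform marginals, so the warp leaves it intact:
print(f"Spearman rho raw:    {spearmanr(x, y)[0]:+.3f}")
print(f"Spearman rho warped: {spearmanr(x, y_warp)[0]:+.3f}")
```

Running this, the warped Pearson coefficient (and with it the naive bound) changes while the Spearman coefficient does not, which is why the rank-based bounds are the ones the abstract singles out as practically useful.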


💡 Research Summary

The paper revisits the problem of establishing lower bounds on the mutual information (MI) between two continuous random variables, a topic originally addressed by Kraskov et al. (Phys. Rev. E 69, 066138, 2004). Kraskov and colleagues claimed that a simple expression involving the Pearson correlation coefficient ρ, namely

I(X;Y) ≥ −½ log(1 − ρ²),

holds as a general lower bound on the MI. The present paper corrects this claim: lower bounds on MI in terms of linear correlations do exist, but they necessarily depend on the marginal distributions and are not universal.
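For a sense of scale, here is a worked instance of this Gaussian-marginal formula (our own numerical example, not one taken from the paper):

```latex
% Worked example (ours): the Gaussian bound at rho = 0.9, natural log, in nats.
I(X;Y) \;\ge\; -\tfrac{1}{2}\log\!\left(1-\rho^{2}\right)
       = -\tfrac{1}{2}\log(1 - 0.81)
       = -\tfrac{1}{2}\log(0.19)
       \approx 0.83\ \text{nats}.
```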

