On a Connection between Entropy, Extensive Measurement and Memoryless Characterization
We define an entropy based on a chosen governing probability distribution. If a certain kind of measurement follows such a distribution, the distribution also gives us a suitable scale on which to study it. This scale appears as a link function applied to the measurements. A link function can also be used to define an alternative algebraic structure on a set. We will see that generalized entropies are equivalent to using a different scale for the phenomenon under study than the scale on which the measurements arrive. An extensive measurement scale is here a scale on which measurements fulfill a memoryless property. We conclude that the alternative algebraic structure defined by the link function must be used if we continue to work on the original scale. We derive Tsallis entropy by using a generalized log-logistic governing distribution. Typical applications of Tsallis entropy concern phenomena with power-law behaviour.
💡 Research Summary
The paper proposes a unified framework that links entropy, measurement scales, and the memory‑less property of statistical distributions. It begins by introducing the notion of a “governing probability distribution,” a prior assumption about the statistical law that a set of observations follows. From this distribution the cumulative distribution function (CDF) is taken, and its inverse is defined as a link function g(x). The link function maps the original measurement x to a transformed variable y = g(x). This transformation is constructed so that the transformed measurements satisfy a memory‑less property: the sum (or composition) of two independent observations retains the same distributional form. The authors call a scale that endows measurements with this property an “extensive measurement scale.”
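The transformation described above can be sketched numerically. In the sketch below the governing distribution is taken to be a standard log-logistic law with unit scale and shape (an illustrative choice, not necessarily the paper's parametrization), and the link g(x) = −log(1 − F(x)) is the usual probability-integral construction that sends X ~ F to an Exp(1) variable, the canonical memoryless distribution. The function names `F` and `g` are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative governing distribution: standard log-logistic, CDF F(x) = x / (1 + x).
def F(x):
    return x / (1.0 + x)

# Assumed link: g(x) = -log(1 - F(x)) maps X ~ F onto Exp(1), the memoryless law.
def g(x):
    return -np.log(1.0 - F(x))

# Sample X from the governing law by inverse transform: X = F^{-1}(U) = U / (1 - U).
u = rng.uniform(size=200_000)
x = u / (1.0 - u)
y = g(x)

# Empirical check of memorylessness on the transformed scale:
# P(Y > s + t | Y > s) should approximately equal P(Y > t).
s, t = 0.7, 1.2
lhs = np.mean(y[y > s] > s + t)
rhs = np.mean(y > t)
print(abs(lhs - rhs))  # small, up to sampling noise
```

On the original x-scale the log-logistic samples are not memoryless; the property only emerges after the link-function transformation, which is what the summary means by an "extensive measurement scale."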
Next, the paper shows that a generalized entropy defined with respect to the governing distribution is mathematically equivalent to applying the ordinary Shannon entropy after the link‑function transformation. Specifically, if p(x) is the probability density of the data and f(x) is the governing density, the entropy
H_f = −∫ p(x) log_f p(x) dx
uses a logarithm that is the inverse of the CDF of f. By substituting y = g(x) the expression becomes the standard Shannon form in the y‑space. Consequently, “using a different entropy” is tantamount to “working on a different scale.” The new scale carries its own algebraic structure: addition, multiplication, and other operations are re‑defined through the link function, leading to non‑linear composition rules that preserve the memory‑less character.
The authors then specialize to a generalized log‑logistic distribution; in the standard log‑logistic case the CDF has the form
F(x) = 1 / (1 + (x/α)^(−β)),
whose survival function decays as the power law (x/α)^(−β) for large x, consistent with the power‑law applications mentioned in the abstract.