The Rényi entropy is a spectrum of generalizations of the Shannon entropy:

\[
H_\alpha(X) = \frac{1}{1-\alpha} \log_2 \left( \sum_{x : p(x) > 0} p(x)^\alpha \right)
\]
In : import numpy as np

In : import dit

In : from dit.other import renyi_entropy

In : from dit.example_dists import binomial

In : d1 = dit.Distribution(['0', '1', '2', '3'], [1/4]*4)

In : d2 = binomial(15, 0.4)
For several special values of \(\alpha\), the Rényi entropy reduces to other well-known information measures.
\(\alpha = 0\)
When \(\alpha = 0\), the Rényi entropy becomes what is known as the Hartley entropy, the logarithm of the size of the support:

\[
H_0(X) = \log_2 |\{ x : p(x) > 0 \}|
\]
In : renyi_entropy(d1, 0)
Out: 2.0

In : renyi_entropy(d2, 0)
Out: 4.0
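The Hartley value can be checked without dit at all. A minimal from-scratch sketch, assuming the second output above comes from a binomial distribution with n = 15 and p = 0.4 (an assumption inferred from the outputs, giving 16 outcomes that all have nonzero probability):

```python
import math

# Assumed example distribution: Binomial(n=15, p=0.4), built by hand.
n, p = 15, 0.4
pmf = [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

# Hartley entropy: log2 of the support size (outcomes with nonzero probability).
support_size = sum(1 for q in pmf if q > 0)
H0 = math.log2(support_size)
print(H0)  # 4.0, since all 16 outcomes have nonzero probability
```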
\(\alpha = 1\)
When \(\alpha = 1\) (taken as a limit), the Rényi entropy becomes the standard Shannon entropy:

\[
H_1(X) = H(X) = -\sum_{x} p(x) \log_2 p(x)
\]
In : renyi_entropy(d1, 1)
Out: 2.0

In : renyi_entropy(d2, 1)
Out: 2.9688513169509623
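Again as a cross-check outside dit, the Shannon value follows directly from the definition; the binomial(15, 0.4) distribution here is an assumption inferred from the outputs:

```python
import math

# Assumed example distribution: Binomial(n=15, p=0.4).
n, p = 15, 0.4
pmf = [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

# Shannon entropy: H(X) = -sum_x p(x) log2 p(x), skipping zero-probability terms.
H1 = -sum(q * math.log2(q) for q in pmf if q > 0)
print(H1)  # ≈ 2.9689 bits
```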
\(\alpha = 2\)
When \(\alpha = 2\), the Rényi entropy becomes what is known as the collision entropy:

\[
H_2(X) = -\log_2 p(X = Y) = -\log_2 \sum_{x} p(x)^2
\]

where \(Y\) is an i.i.d. copy of \(X\). This is the surprisal of two independent draws from the distribution agreeing, i.e. of "rolling doubles".
In : renyi_entropy(d1, 2)
Out: 2.0

In : renyi_entropy(d2, 2)
Out: 2.7607270851693615
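The "rolling doubles" reading can be made concrete: sum the squared probabilities to get the chance that two i.i.d. draws collide, then take the surprisal. As before, the binomial(15, 0.4) distribution is an assumption inferred from the outputs:

```python
import math

# Assumed example distribution: Binomial(n=15, p=0.4).
n, p = 15, 0.4
pmf = [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

# Collision entropy: surprisal of two i.i.d. draws landing on the same outcome.
p_doubles = sum(q * q for q in pmf)
H2 = -math.log2(p_doubles)
print(H2)  # ≈ 2.7607 bits
```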
\(\alpha = \infty\)
Finally, when \(\alpha = \infty\), the Rényi entropy (also known as the min-entropy) depends only on the probability of the most probable event:

\[
H_\infty(X) = -\log_2 \max_x p(x)
\]
In : renyi_entropy(d1, np.inf)
Out: 2.0

In : renyi_entropy(d2, np.inf)
Out: 2.275104563096674
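For the min-entropy, only the single most probable outcome matters; a from-scratch check, again assuming the binomial(15, 0.4) example distribution:

```python
import math

# Assumed example distribution: Binomial(n=15, p=0.4).
n, p = 15, 0.4
pmf = [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

# Min-entropy: determined entirely by the single most probable outcome.
H_inf = -math.log2(max(pmf))
print(H_inf)  # ≈ 2.2751 bits
```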
In general, the Rényi entropy is a monotonically decreasing function in \(\alpha\):

\[
H_\alpha(X) \geq H_\beta(X) \quad \text{for} \quad \alpha < \beta
\]
Further, the following inequality holds in the other direction:

\[
H_2(X) \leq 2 H_\infty(X)
\]
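Both inequalities can be illustrated with a small from-scratch sketch; the `renyi` helper and the binomial(15, 0.4) example distribution are assumptions here, not dit's API:

```python
import math

def renyi(pmf, alpha):
    """Rényi entropy of order alpha, in bits (hypothetical helper, not dit's API)."""
    pmf = [q for q in pmf if q > 0]
    if alpha == 1:                      # Shannon entropy: the alpha -> 1 limit
        return -sum(q * math.log2(q) for q in pmf)
    if math.isinf(alpha):               # min-entropy: the alpha -> infinity limit
        return -math.log2(max(pmf))
    return math.log2(sum(q**alpha for q in pmf)) / (1 - alpha)

# Assumed example distribution: Binomial(n=15, p=0.4).
n, p = 15, 0.4
pmf = [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

# Monotonicity: H_alpha never increases as alpha grows...
orders = [0, 0.5, 1, 2, 4, math.inf]
values = [renyi(pmf, a) for a in orders]
assert all(a >= b for a, b in zip(values, values[1:]))

# ...while in the other direction H_2 <= 2 * H_inf always holds.
assert renyi(pmf, 2) <= 2 * renyi(pmf, math.inf)
```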
renyi_entropy(dist, order, rvs=None, rv_mode=None)

Compute the Rényi entropy of order `order`.

- Parameters
  - dist (Distribution) – The distribution to take the Rényi entropy of.
  - order (float >= 0) – The order of the Rényi entropy.
- Returns
  - H_a – The Rényi entropy.
- Return type
  - float
- Raises
  - ditException – Raised if rvs or crvs contain non-existent random variables.
  - ValueError – Raised if order is not a non-negative float.