Talk:Rényi entropy


Why α=1 is special

Section doesn't feel like it's quite there yet. Maybe consider what happens when you're using a Rényi entropy as a measure of ecological diversity, and then realise that you need to split one species into two... -- Jheald 22:19, 23 January 2006 (UTC)
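
(A quick numerical sketch of the species-splitting thought experiment above, assuming the standard definition H_α = (1/(1−α)) log Σ_i p_i^α with natural logs; the helper renyi_entropy below is only for illustration, not anything from the article.)

from math import isclose, log

def renyi_entropy(p, alpha):
    # H_alpha = (1/(1-alpha)) * ln(sum_i p_i**alpha); alpha = 1 is
    # treated as the Shannon limit -sum_i p_i * ln(p_i).
    p = [x for x in p if x > 0]
    if isclose(alpha, 1.0):
        return -sum(x * log(x) for x in p)
    return log(sum(x ** alpha for x in p)) / (1.0 - alpha)

before = [0.5, 0.5]          # two equally abundant species
after = [0.5, 0.25, 0.25]    # one species split into two equal halves

for alpha in (0.0, 0.5, 1.0, 2.0, 100.0):
    print(alpha, renyi_entropy(before, alpha), renyi_entropy(after, alpha))

The split moves H_0 from ln 2 to ln 3 but leaves H_∞ (approximated here by α = 100) essentially at ln 2, so how much the split changes the measured "diversity" depends entirely on α.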

The first thing I saw was 1/(1−1) = ∞. Full Decent (talk) 16:07, 11 December 2009 (UTC)

That bothered me too until I realized that when α = 1 the sum of all probabilities is 1 and log(1) = 0. Hence one wants to know how 0 times infinity is approached as α approaches 1.
For the case of all probabilities equal, this is answered by the fourth sentence of the definition, starting "If the probabilities are ...". It is easily seen that H_α(X) = log(n) independently of α (and independently of the base). So in that case the limit as α → 1 remains log(n). Vaughan Pratt (talk) 11:12, 18 October 2020 (UTC)
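
(For reference, a sketch of how the 0 × ∞ form resolves in general, writing ln for the natural logarithm and using the definition H_α(X) = (1/(1−α)) ln Σ_i p_i^α. Both ln Σ_i p_i^α and 1 − α vanish at α = 1, so L'Hôpital applies:)

\[
\lim_{\alpha\to 1} H_\alpha(X)
 = \lim_{\alpha\to 1}\frac{\ln\sum_i p_i^{\alpha}}{1-\alpha}
 = \lim_{\alpha\to 1}\frac{\sum_i p_i^{\alpha}\ln p_i \,\big/\, \sum_i p_i^{\alpha}}{-1}
 = -\sum_i p_i \ln p_i ,
\]

i.e. the Shannon entropy, which in the equiprobable case is indeed ln(n), consistent with the limit described above.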

To add

Rényi entropy was defined axiomatically in Rényi's Berkeley entropy paper. There, weakening one of the Shannon axioms results in Rényi entropy; that's why α=1 is special. Also, some of Rényi entropy's applications - statistical physics, general statistics, machine learning, signal processing, cryptography (as a measure of randomness and robustness), Shannon theory (generalizing and proving theorems), source coding - should be added with context. I don't have all this handy right now, but I'm sure each piece of this is familiar to at least one person reading this page... Calbaer 05:59, 5 May 2006 (UTC)

Also, the continuous case is missing. -- Zz 11:58, 24 October 2006 (UTC)
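
(For the record, the usual continuous analogue, for a probability density f and natural logs, is)

\[
H_\alpha(f) = \frac{1}{1-\alpha}\,\ln\!\int f(x)^{\alpha}\,dx , \qquad \alpha > 0,\ \alpha \neq 1,
\]

with the α → 1 limit giving the differential (Shannon) entropy −∫ f(x) ln f(x) dx; like differential entropy it can be negative, which would be worth noting if it is added.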

non-decreasing?

The statement that H_α is non-decreasing in α seems to contradict the statement that H_0 > H_1 > H_2 > H_∞. Also, should that be a weak inequality? LachlanA 23:24, 21 November 2006 (UTC)

MathWorld says they're non-decreasing. I think that's an error; it probably depends on whether you're using positive or negative entropies. I've fixed the inequalities; an obvious example of the strict decrease is H_0 > H_∞ for any non-uniform distribution. ⇌Elektron 18:58, 29 June 2012 (UTC)
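
(To make that concrete, here is one worked case of my own, with natural logs: for p = (1/2, 1/4, 1/4),)

\[
H_0 = \ln 3 \approx 1.099 \;>\; H_1 = \tfrac{3}{2}\ln 2 \approx 1.040 \;>\; H_2 = \ln\tfrac{8}{3} \approx 0.981 \;>\; H_\infty = \ln 2 \approx 0.693 ,
\]

so H_α is strictly decreasing in α here; the inequalities become equalities exactly when the distribution is uniform on its support, which is why the weak form is the right one.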

Does the Heisenberg model need a mention in the intro?

I would suggest removing the sentence "In the Heisenberg XY spin chain model, the Rényi entropy as a function of α can be calculated explicitly by virtue of the fact that it is an automorphic function with respect to a particular subgroup of the modular group.[2][3]" from the beginning of the article. This is quite technical, and in no way an important fact about Rényi entropy, but rather about the Heisenberg model. It could be moved elsewhere in the article if so desired, although I personally don't see this as necessary. — Preceding unsigned comment added by 193.190.84.1 (talkcontribs) 10:32, 26 January 2018 (UTC)

Makes sense. Go for it! Jheald (talk) 10:36, 26 January 2018 (UTC)