Entropy calculation at base 10

Julian M on 26 Aug 2020
Edited: Julian M on 31 Aug 2020
I use the pentropy function to calculate the entropy of a discrete signal.
The function calculates the entropy at base 2 (Shannon information theory).
Is there a way to calculate it at base 10 rather than base 2?

Accepted Answer

John D'Errico on 26 Aug 2020
Edited: John D'Errico on 26 Aug 2020
Um, think about what entropy means, at least in a mathematical form. It is just a logarithm.
So all you are asking is to compute a log to some other base.
Converting a log between bases is simple. Thus if we define a TWO parameter log function as:
log(x,B)
as the log of x, to base B, we can easily convert between bases, or more simply, relate the log for any base to the natural log. Here, I'll use log(x) as the natural log of x; some prefer ln for the natural log, but to be consistent with MATLAB notation, just use log. The basic formula is:
log(x,B) = log(x)/log(B)
Again, that one-parameter log is just the natural log. This also means that if we want to convert from one logarithmic base to another, we have a simple formula, since we would have
log(x) = log(x,B)*log(B) = log(x,A)*log(A)
And that gives us the base change formula directly.
log(x,A) = log(x,B)*log(B)/log(A)
We can test this easily enough in MATLAB, in case you don't believe me. MATLAB provides the functions log2 and log10. That is, I can compute log(3) to the base 2. From that, can I now convert it to log(3) to base 10?
log2(3)
ans =
1.58496250072116
Just use the formula I showed:
log2(3)*log(2)/log(10)
ans =
0.477121254719662
log10(3)
ans =
0.477121254719662
As you should see, both give the same result.
So if you wish to convert an entropy computation from one base to another, multiply by the ratio of the natural logs of the two bases.
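The same check works for a full entropy computation, not just a single log. Here is a minimal sketch in Python (used only as a quick language-neutral check; the `entropy` helper and the distribution `p` are illustrative, not part of MATLAB's pentropy): compute the Shannon entropy of one distribution in base 2 and in base 10, then confirm that multiplying the base-2 value by ln(2)/ln(10) reproduces the base-10 value.

```python
import math

def entropy(p, base):
    """Shannon entropy of probability vector p, in the given log base."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# An example distribution over five outcomes (sums to 1).
p = [0.5, 0.25, 0.125, 0.0625, 0.0625]

h2 = entropy(p, 2)     # entropy in bits (base 2)
h10 = entropy(p, 10)   # entropy in base-10 units

# Base-change rule from the answer: H_10 = H_2 * ln(2)/ln(10)
converted = h2 * math.log(2) / math.log(10)
print(h10, converted)  # the two values agree to floating-point precision
```

Because the log base only appears as a constant factor outside the sum, this conversion is exact for any distribution.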
1 Comment
Julian M on 29 Aug 2020
Edited: Julian M on 31 Aug 2020
This does not seem to be correct since the probability distribution is not equal among the variables.
Let's say a variable x has five possible values x1 x2 x3 x4 x5 with probabilities P(1) P(2) P(3) P(4) P(5).
The entropy at base 2 is equal to:
H = -Σ (m=1 to 5) P(m) log2 P(m) = -[ P(1) log2 P(1) + P(2) log2 P(2) + P(3) log2 P(3) + P(4) log2 P(4) + P(5) log2 P(5) ]
In general the probability distribution is not equal among the variables. Therefore, P(1) P(2) P(3) P(4) P(5) are not equal.
So I do not think this is just a matter of taking the log at some other base.
Is there a way to check how MATLAB's pentropy function calculates spectral entropy?
PS: You were right! No matter what the probability distribution is, entropy at base 10 can be calculated from the entropy at base 2 by multiplying by the ratio of the natural logs of the two bases. Thank you!
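The point conceded in the PS, that the ratio H_10/H_2 equals ln(2)/ln(10) regardless of the distribution, can be sketched numerically as follows (a hedged Python illustration, not MATLAB's pentropy; the `entropy` helper and the three distributions are made up for the check):

```python
import math

def entropy(p, base):
    """Shannon entropy of probability vector p, in the given log base."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# The conversion factor ln(2)/ln(10) does not depend on the distribution.
ratio = math.log(2) / math.log(10)

# Several distributions over five outcomes, none of them uniform except the last.
for p in ([0.4, 0.3, 0.15, 0.1, 0.05],
          [0.9, 0.025, 0.025, 0.025, 0.025],
          [0.2, 0.2, 0.2, 0.2, 0.2]):
    h2, h10 = entropy(p, 2), entropy(p, 10)
    assert abs(h10 - h2 * ratio) < 1e-12  # holds for every p
```

The factor comes out of the sum because every term contains exactly one logarithm, so unequal probabilities do not affect the conversion.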


More Answers (0)

