version 1.0.0.0 (4.25 KB) by Mo Chen

Functions for information theory, such as entropy, mutual information, and KL divergence.

This toolbox contains functions for DISCRETE random variables to compute the following quantities:

1) Entropy

2) Joint entropy

3) Conditional entropy

4) Relative entropy (KL divergence)

5) Mutual information

6) Normalized mutual information

7) Normalized variation information
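For readers who want a quick reference for what these quantities are, here is a minimal sketch in Python with NumPy (the toolbox itself is MATLAB; the function names below are illustrative assumptions, not the toolbox's API). Values are in nats, and the sample-based functions assume discrete, equal-length integer inputs:

```python
# Illustrative plug-in estimators for the quantities listed above.
# These are NOT the toolbox's functions -- just a self-contained sketch.
import numpy as np

def entropy(x):
    """Empirical Shannon entropy H(X) (in nats) of a discrete sample x."""
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def joint_entropy(x, y):
    """Empirical joint entropy H(X,Y) of two equal-length discrete samples."""
    pairs = np.column_stack([x, y])
    _, counts = np.unique(pairs, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def conditional_entropy(x, y):
    """H(X|Y) = H(X,Y) - H(Y)."""
    return joint_entropy(x, y) - entropy(y)

def kl_divergence(p, q):
    """Relative entropy D(p||q) between two discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def mutual_information(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(x) + entropy(y) - joint_entropy(x, y)

def nmi(x, y):
    """Normalized mutual information: sqrt((I/Hx) * (I/Hy))."""
    mi = mutual_information(x, y)
    return np.sqrt((mi / entropy(x)) * (mi / entropy(y)))
```

A handy sanity check for any such implementation: `mutual_information(x, x)` must equal `entropy(x)`, and `nmi(x, x)` must equal 1.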

This package is now a part of the PRML toolbox (http://www.mathworks.com/matlabcentral/fileexchange/55826-pattern-recognition-and-machine-learning-toolbox).

Mo Chen (2021). Information Theory Toolbox (https://www.mathworks.com/matlabcentral/fileexchange/35625-information-theory-toolbox), MATLAB Central File Exchange. Retrieved .

Created with
R2011b

Compatible with any release

**Inspired by:**
Normalized Mutual Information, Pattern Recognition and Machine Learning Toolbox


changkun: How do you compute the joint entropy of multiple random variables, i.e.

H(x1,x2,...,xn)?

驰 张 / qinyuan luo: It can only deal with integers.

CAI Daniel Yik Hong: The mutual information function should allow the input vectors to be of different lengths!

mirza adil / ammar baig / Rey9: Coming across the same problem as @Maksim. Who knows why

nmi(randi(1000,1,1e3),randi(1000,1,1e3))

returns 0.91?

They're different series of numbers, so how can they share similar information?
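A likely explanation, sketched below in Python (this is an assumption about the toolbox's behaviour, since any plug-in estimator of this form suffers the same effect): with 1000 samples drawn from 1000 possible values, almost every (x, y) pair occurs exactly once, so the estimated joint entropy saturates near log(1000) while the marginal entropies are nearly as large. The plug-in mutual information, and hence NMI, then comes out close to 1 even though the true MI of independent draws is zero.

```python
# Finite-sample bias of plug-in MI estimates when the alphabet size
# is comparable to the sample size -- a sketch of the effect reported above.
import numpy as np

rng = np.random.default_rng(0)

def plugin_entropy(values):
    """Empirical entropy (nats) of a sample of discrete values or rows."""
    _, counts = np.unique(values, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

n = 1000
x = rng.integers(1, 1001, n)
y = rng.integers(1, 1001, n)           # independent of x: true MI is 0

hx, hy = plugin_entropy(x), plugin_entropy(y)
hxy = plugin_entropy(np.column_stack([x, y]))
mi = hx + hy - hxy                     # plug-in estimate, heavily biased up
nmi = np.sqrt((mi / hx) * (mi / hy))   # close to 1 despite independence
```

With more samples per symbol (e.g. 1e6 samples over 1000 values) the estimate shrinks toward the true value of 0.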

Shuai Feng: Can the program only handle integer data?

changkun / Fawad Masood: Can anyone tell me the maximum possible value these measures should achieve in simulation-based tests?

Mohamed EL-Raghy / ahmed silik: I think the problem is here:

sparse(idx,x,1,n,k,n)

Karel Mundnich: For the conditional entropy, you cannot calculate the joint distribution from the marginal distributions. The joint distribution should be one of the arguments of the function.

Romesh: I don't think either of the proposed solutions provided by Francesco and Subash is correct. If you have

a=randint(1,1000,[1 5]);

entropy(a)

mutualInformation(a,a)

we know that mathematically these must give the same result. The original code does, whereas Francesco's change doesn't. So simply reversing the order is incorrect.

The underlying error is that the code expects x and y to be positive integers. Rounding a continuous variable will mean that you have valid indexes (except if the input has a value that rounds to zero). However, you could consider this as being analogous to binning the data, except that if multiple points go into the same bin, that bin will only ever have a value of 1. So I suspect Subash's suggestion also invalidates the calculation.

The real answer is actually provided by the author in the package description: "This toolbox contains functions for discrete random variables". These functions should only be used for DISCRETE variables x and y that contain positive integers. A different approach must be used if one or both of the variables is continuous.
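Romesh's identity is easy to check numerically. A hedged sketch (in Python rather than MATLAB, with plug-in estimators standing in for the toolbox's functions):

```python
# Sanity check of the identity I(X;X) = H(X) on discrete integer data.
# Plug-in estimators, sketched in Python; the toolbox itself is MATLAB.
import numpy as np

def plugin_entropy(values):
    """Empirical entropy (nats) of a sample of discrete values or rows."""
    _, counts = np.unique(values, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(1)
a = rng.integers(1, 6, 1000)            # like randint(1,1000,[1 5])

h = plugin_entropy(a)
h_joint = plugin_entropy(np.column_stack([a, a]))
mi = 2 * h - h_joint                    # I(X;X) = 2*H(X) - H(X,X)
# Since H(X,X) = H(X), mi equals h exactly -- any change to the code
# that breaks this identity (as Romesh argues) must be incorrect.
```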

Arvind: Hey guys, regarding the sparse function error, which of the answers below is correct (as given by Francesco and Subash)?

Mx=sparse(idx,1,x,n,k,n);

Or

Mx = sparse(idx, round(x), 1,n,k,n);

These give different results, so only one of them can be correct.

Francesco Onorati: To avoid the error when calling the sparse function, just swap x (and y) with 1:

Mx=sparse(idx,1,x,n,k,n);

Please refer to the help for the sparse function before asking next time :)

Anuja Kelkar: Is the output of the conditionalEntropy function a normalized value? I ask because I computed conditional entropy myself with the aid of the mutualInformation function and MATLAB's entropy() method, and got conditional entropy values greater than 1, which was expected. However, I am getting all conditional entropy values < 1 using this toolbox's conditionalEntropy() function.

Has the output been normalized?

Please let me know. Thanks.
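Whatever the toolbox does internally, it is worth noting that conditional entropy is not inherently bounded by 1; values below 1 can simply reflect a small alphabet or the choice of logarithm base. A sketch of H(Y|X) = H(X,Y) - H(X) in Python, for Y uniform over 8 symbols and independent of X, where the result comfortably exceeds 1 nat:

```python
# Conditional entropy can exceed 1: H(Y|X) = H(X,Y) - H(X) in nats,
# sketched with plug-in estimators (the toolbox itself is MATLAB).
import numpy as np

def plugin_entropy(values):
    """Empirical entropy (nats) of a sample of discrete values or rows."""
    _, counts = np.unique(values, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

# Exhaustive grid of (x, y) so the empirical distribution is exactly uniform
# and X, Y are exactly independent.
xg, yg = np.meshgrid(np.arange(2), np.arange(8))
x, y = xg.ravel(), yg.ravel()

h_cond = plugin_entropy(np.column_stack([x, y])) - plugin_entropy(x)
# h_cond = H(Y) = ln(8), about 2.079 nats, i.e. well above 1
```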

Partha: I got different results using entropy(sig) and wentropy(sig,'shannon'). Can anyone explain this?

Subash Padmanaban: Change line 16:

Mx = sparse(idx, round(x), 1,n,k,n);

Apply the same changes to all sparse operations if the program throws the same error.

Nejc Ilc: Very useful and efficient toolbox, thank you. However, there is a bug in nmi.m. The last statement should read:

z = sqrt((MI/Hx)*(MI/Hy));

The output variable is "z", not "v". But this is obviously a typo, so it does not influence my rating.

shi: ??? Error using ==> sparse

Index into matrix must be an integer.

Error in ==> mutualInformation at 16

Mx = sparse(idx,x,1,n,k,n);

Is there anybody who can help me?

Maksim: I take back my last comment.

Maksim: nmi(1:1000,randi(1000,1,1e3))

returns 0.96

nmi(randi(1000,1,1e3),randi(1000,1,1e3))

returns 0.91

Are you sure this is working correctly?

Zulkifli Hidayat: On Windows 7 64-bit, using MATLAB R2011b 64-bit, I got an error for the following simple code:

x = randn(10,1);

y = randn(10,1);

z = mutualInformation(x,y)

Error message:

Error using sparse

Sparse matrix sizes must be non-negative integers less than MAXSIZE as defined by

COMPUTER. Use HELP COMPUTER for more details.

Error in mutualInformation (line 16)

Mx = sparse(idx,x,1,n,k,n);

Can anybody tell me why?

Jeff / ali Abusnina: Is there any way we can compute these measures on time series data?

Can anyone help, please?