How to estimate non-Gaussianity using negentropy?

31 views (last 30 days)
Maria Amr, 22 May 2021
Commented: Maria Amr, 28 May 2021
I would truly appreciate it if someone could direct me. I am trying to estimate non-Gaussianity using negentropy. I have already measured the entropy, but I don't know how to extract the negentropy. Is negentropy the reverse of entropy? I would appreciate your help.

Accepted Answer

William Rose, 27 May 2021
You will find several questions and answers about negentropy if you search MATLAB Answers for "negentropy".
Negentropy is not the reverse of entropy. Suppose your signal is y. The negentropy of the signal, J(y), is the entropy of a Gaussian noise signal with the same mean and variance as y, minus the entropy of y:
J(y) = H(yGauss) - H(y)
where H(y) = entropy of y. A Gaussian signal has the maximum possible entropy for a given variance, so J(y) will be non-negative. The bigger J(y) is, the more non-Gaussian y is. You said you already have code to measure the entropy of your signal. Make a Gaussian signal, yGauss, with the same mean and variance as y, use your code to measure its entropy and the entropy of your signal (y), and compute J(y). Here's how to make a Gaussian signal with the same mean and variance as y:
yGauss=randn(size(y))*std(y)+mean(y); %Gaussian noise with same mean and std as y
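Putting it together, a minimal sketch of the full calculation, assuming a histogram-based entropy estimate in bits (the shared bin edges and the 32-bin choice are illustrative, not prescribed):
lo=min([y(:); yGauss(:)]); hi=max([y(:); yGauss(:)]);
edges=linspace(lo,hi,33); %32 common bins for both signals
w=edges(2)-edges(1); %bin width
p=histcounts(y,edges,'Normalization','pdf'); %estimated p.d.f. of y
pG=histcounts(yGauss,edges,'Normalization','pdf'); %estimated p.d.f. of yGauss
Hy=-w*sum(p(p>0).*log2(p(p>0))); %entropy of y, in bits
HyG=-w*sum(pG(pG>0).*log2(pG(pG>0))); %entropy of yGauss, in bits
J=HyG-Hy; %negentropy: near zero if y is approximately Gaussian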
Good luck.
5 Comments
William Rose, 27 May 2021 (edited)
If the formulas in the code are correct, then negentropy should equal 0 when the signal is Gaussian random noise, which means e6 should equal 1/2*log(2*pi*Var). Let's try it.
y=randn(1,1000); %Gaussian noise with var=stdev=1
p1=hist(y);
e6 = -sum(p1.*log2(p1)); %entropy
Var=var(y); %variance
fprintf('e6=%.3f, 1/2*log(2*pi*Var)=%.3f\n',e6,0.5*log(2*pi*Var));
e6=-7498.099, 1/2*log(2*pi*Var)=0.894
We see that e6 is not the expected value, so something is wrong with the formula.
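The root cause, spelled out: hist(y) returns raw bin counts, not a normalized density, and the log2/log mismatch also mixes bits with nats. A sketch of the corrected check (this anticipates the histogram-normalized estimator in the answer below; note that the theoretical entropy of a Gaussian is 1/2*log(2*pi*e*Var), including a factor of e):
y=randn(1,1000); %Gaussian noise with var~=1
[p1,edges]=histcounts(y,'Normalization','pdf'); %normalized density, not counts
w=edges(2)-edges(1); %bin width
e6=-w*sum(p1(p1>0).*log(p1(p1>0))); %differential entropy, in nats
fprintf('e6=%.3f, 1/2*log(2*pi*e*Var)=%.3f\n',e6,0.5*log(2*pi*exp(1)*var(y)));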
Maria Amr, 28 May 2021
@William Rose, I got the equation from a distinguished paper, and I am now looking for another equation. Thank you for your point!


More Answers (1)

William Rose, 28 May 2021
I figured it out. Here is an example of a signal y, and how to calculate its entropy (Hent) and its negentropy (J). You may get the values of y from a file, as in your code, where you read them from a spreadsheet. In this case, I am generating the signal y by calling the uniform random number generator, rand().
N=1024; %signal length
y=sqrt(12)*(rand(1,N)-.5); %uniform random noise, mean=0, stdev=1
h=histogram(y,'Normalization','pdf'); %estimate f(x)=the p.d.f.
Hent=-h.BinWidth*sum(h.Values(h.Values>0).*log(h.Values(h.Values>0))/log(2)); %compute entropy
J=log(std(y))/log(2)+2.0471-Hent; %compute negentropy; 2.0471 = (1/2)*log2(2*pi*e)
fprintf('Uniform random noise: Hent=%.4f, J=%.4f\n',Hent,J); %display results
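As a quick sanity check (my addition, reusing the same estimator), J should come out near zero when the signal itself is Gaussian:
yG=randn(1,N); %Gaussian noise, mean=0, stdev=1
hG=histogram(yG,'Normalization','pdf'); %estimate its p.d.f.
v=hG.Values(hG.Values>0);
HentG=-hG.BinWidth*sum(v.*log2(v)); %entropy, in bits
JG=log(std(yG))/log(2)+2.0471-HentG; %negentropy
fprintf('Gaussian random noise: Hent=%.4f, J=%.4f (expect J near 0)\n',HentG,JG);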
The attached document explains the rationale for the code above.
I am attaching a script which generates four signals: square wave, sine wave, uniform noise, and Gaussian noise. The probability density of each one is displayed (below). It is evident from the plots that the density is most un-Gaussian at the top and gradually approaches Gaussian at the bottom. The entropy and negentropy of each signal are computed using the formulas above. The values are as expected: negentropy is biggest for the top plot (most un-Gaussian) and smallest (approximately zero) for the bottom. The signals all have the same mean and standard deviation. A sketch along the lines of that script follows.
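Since the attachment is not reproduced in this thread, here is a minimal sketch of such a script; the specific signal definitions and scalings are my assumptions, chosen so that all four signals have mean 0 and standard deviation approximately 1:
N=1024; t=(0:N-1)/N;
sigs={sign(sin(2*pi*4*t)), ... %square wave, stdev~=1
    sqrt(2)*sin(2*pi*4*t), ... %sine wave, stdev~=1
    sqrt(12)*(rand(1,N)-.5), ... %uniform noise, stdev~=1
    randn(1,N)}; %Gaussian noise, stdev~=1
names={'square wave','sine wave','uniform noise','Gaussian noise'};
for k=1:4
    subplot(4,1,k);
    h=histogram(sigs{k},'Normalization','pdf'); %estimate the p.d.f.
    v=h.Values(h.Values>0);
    Hent=-h.BinWidth*sum(v.*log2(v)); %entropy, in bits
    J=log(std(sigs{k}))/log(2)+2.0471-Hent; %negentropy
    title(sprintf('%s: Hent=%.3f, J=%.3f',names{k},Hent,J));
end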
3 Comments
William Rose, 28 May 2021
You're welcome @Maria Amr! If you like the answers you get on this site, then give them a thumbs-up vote.
I have realized something else: entropy and negentropy are insensitive to the time order of the data. If I take all the values from a sine wave and shuffle their order randomly, it will look like random noise, but the histogram of values will be the same, so its entropy and negentropy will also be the same. Likewise, I could take all the values from a Gaussian random noise signal and re-order them into ascending order. This would look like a very different, and totally non-random, signal, but its entropy and negentropy would be unchanged by this transformation. Therefore entropy and negentropy are not sufficient for evaluating Gaussian randomness. How can you evaluate the time-ordering aspect of randomness? You could estimate the autocorrelation function, or you could compute the entropy of the power spectrum of the signal. A small demonstration of the shuffling point follows.
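For instance, a small sketch of the shuffling point (my illustration, not from the thread; xcorr requires the Signal Processing Toolbox):
y=sqrt(2)*sin(2*pi*4*(0:1023)/1024); %sine wave, mean=0, stdev~=1
ys=y(randperm(length(y))); %same values, shuffled in time
%The sorted values (hence the histogram, entropy, and negentropy) are identical:
isequal(sort(y),sort(ys)) %returns logical 1
%But the autocorrelation exposes the difference in time structure:
lags=-50:50;
r=xcorr(y,50,'coeff'); %oscillates slowly: strong time structure
rs=xcorr(ys,50,'coeff'); %near zero except at lag 0: noise-like
plot(lags,r,lags,rs); legend('sine wave','shuffled');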
Good luck!
Maria Amr, 28 May 2021
@William Rose, excellent! I am going to try it at different sampling rates. Thank you!

