Am I computing cross entropy incorrectly?
I am working on a neural network and would like to use cross entropy as my error function. I noticed from a previous question that MATLAB added this functionality starting with R2013b. I decided to test the crossentropy function by running the simple example provided in the documentation. The code is reprinted below for convenience:
[x,t] = iris_dataset;
net = patternnet(10);
net = train(net,x,t);
y = net(x);
perf = crossentropy(net,t,y)
When I run this code, I get perf = 0.0367. To verify this result, I ran the code:
ce = -mean(sum(t.*log(y)+(1-t).*log(1-y)))
which resulted in ce = 0.1100. Why are perf and ce unequal? Do I have an error in my calculation?
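The discrepancy can be reproduced outside MATLAB. The manual check above averages the full two-term (binary) cross entropy over samples only, while MATLAB's crossentropy performance function appears to average the one-term -t.*log(y) over all elements of the target matrix. A minimal NumPy sketch of the two formulas, using made-up one-hot data rather than the actual iris run:

```python
import numpy as np

# Toy stand-in for the iris run: 3 classes x 4 samples,
# one-hot targets and made-up network outputs (columns sum to 1).
t = np.array([[1, 0, 0, 1],
              [0, 1, 0, 0],
              [0, 0, 1, 0]], dtype=float)
y = np.array([[0.90, 0.05, 0.10, 0.80],
              [0.05, 0.90, 0.10, 0.10],
              [0.05, 0.05, 0.80, 0.10]])

# Two-term ("binary") cross entropy, averaged over samples only,
# as in the question's manual check.
ce = -np.mean(np.sum(t * np.log(y) + (1 - t) * np.log(1 - y), axis=0))

# One-term cross entropy averaged over ALL elements (classes x samples),
# which is what MATLAB's crossentropy appears to compute by default.
perf = -np.sum(t * np.log(y)) / t.size

print(ce, perf)  # the two values differ
```

Both values are valid cross-entropy measures; they simply use different terms and different normalizations, so they will not agree in general.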
Accepted Answer
More Answers (3)
Greg Heath
21 Aug 2014
You are using the cross-entropy form for outputs and targets that do not have to sum to 1. The corresponding output transfer function is logsig.
For targets that are constrained to sum to 1, use softmax and only the first term of the sum.
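Greg's distinction can be illustrated numerically: a softmax output layer makes the columns of y sum to 1, matching the one-term -sum(t.*log(y)) error, while logsig produces independent per-element probabilities, matching the two-term form. A NumPy sketch (not MATLAB code; the scores are illustrative):

```python
import numpy as np

# Raw scores: 3 classes x 2 samples (illustrative values)
z = np.array([[2.0, -1.0],
              [0.5,  0.3],
              [-1.0, 2.5]])

# softmax across classes: columns sum to 1;
# pair with the one-term cross entropy -sum(t.*log(y))
y_soft = np.exp(z) / np.exp(z).sum(axis=0, keepdims=True)

# logsig (elementwise sigmoid): outputs need not sum to 1;
# pair with the two-term -t.*log(y) - (1-t).*log(1-y)
y_sig = 1.0 / (1.0 + np.exp(-z))

print(y_soft.sum(axis=0))  # each column sums to 1
print(y_sig.sum(axis=0))   # column sums are generally not 1
```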
For extensive discussions search in comp.ai.neural-nets using
greg cross entropy
Hope this helps.
Thank you for formally accepting my answer
Greg
2 Comments
Matthew Eicholtz
21 Aug 2014
Edited: Matthew Eicholtz on 21 Aug 2014
Greg Heath
21 Aug 2014
You are welcome. I'm glad the reply answered your question.
Next time, make sure you initialize the RNG before training so that you can reproduce your calculation.
Or Shamir
23 Sep 2017
ce = -t .* log(y);
perf = sum(ce(:))/numel(ce);
1 Comment
Greg Heath
26 Sep 2017
isn't that the same as
perf = mean(ce(:)); % ?
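Greg's point is that dividing the sum of all elements by the element count is exactly the definition of the mean, so the two expressions are equivalent. A quick NumPy check with arbitrary values:

```python
import numpy as np

# Arbitrary per-element cross-entropy values
ce = np.array([[0.2, 0.7, 0.1],
               [0.4, 0.0, 0.9]])

a = ce.sum() / ce.size   # sum(ce(:))/numel(ce) in MATLAB terms
b = ce.mean()            # mean(ce(:)) in MATLAB terms

print(a, b)  # the two expressions give the same value
```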
Tian Li
13 Oct 2017
ce = -t .* log(y); perf = sum(ce(:))/numel(ce);
This is the right answer for the multi-class classification error problem.
1 Comment
Greg Heath
15 Oct 2017
Why do you think that is different from the previous two answers?