Kullback-Leibler Divergence for NMF in MATLAB
I am trying to write the Kullback-Leibler divergence (KLD) cost function in MATLAB by looking at how the Euclidean distance was written.
The Euclidean distance for matrix factorization has the following structure:

$\|X - WH\|_F^2 = \sum_{i,j} \left( X_{ij} - (WH)_{ij} \right)^2$
which reduces to this MATLAB code:
f = norm(X - W * H,'fro')^2
Now I have the Kullback-Leibler divergence, with the following structure:

$D(X \,\|\, \hat{X}) = \sum_{i,j} \left( X_{ij} \log \frac{X_{ij}}{\hat{X}_{ij}} - X_{ij} + \hat{X}_{ij} \right)$
I wish to write this in MATLAB, but I am confused about how to handle the summation: in the Euclidean case, the sum over all elements disappears into the norm function.
Could someone help me write a decent code for this expression? Thanks.
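For reference, the reason the Euclidean cost has no explicit sum is that the squared Frobenius norm is itself a sum of squared elements. A minimal sketch (with small random example matrices, assumed here only for illustration) showing that both forms agree:

```matlab
% Sketch: the squared Frobenius norm equals the element-wise sum of squares,
% so both lines below compute the same Euclidean NMF cost.
X = rand(4,3); W = rand(4,2); H = rand(2,3);   % example sizes, assumed

f1 = norm(X - W*H, 'fro')^2;        % norm form, no explicit summation
f2 = sum(sum((X - W*H).^2));        % explicit double summation over i,j

% f1 and f2 agree up to floating-point round-off
```

The KL divergence has no analogous built-in norm, so it is written with an explicit element-wise computation and sum instead.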
Accepted Answer
Matt Tearle
16 January 2019
If X and X_hat are just matrices, then I think you should be able to compute all the terms element-wise and sum the result (unless I misunderstand the formula).
div = X .* log(X ./ X_hat) - X + X_hat;
KLD = sum(div,'all'); % in R2018b onward
KLD = sum(div(:)); % in any version
I'm interpreting "log" in the formula in the math sense (natural log) rather than engineering (base 10). If it's base 10, then use the log10 function instead.
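One practical caveat, assuming X may contain zeros (common for count or spectrogram data in NMF): the term X .* log(X ./ X_hat) evaluates to 0*log(0) = NaN at those entries, whereas the usual convention takes 0*log(0/x) = 0. A hedged sketch guarding against that case:

```matlab
% Sketch: generalized KL divergence with a guard for zero entries of X.
% Example matrices are assumed; X_hat = W*H as in the NMF setting.
X = rand(4,3); W = rand(4,2); H = rand(2,3);
X_hat = W*H;

div = X .* log(X ./ X_hat) - X + X_hat;
% Convention 0*log(0/x) = 0: where X is zero, only the +X_hat term survives.
div(X == 0) = X_hat(X == 0);

KLD = sum(div(:));   % nonnegative for nonnegative X and X_hat
```

If X is guaranteed strictly positive, the guard line is unnecessary and the accepted answer's two lines suffice as written.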