Kullback-Leibler Divergence for NMF in Matlab
I am trying to write the Kullback-Leibler divergence objective in MATLAB by looking at how the Euclidean distance was written.
The Euclidean distance for matrix factorization has the following structure:
$$f = \sum_{i,j} \left( X_{ij} - (WH)_{ij} \right)^2 = \lVert X - WH \rVert_F^2$$
which reduces to this MATLAB code:
f = norm(X - W * H,'fro')^2
Now I have the Kullback-Leibler divergence, whose structure is as below:
$$D(X \,\|\, \hat{X}) = \sum_{i,j} \left( X_{ij} \log \frac{X_{ij}}{\hat{X}_{ij}} - X_{ij} + \hat{X}_{ij} \right)$$
I wish to write this in MATLAB, but I am confused about how to deal with the summation; in the Euclidean case we suddenly switch to the norm function.
Could someone help me write decent code for this expression? Thanks.
Accepted Answer
Matt Tearle
on 16 Jan 2019
If X and X_hat are just matrices, then I think you should be able to compute all the terms element-wise and sum the result (unless I misunderstand the formula).
div = X .* log(X ./ X_hat) - X + X_hat;
KLD = sum(div,'all'); % in R2018b onward
KLD = sum(div(:)); % in any version
I'm interpreting "log" in the formula in the math sense (natural log) rather than engineering (base 10). If it's base 10, then use the log10 function instead.
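As a rough end-to-end sketch, assuming X_hat comes from an NMF reconstruction W*H (the nnmf call from the Statistics and Machine Learning Toolbox, the random positive data, and the eps floor against log of zero are all just illustration, not part of the question):
rng(0)                              % reproducible toy example
X = rand(20,15) + 0.1;              % strictly positive data so log(X) stays finite
[W,H] = nnmf(X,2);                  % rank-2 factorization, X ~ W*H
X_hat = max(W * H, eps);            % reconstruction; eps guards against log(0)
f_euc = norm(X - W * H,'fro')^2;    % Euclidean (squared Frobenius) objective
div = X .* log(X ./ X_hat) - X + X_hat;
KLD = sum(div(:));                  % KL divergence objective
The sum(div(:)) call collapses the double summation over i and j in one step, which is the same role norm(...,'fro')^2 plays for the Euclidean objective.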