This toolbox contains functions for discrete random variables to compute the following quantities:
1) Entropy
2) Joint entropy
3) Conditional entropy
4) Relative entropy (KL divergence)
5) Mutual information
6) Normalized mutual information
7) Normalized variation of information
This toolbox is a tweaked and bundled version of my previous submissions. Note: the previous single-function submissions will be removed soon.
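The quantities above all reduce to plug-in estimates from empirical distributions. As a rough sketch of what the toolbox computes (in Python here for illustration; the toolbox itself is MATLAB, and these helper names are mine, not the toolbox's):

```python
import numpy as np

def entropy(x):
    """Shannon entropy H(X) in bits of a discrete sample vector."""
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def joint_entropy(x, y):
    """Joint entropy H(X,Y) of two aligned sample vectors."""
    pairs = np.stack([x, y], axis=1)
    _, counts = np.unique(pairs, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def conditional_entropy(y, x):
    """H(Y|X) = H(X,Y) - H(X)."""
    return joint_entropy(x, y) - entropy(x)

def mutual_information(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(x) + entropy(y) - joint_entropy(x, y)

x = np.array([0, 0, 1, 1])
y = np.array([0, 1, 0, 1])  # independent of x
print(mutual_information(x, y))  # 0.0 for this independent pair
```

Relative entropy and the normalized variants follow the same pattern, dividing the mutual information by entropy terms.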
To avoid the error when calling the sparse function, just swap x (and y) with 1:
Mx=sparse(idx,1,x,n,k,n);
Next time, please refer to the help of the sparse function before asking :)
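For context, the toolbox estimates distributions by building a sparse indicator matrix from the sample labels with MATLAB's triplet-form sparse(i, j, s, m, n) constructor. The same idiom exists in SciPy; a rough sketch of the technique (my own illustration, not the toolbox's exact code):

```python
import numpy as np
from scipy.sparse import csr_matrix

# Samples take integer labels; map each sample to a one-hot row.
x = np.array([2, 0, 1, 1, 2])   # labels in {0, 1, 2}
n = x.size                       # number of samples
k = x.max() + 1                  # number of distinct labels

# Triplet form (values, (rows, cols)), analogous to MATLAB's sparse(i, j, s, m, n):
rows = np.arange(n)
Mx = csr_matrix((np.ones(n), (rows, x)), shape=(n, k))

# Column sums give label counts, hence the empirical distribution.
p = np.asarray(Mx.sum(axis=0)).ravel() / n
print(p)  # label probabilities [0.2, 0.4, 0.4]
```

The MATLAB error in question typically arises when the index arguments to sparse are not positive integers, which is why the argument order matters.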
Is the output of the conditionalEntropy function a normalized value? I ask because I computed conditional entropy myself with the aid of the MutualInformation function and MATLAB's entropy() method, and got conditional entropy values greater than 1, which was expected. However, I am getting all conditional entropy values < 1 using the InfoTheory toolbox's conditionalEntropy() function.
Has the output been normalized?
Please let me know. Thanks
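For what it's worth, H(Y|X) is bounded by log2 of the alphabet size of Y, not by 1, so values above 1 bit are entirely possible whenever Y has more than two states. A quick numeric check (my own sketch, not the toolbox code):

```python
import numpy as np

def entropy(p):
    """Entropy in bits of a probability vector p."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Y uniform over 4 symbols, independent of a binary X:
# H(Y|X) = H(Y) = log2(4) = 2 bits, clearly greater than 1.
p_joint = np.full((2, 4), 1 / 8)        # joint distribution of (X, Y)
H_xy = entropy(p_joint.ravel())          # H(X,Y) = 3 bits
H_x = entropy(p_joint.sum(axis=1))       # H(X)   = 1 bit
print(H_xy - H_x)                        # H(Y|X) = 2.0 bits
```

So if the toolbox consistently returns values below 1 on such data, its output is likely normalized (or computed in a different log base).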
Very useful and efficient toolbox, thank you. However, there is a bug in nmi.m. The last line should read:
z = sqrt((MI/Hx)*(MI/Hy));
The output variable is "z", not "v". But this is obviously a typo, so it does not influence my rating.
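For reference, that line is the geometric-mean normalization NMI = I(X;Y) / sqrt(H(X) H(Y)), which lies in [0, 1]. A small numeric check of the corrected formula (my own sketch, not the toolbox code):

```python
import numpy as np

def entropy(p):
    """Entropy in bits of a probability vector p."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint distribution where X and Y are perfectly correlated binary variables.
p_joint = np.array([[0.5, 0.0],
                    [0.0, 0.5]])
Hx = entropy(p_joint.sum(axis=1))           # H(X) = 1 bit
Hy = entropy(p_joint.sum(axis=0))           # H(Y) = 1 bit
MI = Hx + Hy - entropy(p_joint.ravel())     # I(X;Y) = 1 bit

# The corrected line from nmi.m:
z = np.sqrt((MI / Hx) * (MI / Hy))
print(z)  # 1.0 for perfectly correlated variables
```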