This toolbox contains functions for discrete random variables that compute the following quantities:
1) Entropy
2) Joint entropy
3) Conditional entropy
4) Relative entropy (KL divergence)
5) Mutual information
6) Normalized mutual information
7) Normalized variation of information
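For orientation, the first of these quantities can be sketched in a few lines. This is a minimal Python sketch of plug-in entropy and mutual-information estimates from discrete samples, not the toolbox's MATLAB code; the function names here are hypothetical:

```python
from collections import Counter
from math import log2

def entropy(xs):
    """Shannon entropy H(X) in bits, estimated from a sample of discrete values."""
    n = len(xs)
    return -sum((c / n) * log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from paired samples."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# A fair coin has 1 bit of entropy; independent variables have zero MI.
print(entropy([1, 2, 1, 2]))                            # 1.0
print(mutual_information([1, 1, 2, 2], [1, 2, 1, 2]))   # 0.0
```

The toolbox computes the same estimates, but vectorized over sparse indicator matrices rather than with explicit counting.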
This toolbox is a tweaked and bundled version of my previous submissions. (Note: the previous single-function submissions will be removed soon.)
For conditional entropy, the joint distribution cannot be computed from the marginal distributions alone; the joint distribution must be one of the arguments of the function.
We know that mathematically these must give the same result. The original code does, whereas Francesco's change does not, so simply reversing the order is incorrect.
The underlying error is that the code expects x and y to be positive integers. Rounding a continuous variable gives you valid indices (except when an input value rounds to zero). You could regard this as binning the data, except that when multiple points fall into the same bin, that bin only ever gets a value of 1. So I suspect Subash's suggestion also invalidates the calculation.
The real answer is actually provided by the author in the package description: "This toolbox contains functions for discrete random variables". These functions should only be used for DISCRETE variables x and y that contain positive integers. A different approach must be used if one or both of the variables is continuous.
To avoid the error when calling the sparse function, just swap x (and y) with 1:
Mx=sparse(idx,1,x,n,k,n);
Next time, please refer to the help for the sparse function before asking :)
Is the output of the conditionalEntropy function a normalized value? I ask because I computed conditional entropy myself using the MutualInformation function and MATLAB's entropy() method, and got conditional entropy values greater than 1, as expected. However, all of the values from this toolbox's conditionalEntropy() function are less than 1.
Has the output been normalized?
Please let me know. Thanks.
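For what it's worth, conditional entropy is not bounded by 1 in general: in bits it is bounded by log2 of the alphabet size of Y, so values above 1 are entirely possible. A quick Python check (a sketch, not the toolbox code) with a uniform 4-symbol Y independent of X; note that a systematically smaller result could also just mean a different logarithm base (nats vs. bits):

```python
from collections import Counter
from math import log2

def entropy(xs):
    # Plug-in Shannon entropy in bits.
    n = len(xs)
    return -sum((c / n) * log2(c / n) for c in Counter(xs).values())

def conditional_entropy(ys, xs):
    # H(Y|X) = H(X,Y) - H(X), from paired samples.
    return entropy(list(zip(xs, ys))) - entropy(xs)

# Y uniform over 4 symbols and independent of X: H(Y|X) = H(Y) = 2 bits > 1.
x = [1, 2, 1, 2, 1, 2, 1, 2]
y = [1, 1, 2, 2, 3, 3, 4, 4]
print(conditional_entropy(y, x))  # 2.0
```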
Very useful and efficient toolbox, thank you. However, there is a bug in nmi.m. The last line should read:
z = sqrt((MI/Hx)*(MI/Hy));
The output variable is "z", not "v". But this is obviously a typo, so it does not affect my rating.
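The corrected line computes the geometric-mean normalization NMI = I(X;Y) / sqrt(H(X) H(Y)), which is 1 for identical labelings. A Python sketch of the same formula, with hypothetical helper names:

```python
from collections import Counter
from math import log2, sqrt

def entropy(xs):
    # Plug-in Shannon entropy in bits.
    n = len(xs)
    return -sum((c / n) * log2(c / n) for c in Counter(xs).values())

def nmi(xs, ys):
    """Normalized mutual information: MI scaled by sqrt(H(X)*H(Y))."""
    hx, hy = entropy(xs), entropy(ys)
    mi = hx + hy - entropy(list(zip(xs, ys)))
    return sqrt((mi / hx) * (mi / hy))  # mirrors the corrected last line of nmi.m

# Identical partitions -> NMI = 1
print(nmi([1, 2, 1, 2], [1, 2, 1, 2]))  # 1.0
```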