How can I calculate the entropy of an image using the entropy formula?
Hello,
I have a 256x256 grayscale image and I want to calculate its entropy. Basically I want to calculate it using this formula: E = -sum(p(k)*log2(p(k))), summed over the gray levels k = 0, 1, ..., 255. But I don't really know which value corresponds to which variable. For example, I assumed that p = 1/256 because that is the probability of a random number between 0 and 255, but I am not sure.
Can someone help?
Accepted Answer
Image Analyst
on 3 Jan 2021
There is an entropy() function, you know:
e = entropy(grayImage);
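A minimal usage sketch (cameraman.tif is just an example image that ships with the Image Processing Toolbox; substitute your own 256x256 grayscale file):
grayImage = imread('cameraman.tif'); % 256x256 uint8 grayscale image
e = entropy(grayImage) % scalar entropy value (bits per pixel)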
3 Comments
Image Analyst
on 3 Jan 2021
The help defines it:
Entropy is defined as -sum(p.*log2(p)), where p contains the normalized histogram counts returned from imhist.
so tell me why that formula does not exactly match the formula you gave? Because I'm not seeing it. They look identical to me.
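For example, a quick sketch (assuming grayImage is a uint8 grayscale image) showing that the documented formula reproduces entropy():
counts = imhist(grayImage); % histogram counts over the 256 gray levels
p = counts / numel(grayImage); % normalized counts = probabilities
p = p(p > 0); % drop empty bins so 0*log2(0) does not turn into NaN
manualE = -sum(p .* log2(p));
builtinE = entropy(grayImage); % manualE and builtinE should agree to rounding error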
More Answers (1)
KALYAN ACHARJYA
on 3 Jan 2021
Edited: KALYAN ACHARJYA
on 3 Jan 2021
"But i dont really know which value corresponds to which variable"
E = -sum(p(k)*log2(p(k))), with k running over the gray levels 0 to 255
Let's suppose a grayscale image of size 10x10:
image =
254 20 12 101 227 185 188 52 200 74
85 198 170 15 204 28 248 22 25 154
76 231 154 199 187 30 221 197 75 246
15 136 134 86 13 164 22 52 60 110
76 27 186 155 18 84 93 99 135 177
11 211 181 189 22 167 94 141 23 194
129 86 200 26 204 191 175 58 103 110
194 75 73 32 241 149 153 164 26 167
161 191 177 140 175 189 202 124 28 28
23 2 142 124 33 60 94 38 200 239
1. The first step is to find the probabilities of all possible pixel values (from 0 to 255).
Example: the probability of 0 is P0 = (number of occurrences of pixel value 0 in the image) / (size of the image).
As there is no pixel with value 0 in the image, P0 = 0/(10x10) = 0.
Similarly, the probability of 1 is P1 = (number of occurrences of pixel value 1 in the image) / (size of the image).
As there is no pixel with value 1 in the image, P1 = 0/(10x10) = 0.
In the same way, the probability of 2 is P2 = (number of occurrences of pixel value 2 in the image) / (size of the image).
As there is one pixel with value 2 in the image, P2 = 1/(10x10) = 0.01.
.............
And so on: calculate the probabilities of all pixel values (P3, P4, ..., P255) the same way.
At the end, you get an array as follows:
p = [P0, P1, P2, ..., P255]
p = [0, 0, 0.01, ...]
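In MATLAB this whole step is one call to imhist; a sketch, assuming the variable image holds the 10x10 matrix shown above:
counts = imhist(uint8(image)); % occurrences of each gray level 0..255 (256x1 vector)
p = counts / numel(image); % p = [P0, P1, ..., P255]; sums to 1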
2. Second step: take log2 of p.
logP = [log2(P0), log2(P1), ..., log2(P255)]
Please note that here I am using logP just as a variable name. In MATLAB, you can directly apply log2 to the whole 1-D array p with log2(p); more about the log2 MATLAB function here.
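A one-line sketch of this step, using the p from Step 1:
logP = log2(p); % elementwise log base 2; note that bins with p == 0 give -Inf here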
3. Third step: multiply the probabilities by log2(p), element by element (log base 2):
mul_result = [P0*log2(P0), P1*log2(P1), P2*log2(P2), ..., P255*log2(P255)]
In MATLAB
mul_result=p.*log2(p);
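One caveat with the manual calculation: gray levels that never occur have probability 0, and 0*log2(0) evaluates to NaN in MATLAB, so those entries are usually dropped (or the NaNs replaced by 0) before summing, for example:
p_nz = p(p > 0); % keep only the gray levels that actually occur
mul_result = p_nz .* log2(p_nz); % no NaN terms now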
4. Fourth step: add up all the values of the multiplication result from Step 3; with a minus sign in front, that is the entropy as per the formula:
Entropy_val=-sum(mul_result);
So in the formula above, k runs over all pixel values from 0 to 255 (in the case of a uint8 image), p(k) is the probability of each individual pixel value, you take log2 of those same probabilities, sum everything, and put the minus sign in front, which gives the result.
You can implement this in MATLAB on your own, please practice. Or you can directly use the MATLAB built-in function entropy here.
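Putting the four steps together, a minimal sketch of a hand-rolled version (the function name myEntropy is just illustrative) that should agree with the built-in entropy() for uint8 images:
function E = myEntropy(grayImage)
% myEntropy  Shannon entropy of a uint8 grayscale image, in bits per pixel.
counts = imhist(grayImage); % Step 1: histogram counts for gray levels 0..255
p = counts / numel(grayImage); % normalize counts to probabilities
p = p(p > 0); % ignore gray levels that never occur
E = -sum(p .* log2(p)); % Steps 2-4: -sum(p.*log2(p))
end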
Good Luck
Kalyan :)
5 Comments
Image Analyst
on 28 Sep 2022
@RENJI what negative value? The entropies are positive. Do you mean the negative sign before the sum? Well, the probabilities are less than 1, and the log of a number less than 1 is negative, so the negative sign makes the sum positive. Is that what you mean?
RENJI
on 28 Sep 2022
Yes. I was thinking of the negative sign before the sum only. Thank you very much.