Tuesday, October 15, 2013

Entropy of entropy

They say that there is a logarithm in the measure of information because we want information to be additive (an extensive property). So when you have a variable that can take 1 of n values, we say that each value has an information content of log n rather than n. I apply the same principle to log n itself and see that it carries no more than log log n bits of information. Does that mean the original message had no more than log log n bits?
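To make the numbers concrete, here is a small sketch (my own illustration, assuming base-2 logarithms and an arbitrarily chosen n = 2^32) of the quantity in question and the result of applying the logarithm to it a second time:

```python
import math

n = 2 ** 32                      # a variable with n equally likely values
info = math.log2(n)              # information content of one value: log n bits
info_of_info = math.log2(info)   # the same principle applied to log n itself

print(info)          # 32.0 (log n bits)
print(info_of_info)  # 5.0  (log log n bits)
```

The gap between 32 and 5 bits is exactly the apparent paradox: the number log n, taken by itself as a value, seems describable in far fewer bits than the message it measures.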
