Learning and Cognitive Systems

Shannon's Entropy

In information theory, entropy is a measure of the uncertainty in a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message.
The formula for entropy was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".
(Source: http://planetcalc.com/2476/; accessed 2016-08-28)

H(X) = -\sum_{i=1}^{n} p(x_i) \log_b p(x_i)

Each term -\log_b p(x_i) is the surprisal of outcome x_i, equivalently written \log_b \frac{1}{p(x_i)}: the rarer an outcome, the more information its occurrence carries. Taking b = 2 (entropy in bits), H(X) can therefore be read as the expected surprisal:

H(X) = \sum_{i=1}^{n} p(x_i) \log_2 \frac{1}{p(x_i)}
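As a quick numeric check of the formula, here is a plain-JavaScript sketch; the function name `entropy` and the example distributions are illustrative, not from the source:

```javascript
// Shannon entropy (in bits) of a discrete distribution given as an
// array of probabilities, assumed to be positive and to sum to 1.
function entropy(ps) {
  return ps.reduce(function (h, p) {
    // weight each outcome's surprisal log2(1/p) by its probability p
    return h + p * Math.log2(1 / p);
  }, 0);
}

console.log(entropy([0.5, 0.5]));        // fair coin: 1 bit
console.log(entropy([0.5, 0.25, 0.25])); // 1.5 bits
```

A uniform distribution over 2^k outcomes gives exactly k bits, the maximum for that number of outcomes.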

WebChurch Code
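This section is empty in the source. A minimal WebChurch sketch of the entropy formula might look like the following; the helper names `log2` and `entropy` are assumptions, and Church's built-in `log` is the natural logarithm, hence the change of base:

```scheme
;; Sketch: Shannon entropy (bits) of a list of probabilities
;; (assumed positive, summing to 1).
(define (log2 x) (/ (log x) (log 2)))

(define (entropy ps)
  (sum (map (lambda (p) (* p (log2 (/ 1 p)))) ps)))

(entropy (list 0.5 0.25 0.25)) ;; 1.5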

WebPPL Code
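This section is also empty in the source. A corresponding WebPPL sketch, using WebPPL's built-in `map` and `sum` rather than JavaScript array methods (which do not play well with WebPPL's transformed functions), might look like this; the function name `entropy` is an assumption:

```javascript
// Sketch: Shannon entropy (bits) of a list of probabilities
// (assumed positive, summing to 1).
var entropy = function(ps) {
  return sum(map(function(p) { return p * Math.log(1 / p) / Math.log(2); }, ps));
};

entropy([0.5, 0.25, 0.25]); // 1.5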