In information theory, entropy is a measure of the uncertainty in a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message.
The formula for entropy was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".
For a discrete random variable X with possible outcomes x(1), ..., x(n) and probability mass function p, the entropy is defined as

H(X) = -Σ p(x(i)) log_b p(x(i)),

where the sum runs over i = 1, ..., n and b is the base of the logarithm (b = 2 gives entropy in bits). Since -log_b p(x(i)) = log_b (1/p(x(i))) is the surprisal of outcome x(i), H(X) can equivalently be written as the expected surprisal:

H(X) = Σ p(x(i)) log_b (1/p(x(i))).
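As a quick illustration, here is a minimal Python sketch of the definition of Shannon entropy; the function name shannon_entropy is my own choice, not from the source.

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H(X) = -sum of p * log_b(p) over a discrete distribution.

    Outcomes with zero probability contribute nothing, following the
    convention that 0 * log(0) is taken to be 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin (two equally likely outcomes) carries 1 bit of entropy,
# while a certain outcome carries none.
print(shannon_entropy([0.5, 0.5]))  # → 1.0
print(shannon_entropy([1.0]))       # → 0.0
```

With base=2 the result is in bits; passing base=math.e would give the entropy in nats instead.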