Shannon entropy

Shannon entropy is one of the most important metrics in information theory. Entropy measures the uncertainty associated with a random variable, i.e., the expected value of the information contained in a message (in classical information theory it is measured in bits).

The concept was introduced by Claude E. Shannon in the paper "A Mathematical Theory of Communication" (1948). Shannon entropy allows one to estimate the average minimum number of bits needed to encode a string of symbols, based on the alphabet size and the frequencies of the symbols.

Below you will find a simple calculator that will help you understand the concept.

Paste your string (e.g. "1100101", "Lorem ipsum") to calculate its Shannon entropy.

The Shannon entropy is calculated using the formula:

H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i

where n is the number of distinct symbols in the alphabet and p_i is the relative frequency (probability) of the i-th symbol in the string.
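
As an illustration, here is a minimal Python sketch of the same calculation. This is an example implementation written for this article, not the code behind the calculator itself:

from collections import Counter
from math import log2

def shannon_entropy(s: str) -> float:
    """Return the Shannon entropy of string s, in bits per symbol."""
    if not s:
        return 0.0
    n = len(s)
    counts = Counter(s)  # frequency of each distinct symbol
    # H = -sum over distinct symbols of p_i * log2(p_i), where p_i = count / n
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Worked example: "1100101" has p(1) = 4/7 and p(0) = 3/7, so
# H = -(4/7)*log2(4/7) - (3/7)*log2(3/7) ≈ 0.985 bits per symbol.
print(round(shannon_entropy("1100101"), 3))  # 0.985

Note that the result is an entropy per symbol; multiplying by the string length gives the estimated minimum total number of bits needed to encode the whole string.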