Shannon's Information Entropy



Formula: H(X) = -Σ (p(x) * log2(p(x)))

Shannon's information entropy measures the uncertainty in a random variable. In the formula, H(X) is the entropy of the random variable X, p(x) is the probability of an individual symbol x from X, and the summation (Σ) runs over all possible symbols. The entropy is zero when one outcome is certain (its probability is 1), and it reaches its maximum when all outcomes are equally likely. In information theory it quantifies the average amount of information, or surprise, per outcome; for example, a fair coin flip carries 1 bit of entropy. It is applied in fields such as data compression, cryptography, and communication systems.
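
As a minimal sketch, the formula can be evaluated directly in Python; the function name shannon_entropy and the example distributions below are illustrative and not part of the original calculator.

```python
import math

def shannon_entropy(probabilities):
    """Return H(X) in bits for a discrete distribution given as a list of probabilities."""
    # Terms with p(x) = 0 are skipped, since p * log2(p) tends to 0 as p -> 0.
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))        # 1.0
# A certain outcome carries no information.
print(shannon_entropy([1.0]))             # 0.0
# A biased distribution falls between the two extremes.
print(shannon_entropy([0.9, 0.05, 0.05])) # ~0.569
```

Using log base 2 gives entropy in bits; switching to math.log (natural logarithm) would give the result in nats instead.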

Tags: Information Theory, Entropy, Shannon, Probability