Assumed audience

  • Reading level: general adult.
  • Background: basic arithmetic.
  • Goal: understand entropy as average surprise.

Surprise

A rare event is more surprising than a common event. Information theory makes this intuition precise by assigning each event a numerical amount of surprise:

  • more surprise for smaller probabilities
  • less surprise for larger probabilities
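The standard way to turn this rule into a number is to take the negative logarithm of the probability (base 2 gives an answer in bits). A minimal sketch; the function name `surprise` is just for illustration:

```python
import math

def surprise(p):
    """Surprise (in bits) of an event with probability p: -log2(p)."""
    return -math.log2(p)

# Smaller probability -> more surprise:
print(surprise(1/8))  # rare event:   3.0 bits
print(surprise(1/2))  # common event: 1.0 bit
```

Halving an event's probability adds exactly one bit of surprise, which is what makes the logarithm a natural choice.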

Entropy

Entropy is the average surprise of a distribution: the surprise of each outcome, weighted by how often that outcome occurs. A fair coin has higher entropy than a biased coin, because its flips are harder to predict.
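The weighted average above can be computed directly. A minimal sketch, reusing the bits-based surprise from before; the function name `entropy` is illustrative:

```python
import math

def entropy(probs):
    """Average surprise in bits: sum of p * (-log2(p)) over all outcomes."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

fair = entropy([0.5, 0.5])    # 1.0 bit: maximally unpredictable
biased = entropy([0.9, 0.1])  # about 0.47 bits: mostly predictable
print(fair, biased)
```

The biased coin usually lands on its likely face, so its average flip carries little surprise and its entropy is lower.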

Why this matters

Entropy is the basic measure of information in data and signals; its standard unit is the bit.