Entropy, Relative Entropy, Mutual Information

Definitions

Entropy

  • a measure of the uncertainty of a random variable
  • a measure of the amount of information required on the average to describe the random variable

Relative Entropy

  • a measure of the distance between two distributions
  • a measure of the inefficiency of assuming that the distribution is $q$ when the true distribution is $p$.

Mutual Information

  • a measure of the amount of information that one random variable contains about another random variable

Entropy

Definitions:

  • also called the Shannon entropy
  • a measure of the uncertainty of a random variable
    • it is a measure of the amount of information required on the average to describe the random variable

Desired Properties[1]

  1. Uniform distributions have maximum uncertainty.
  2. Uncertainty is additive for independent events.
  3. Adding an outcome with zero probability has no effect.
  4. The measure of uncertainty is continuous in all its arguments.
  5. Uniform distributions with more outcomes have more uncertainty.
  6. Events have non-negative uncertainty.
  7. Events with a certain outcome have zero uncertainty.
  8. Flipping the arguments has no effect.
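
A quick numerical sketch of properties 1 and 2, assuming the entropy formula of Eq. (1) below and some arbitrarily chosen example pmfs (the `entropy_bits` helper is illustrative, not part of this page):

```python
import numpy as np

def entropy_bits(pmf):
    """Shannon entropy in bits of a discrete pmf (Eq. (1) below)."""
    p = np.asarray(pmf, dtype=float)
    p = p[p > 0]                      # property 3: zero-probability outcomes contribute nothing
    return float(-np.sum(p * np.log2(p)))

# Property 1: among pmfs on four outcomes, the uniform pmf has maximum uncertainty.
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits
print(entropy_bits([0.70, 0.10, 0.10, 0.10]))   # about 1.36 bits, i.e. less than 2.0

# Property 2: uncertainty is additive for independent events.
p_x = [0.5, 0.5]
p_y = [0.25, 0.25, 0.25, 0.25]
p_xy = np.outer(p_x, p_y).ravel()               # joint pmf of independent X and Y
print(entropy_bits(p_xy), entropy_bits(p_x) + entropy_bits(p_y))   # both 3.0 bits
```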

Formulation

The entropy of a discrete random variable, $X$, is

$$H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x) \qquad (1)$$

where $X$ has a probability mass function (pmf), $p(x)$, and an alphabet $\mathcal{X}$.
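
A direct transcription of Eq. (1) in Python, using an illustrative four-letter alphabet and pmf (chosen here as an example, not taken from this page):

```python
import math

# Illustrative pmf p(x) over the alphabet {a, b, c, d}
pmf = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}

# Eq. (1): H(X) = -sum over x of p(x) * log2(p(x))
H = -sum(p * math.log2(p) for p in pmf.values() if p > 0)
print(H)   # 1.75 bits
```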

Expected Value

For a discrete random variable, $X$, with probability mass function, $p(x)$, the expected value of $X$ is

$$E[X] = \sum_{x \in \mathcal{X}} x \, p(x) \qquad (2)$$

For a discrete random variable, $X$, with probability mass function, $p(x)$, the expected value of a function $g(X)$ is

$$E[g(X)] = \sum_{x \in \mathcal{X}} g(x) \, p(x) \qquad (3)$$

Consider the case where $g(X) = \log \frac{1}{p(X)}$. We get

$$E\!\left[\log \frac{1}{p(X)}\right] = \sum_{x \in \mathcal{X}} p(x) \log \frac{1}{p(x)} = -\sum_{x \in \mathcal{X}} p(x) \log p(x) = H(X) \qquad (4)$$
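
Eq. (4) can be checked numerically: taking the expectation of $g(X) = \log_2 \frac{1}{p(X)}$ via Eq. (3) gives the same value as the direct entropy sum of Eq. (1). The pmf below is the same illustrative example used above:

```python
import math

pmf = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}

def expectation(g, pmf):
    """Eq. (3): E[g(X)] = sum over x of g(x) * p(x)."""
    return sum(g(x) * p for x, p in pmf.items())

# Eq. (4): with g(x) = log2(1/p(x)), the expectation equals the entropy H(X).
H_as_expectation = expectation(lambda x: math.log2(1.0 / pmf[x]), pmf)
H_direct = -sum(p * math.log2(p) for p in pmf.values())
print(H_as_expectation, H_direct)   # both 1.75 bits
```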

Lemma 1: Entropy is greater than or equal to zero

$$H(X) \ge 0 \qquad (5)$$

Proof: Since $0 \le p(x) \le 1$, then $\frac{1}{p(x)} \ge 1$, and subsequently, $\log \frac{1}{p(x)} \ge 0$. Thus from Eq. (4) we get $H(X) \ge 0$.

Lemma 2: Changing the logarithm base

$$H_b(X) = (\log_b a) \, H_a(X) \qquad (6)$$

Proof:

  • Given that $\log_b p = \log_b a \cdot \log_a p$
  • And since $H_b(X) = -\sum_{x \in \mathcal{X}} p(x) \log_b p(x) = -\sum_{x \in \mathcal{X}} p(x) \log_b a \cdot \log_a p(x)$
  • We get $H_b(X) = (\log_b a) \left( -\sum_{x \in \mathcal{X}} p(x) \log_a p(x) \right) = (\log_b a) \, H_a(X)$

Note that the entropy, $H_b(X)$, has units of bits for $b = 2$, or nats (natural units) for $b = e$, or dits (decimal digits) for $b = 10$.
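
A short sketch of Lemma 2 and of the units, again using the illustrative pmf from above: the same entropy is computed in bases 2, $e$, and 10, and the conversion factor $\log_b a$ of Eq. (6) maps one to the other.

```python
import math

pmf = [0.5, 0.25, 0.125, 0.125]

def entropy(pmf, base):
    """Eq. (1) with logarithms taken to the given base."""
    return -sum(p * math.log(p, base) for p in pmf if p > 0)

H_bits = entropy(pmf, 2)        # units of bits (b = 2)
H_nats = entropy(pmf, math.e)   # units of nats (b = e)
H_dits = entropy(pmf, 10)       # units of dits (b = 10)

# Eq. (6): H_b(X) = (log_b a) * H_a(X); convert nats (a = e) to bits (b = 2).
print(H_bits, math.log(math.e, 2) * H_nats)   # both 1.75
```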

Joint Entropy

Definition:

  • a measure of the uncertainty associated with a set of variables

The joint entropy, $H(X, Y)$, of a pair of discrete random variables $(X, Y)$ with joint pmf $p(x, y)$ is defined as

$$H(X, Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log p(x, y) \qquad (7)$$
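
A direct transcription of Eq. (7), assuming an illustrative $2 \times 2$ joint pmf:

```python
import math

# Illustrative joint pmf p(x, y) over X in {0, 1} and Y in {0, 1}
p_xy = {(0, 0): 0.4, (0, 1): 0.1,
        (1, 0): 0.1, (1, 1): 0.4}

# Eq. (7): H(X, Y) = -sum over (x, y) of p(x, y) * log2(p(x, y))
H_joint = -sum(p * math.log2(p) for p in p_xy.values() if p > 0)
print(H_joint)   # about 1.72 bits
```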


References