Entropy, Relative Entropy, Mutual Information

Definitions

Entropy

  • a measure of the uncertainty of a random variable
    • equivalently, a measure of the average amount of information required to describe the random variable

Relative Entropy

  • a measure of the distance between two distributions
  • a measure of the inefficiency of assuming that the distribution is $q$ when the true distribution is $p$ (see the sketch after these definitions)

Mutual Information

  • a measure of the amount of information that one random variable contains about another random variable
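
As a concrete illustration of the relative entropy bullet above, the sketch below evaluates $D(p\|q)$ using its usual definition, $D(p\|q) = \sum_{x} p(x) \log_2 \frac{p(x)}{q(x)}$. The helper name and the two distributions are illustrative, not from the original text.

```python
import math

def kl_divergence(p, q):
    """D(p||q) = sum over x of p(x) * log2(p(x)/q(x)), in bits.
    Assumes q(x) > 0 wherever p(x) > 0; terms with p(x) = 0 contribute 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]    # true distribution
q = [1/3, 1/3, 1/3]      # assumed distribution
# Coding for q when the data actually follow p wastes D(p||q) bits
# per symbol on average, which is the "inefficiency" described above.
print(kl_divergence(p, q))  # ~0.085 bits
```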

Entropy

The entropy of a discrete random variable, $X$, is

$$H(X) = -\sum_{x \in \mathcal{X}} p(x) \log_2 p(x)$$

(1)

where $X$ has a probability mass function (pmf), $p(x)$, and an alphabet $\mathcal{X}$.
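
As a quick sanity check of Equation (1), here is a minimal Python sketch that evaluates $H(X)$ in bits for a given pmf; the function name and the example distributions are illustrative only.

```python
import math

def entropy(pmf):
    """H(X) = -sum over x of p(x) * log2(p(x)), in bits (Equation 1).
    Terms with p(x) = 0 contribute 0, by the convention 0 log 0 = 0."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

print(entropy([0.5, 0.5]))   # a fair coin: 1.0 bit
print(entropy([0.9, 0.1]))   # a biased coin: ~0.469 bits, less uncertain
```

A uniform distribution maximizes the entropy for a given alphabet size, matching the intuition that it is the most uncertain case.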

Expected Value

Given a random variable, $X$, with probability mass function $p(x)$, the expected value of the random variable $g(X)$ is

$$E_p\left[g(X)\right] = \sum_{x \in \mathcal{X}} g(x)\, p(x)$$
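
A minimal sketch of this sum, assuming the pmf is stored as a dictionary mapping outcomes to probabilities; the helper name and the die example are illustrative.

```python
def expected_value(pmf, g):
    """E_p[g(X)] = sum over x of g(x) * p(x) for a discrete pmf."""
    return sum(g(x) * p for x, p in pmf.items())

# The mean of a fair six-sided die is 3.5.
die = {x: 1 / 6 for x in range(1, 7)}
print(expected_value(die, lambda x: x))  # ~3.5 (up to floating-point rounding)
```

With $g(X) = \log_2 \frac{1}{p(X)}$, this expectation reduces to the entropy in Equation (1).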