Entropy, Relative Entropy, Mutual Information


Definitions

Entropy

  • a measure of the uncertainty of a random variable
    • it is a measure of the amount of information required, on average, to describe the random variable

Relative Entropy

  • a measure of the distance between two distributions
  • a measure of the inefficiency of assuming that the distribution is q when the true distribution is p

Mutual Information

  • a measure of the amount of information that one random variable contains about another random variable
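
Neither quantity is given a formula on this page, but using the standard definitions D(p||q) = \sum_x p(x) \cdot \log_2 (p(x)/q(x)) and I(X; Y) = D(p(x, y) || p(x)p(y)), the minimal Python sketch below illustrates both readings; the helper name and all distributions are made up for illustration.

import math

def D(p, q):
    """Relative entropy D(p || q) in bits; p and q are dicts mapping outcomes to probabilities."""
    return sum(px * math.log2(px / q[x]) for x, px in p.items() if px > 0)

# Relative entropy: the penalty (in bits) for assuming q when the true distribution is p.
p = {'a': 0.5, 'b': 0.5}
q = {'a': 0.9, 'b': 0.1}
print(D(p, q))   # ~0.737 bits of inefficiency
print(D(p, p))   # 0.0, since the assumed distribution matches the true one

# Mutual information as the relative entropy between the joint pmf and the
# product of its marginals, for a made-up joint distribution.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
p_x = {0: 0.5, 1: 0.5}
p_y = {0: 0.5, 1: 0.5}
indep = {(x, y): p_x[x] * p_y[y] for (x, y) in p_xy}
print(D(p_xy, indep))   # ~0.278 bits that X and Y share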

Entropy

Definitions:

  • Shannon entropy
  • a measure of the uncertainty of a random variable
    • it is a measure of the amount of information required, on average, to describe the random variable

Desired Properties[1]

  1. Uniform distributions have maximum uncertainty.
  2. Uncertainty is additive for independent events.
  3. Adding an outcome with zero probability has no effect.
  4. The measure of uncertainty is continuous in all its arguments.
  5. Uniform distributions with more outcomes have more uncertainty.
  6. Events have non-negative uncertainty.
  7. Events with a certain outcome have zero uncertainty.
  8. Flipping the arguments has no effect.
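
Properties 1 and 2 can be checked numerically; the small Python sketch below uses the entropy formula developed in the next section, with arbitrarily chosen distributions.

import math

def H(pmf):
    """Shannon entropy in bits of a pmf given as a list of probabilities."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# Property 1: among distributions over four outcomes, the uniform one has the most uncertainty.
print(H([0.25] * 4), H([0.7, 0.1, 0.1, 0.1]))   # 2.0 vs ~1.357

# Property 2: for independent events, uncertainties add.
coin, die = [0.5, 0.5], [1 / 6] * 6
joint = [pc * pd for pc in coin for pd in die]   # joint pmf of independent coin and die
print(H(coin) + H(die), H(joint))                # both ~3.585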

Formulation

The entropy of a discrete random variable, X, is

H(X) = -\sum_{x \in \mathcal{X}} p(x) \cdot \log_2 p(x)        (1)

where X has a probability mass function (pmf), p(x), and an alphabet \mathcal{X}.
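
As a quick illustration of Eq. (1), a minimal Python sketch; the helper and example distributions are arbitrary.

import math

def entropy(pmf, base=2):
    """Entropy of a discrete pmf given as a list of probabilities (Eq. 1)."""
    return -sum(p * math.log(p, base) for p in pmf if p > 0)

print(entropy([0.5, 0.5]))    # a fair coin carries 1 bit of uncertainty
print(entropy([0.9, 0.1]))    # a biased coin carries less, ~0.469 bits
print(entropy([1.0]))         # a certain outcome carries none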

Expected Value

For a discrete random variable, X, with probability mass function, p(x), the expected value of X is

E[X] = \sum_{x \in \mathcal{X}} x \cdot p(x)        (2)

For a discrete random variable, X, with probability mass function, p(x), the expected value of g(X) is

E[g(X)] = \sum_{x \in \mathcal{X}} g(x) \cdot p(x)        (3)

Consider the case where g(X) = \log_2 \frac{1}{p(X)}. We get

E[g(X)] = \sum_{x \in \mathcal{X}} p(x) \cdot \log_2 \frac{1}{p(x)} = -\sum_{x \in \mathcal{X}} p(x) \cdot \log_2 p(x) = H(X)        (4)
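
A short numerical check of Eq. (4), i.e. that the entropy is just the expected value of g(X) = \log_2 \frac{1}{p(X)}; the pmf below is arbitrary.

import math

p = [0.5, 0.25, 0.125, 0.125]

expected_g = sum(px * math.log2(1 / px) for px in p)    # E[g(X)] via Eq. (3)
H = -sum(px * math.log2(px) for px in p)                # H(X) via Eq. (1)

print(expected_g, H)   # both 1.75 bits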

Lemma 1: Entropy is greater than or equal to zero

H(X) \ge 0        (5)

Proof: Since 0 \le p(x) \le 1, then \frac{1}{p(x)} \ge 1, and subsequently, \log_2 \frac{1}{p(x)} \ge 0. Thus from Eq. (4) we get H(X) \ge 0.

Lemma 2: Changing the logarithm base

H_b(X) = \left(\log_b a\right) \cdot H_a(X)        (6)

Proof:

  • Given that \log_b p = \log_b a \cdot \log_a p
  • And since H_b(X) = -\sum_{x \in \mathcal{X}} p(x) \cdot \log_b p(x)
  • We get H_b(X) = -\left(\log_b a\right) \sum_{x \in \mathcal{X}} p(x) \cdot \log_a p(x) = \left(\log_b a\right) \cdot H_a(X)

Note that the entropy, H_b(X), has units of bits for b = 2, or nats (natural units) for b = e, or dits (decimal digits) for b = 10.
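
A small numerical check of Lemma 2 and the units above, computing the same entropy in bits, nats, and dits for an arbitrarily chosen pmf:

import math

def H(pmf, base):
    """Entropy of a pmf in the given logarithm base."""
    return -sum(p * math.log(p, base) for p in pmf if p > 0)

p = [0.7, 0.2, 0.1]
bits, nats, dits = H(p, 2), H(p, math.e), H(p, 10)
print(bits, nats, dits)              # ~1.157, ~0.802, ~0.348
print(math.log2(math.e) * nats)      # log_2(e) * H_e(X) recovers H_2(X), as in Eq. (6)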

Joint Entropy

Definition:

  • a measure of the uncertainty associated with a set of variables

The joint entropy, H(X, Y), of a pair of discrete random variables, (X, Y), with joint pmf p(x, y) is defined as

H(X, Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \cdot \log_2 p(x, y)        (7)
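
A minimal sketch of Eq. (7) for a small, made-up joint pmf:

import math

# Joint pmf p(x, y) for X in {0, 1} and Y in {0, 1}
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

H_XY = -sum(p * math.log2(p) for p in p_xy.values() if p > 0)
print(H_XY)   # ~1.722 bits of joint uncertainty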


References

  1. https://towardsdatascience.com/entropy-is-a-measure-of-uncertainty-e2c000301c2c?gi=624b56e1fe17