Entropy, Relative Entropy, Mutual Information

== Definitions ==

=== Entropy ===

* a measure of the uncertainty of a random variable
** equivalently, a measure of the amount of information required on average to describe the random variable (see the definition sketched below)
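
A minimal sketch of the standard definition, assuming a discrete random variable <math>X</math> with probability mass function <math>p(x)</math> (symbols not introduced above):

:<math>H(X) = -\sum_{x} p(x) \log_2 p(x)</math>

With base-2 logarithms the units are bits; for example, a fair coin flip has <math>H(X) = 1</math> bit.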

=== Relative Entropy ===

* a measure of the distance between two distributions, although it is not a true metric: it is not symmetric and does not satisfy the triangle inequality
* a measure of the inefficiency of assuming that the distribution is <math>q</math> when the true distribution is <math>p</math> (see the sketch below)
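
A sketch of the usual formulation, assuming discrete distributions <math>p</math> and <math>q</math> over the same alphabet:

:<math>D(p \| q) = \sum_{x} p(x) \log_2 \frac{p(x)}{q(x)}</math>

This makes the inefficiency reading concrete: a code designed for <math>q</math> needs on average <math>H(p) + D(p \| q)</math> bits to describe samples drawn from <math>p</math>, i.e. <math>D(p \| q)</math> bits more than an optimal code designed for <math>p</math>.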

=== Mutual Information ===

* a measure of the amount of information that one random variable contains about another random variable (see the sketch below)
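
One standard way to write this, assuming a joint probability mass function <math>p(x, y)</math> with marginals <math>p(x)</math> and <math>p(y)</math>:

:<math>I(X;Y) = \sum_{x} \sum_{y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)} = D\big(p(x,y) \,\|\, p(x)\,p(y)\big)</math>

That is, mutual information is the relative entropy between the joint distribution and the product of the marginals; equivalently, <math>I(X;Y) = H(X) - H(X|Y)</math>, the reduction in uncertainty about <math>X</math> from observing <math>Y</math>.

As a minimal, self-contained sketch tying the three definitions together, the Python snippet below computes all three quantities for a small joint distribution; the distribution itself is hypothetical, chosen only for illustration:

<syntaxhighlight lang="python">
from math import log2

# Hypothetical toy joint distribution p(x, y) over X in {0, 1}, Y in {0, 1};
# the probabilities are made up purely for illustration.
p_xy = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

def entropy(p):
    """H(p) = -sum_x p(x) log2 p(x), skipping zero-probability outcomes."""
    return -sum(px * log2(px) for px in p.values() if px > 0)

def relative_entropy(p, q):
    """D(p||q) = sum_x p(x) log2(p(x)/q(x)); assumes q(x) > 0 wherever p(x) > 0."""
    return sum(p[x] * log2(p[x] / q[x]) for x in p if p[x] > 0)

# Marginal distributions p(x) and p(y) from the joint distribution.
p_x, p_y = {}, {}
for (x, y), pxy in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + pxy
    p_y[y] = p_y.get(y, 0.0) + pxy

# Product of the marginals, indexed like the joint distribution.
p_x_p_y = {(x, y): p_x[x] * p_y[y] for (x, y) in p_xy}

# Mutual information as the relative entropy D(p(x,y) || p(x)p(y)).
mi = relative_entropy(p_xy, p_x_p_y)

print(f"H(X)   = {entropy(p_x):.4f} bits")  # 1.0000: the marginal is uniform
print(f"H(Y)   = {entropy(p_y):.4f} bits")  # 1.0000
print(f"I(X;Y) = {mi:.4f} bits")            # ~0.2781: X and Y are dependent
</syntaxhighlight>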