== Definitions ==

=== Entropy ===
* a measure of the uncertainty of a random variable
* a measure of the amount of information required on average to describe the random variable
=== Relative Entropy ===
* a measure of the distance between two distributions
* a measure of the inefficiency of assuming that the distribution is <math>q</math> when the true distribution is <math>p</math> (see the formula below)
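In the standard formulation, using the same base-2 logarithm as the entropy formula below, the relative entropy (Kullback–Leibler divergence) between two probability mass functions <math>p</math> and <math>q</math> is

<math>D\left(p \| q\right)=\sum_{x} p\left(x\right) \log_2 \frac{p\left(x\right)}{q\left(x\right)}</math>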
=== Mutual Information ===
* a measure of the amount of information that one random variable contains about another random variable (see the formula below)
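In the standard formulation, for random variables <math>X</math> and <math>Y</math> with joint probability mass function <math>p\left(x,y\right)</math> and marginals <math>p\left(x\right)</math> and <math>p\left(y\right)</math>, the mutual information is

<math>I\left(X;Y\right)=\sum_{x}\sum_{y} p\left(x,y\right) \log_2 \frac{p\left(x,y\right)}{p\left(x\right)p\left(y\right)}</math>

which is the relative entropy between the joint distribution and the product of the marginals, <math>I\left(X;Y\right)=D\left(p\left(x,y\right) \| p\left(x\right)p\left(y\right)\right)</math>.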
== Entropy ==
The entropy of a discrete random variable <math>X</math> is

<math>H\left(X\right)=-\sum_{x} p\left(x\right) \log_2\left(p\left(x\right)\right)</math>
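A minimal sketch of this formula in Python, assuming the distribution is given as a sequence of probabilities summing to 1 and using the convention <math>0 \log 0 = 0</math>:

<syntaxhighlight lang="python">
import math

def entropy(probs):
    """Entropy in bits of a discrete distribution.

    `probs` is assumed to be a sequence of nonnegative probabilities
    summing to 1. Terms with p(x) = 0 are skipped, following the
    convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty.
print(entropy([0.5, 0.5]))  # 1.0

# A heavily biased coin is much more predictable.
print(entropy([0.9, 0.1]))  # about 0.469
</syntaxhighlight>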