Entropy, Relative Entropy, Mutual Information
Definitions
Entropy
- a measure of the uncertainty of a random variable
- a measure of the amount of information required on average to describe the random variable
Relative Entropy
- a measure of the distance between two distributions
- a measure of the inefficiency of assuming that the distribution is <math>q</math> when the true distribution is <math>p</math> (a small numerical sketch of this quantity follows these definitions)
Mutual Information
- a measure of the amount of information that one random variable contains about another random variable
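The formulas behind relative entropy and mutual information are not reproduced in this excerpt, so the sketch below is only a rough illustration based on the standard definitions <math>D\left(p \| q\right) = \sum_{x \in \mathcal{X}} p\left(x\right) \log \frac{p\left(x\right)}{q\left(x\right)}</math> and <math>I\left(X;Y\right) = \sum_{x,y} p\left(x,y\right) \log \frac{p\left(x,y\right)}{p\left(x\right) p\left(y\right)}</math>. The function names, the example distributions, and the choice of base-2 logarithms (bits) are illustrative assumptions, not part of the page.

<syntaxhighlight lang="python">
import math

def relative_entropy(p, q):
    # D(p||q) = sum_x p(x) * log2(p(x) / q(x)); p and q are pmfs given as dicts over the same alphabet
    return sum(p[x] * math.log2(p[x] / q[x]) for x in p if p[x] > 0)

def mutual_information(joint):
    # I(X;Y) = sum_{x,y} p(x,y) * log2(p(x,y) / (p(x) * p(y))); the joint pmf is a dict {(x, y): prob}
    px, py = {}, {}
    for (x, y), pxy in joint.items():
        px[x] = px.get(x, 0.0) + pxy
        py[y] = py.get(y, 0.0) + pxy
    return sum(pxy * math.log2(pxy / (px[x] * py[y]))
               for (x, y), pxy in joint.items() if pxy > 0)

# A biased coin modelled with the wrong (uniform) distribution: extra bits needed on average.
p = {"heads": 0.7, "tails": 0.3}
q = {"heads": 0.5, "tails": 0.5}
print(relative_entropy(p, q))      # ~0.119 bits

# A joint pmf where X and Y usually agree: the bits Y carries about X.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
print(mutual_information(joint))   # ~0.278 bits
</syntaxhighlight>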
Entropy
The entropy of a discrete random variable, <math>X</math>, is

<math>H\left(X\right) = -\sum_{x \in \mathcal{X}} p\left(x\right) \log p\left(x\right)</math>     (1)

where <math>X</math> has a probability mass function (pmf), <math>p\left(x\right)</math>, and an alphabet <math>\mathcal{X}</math>.
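As a quick numerical check of (1), the sketch below evaluates the sum for a small pmf. The function name, the example pmfs, and the use of base-2 logarithms (so that entropy is measured in bits) are illustrative assumptions, since the excerpt does not fix a log base.

<syntaxhighlight lang="python">
import math

def entropy(pmf):
    # H(X) = -sum_x p(x) * log2(p(x)); the pmf is a dict mapping symbols of the alphabet to probabilities
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

# A fair coin is maximally uncertain (1 bit); a biased coin takes less information to describe.
print(entropy({"heads": 0.5, "tails": 0.5}))   # 1.0
print(entropy({"heads": 0.9, "tails": 0.1}))   # ~0.469
</syntaxhighlight>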
Expected Value
Given a random variable, <math>X</math> with probability mass function <math>p\left(x\right)</math>, the expected value of <math>g\left(X\right)</math> is

<math>E_{p}\, g\left(X\right) = \sum_{x \in \mathcal{X}} g\left(x\right) p\left(x\right)</math>
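A minimal sketch of this expectation follows, with an illustrative function name and example pmf (a fair six-sided die) that are not part of the page; note that choosing <math>g\left(x\right) = \log \frac{1}{p\left(x\right)}</math> recovers the entropy in (1).

<syntaxhighlight lang="python">
def expected_value(pmf, g):
    # E_p[g(X)] = sum_x g(x) * p(x); the pmf is a dict mapping outcomes to probabilities
    return sum(g(x) * prob for x, prob in pmf.items())

# Expected value of a fair six-sided die roll, and of its square.
die = {x: 1 / 6 for x in range(1, 7)}
print(expected_value(die, lambda x: x))        # 3.5
print(expected_value(die, lambda x: x * x))    # ~15.17
</syntaxhighlight>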