Entropy, Relative Entropy, Mutual Information
Revision as of 12:53, 25 June 2020
Definitions
Entropy
- a measure of the uncertainty of a random variable
- equivalently, a measure of the average amount of information required to describe the random variable
- for a discrete random variable X with pmf p(x), H(X) = -Σ_x p(x) log p(x)
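As a minimal sketch of the definition above, the following computes the Shannon entropy of a discrete distribution given as a list of probabilities (the function name `entropy` is chosen here for illustration):

```python
import math

def entropy(p):
    # H(X) = -sum over x of p(x) * log2 p(x), in bits.
    # Terms with p(x) = 0 are skipped, using the convention 0 * log 0 = 0.
    return -sum(px * math.log2(px) for px in p if px > 0)

# A fair coin has maximal uncertainty for two outcomes: 1 bit.
print(entropy([0.5, 0.5]))   # 1.0
# A deterministic outcome carries no uncertainty: 0 bits.
print(entropy([1.0]))        # 0.0
```

A biased coin, e.g. `entropy([0.9, 0.1])`, falls strictly between these extremes, reflecting that less information is needed on average to describe a nearly certain outcome.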
Relative Entropy
- a measure of the "distance" between two distributions p and q, also known as the Kullback–Leibler divergence: D(p‖q) = Σ_x p(x) log(p(x)/q(x))
- not a true metric: it is not symmetric and does not satisfy the triangle inequality, but it is always nonnegative and is zero if and only if p = q
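The definition above can be sketched as follows for two discrete distributions over the same support (the function name `kl_divergence` is an illustrative choice; it assumes q(x) > 0 wherever p(x) > 0, since D(p‖q) is infinite otherwise):

```python
import math

def kl_divergence(p, q):
    # D(p || q) = sum over x of p(x) * log2(p(x)/q(x)), in bits.
    # Terms with p(x) = 0 contribute 0 by convention.
    return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

p = [0.9, 0.1]
q = [0.5, 0.5]
print(kl_divergence(p, p))  # 0.0 -- a distribution is at zero "distance" from itself
print(kl_divergence(p, q))  # > 0
print(kl_divergence(q, p))  # differs from D(p || q): relative entropy is asymmetric
```

Comparing `kl_divergence(p, q)` with `kl_divergence(q, p)` makes the asymmetry concrete, which is why relative entropy is only informally called a distance.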