Joint entropy, conditional entropy, and mutual information


Joint Entropies

In this module, we'll discuss several extensions of entropy. Let's begin with joint entropy. Suppose we have a random variable <math>X</math> with elements <math>\{x_1, x_2, \ldots, x_n\}</math> and a random variable <math>Y</math> with elements <math>\{y_1, y_2, \ldots, y_m\}</math>. We define the joint entropy of <math>X</math> and <math>Y</math> as:

<math>H(X,Y) = -\sum_{i=1}^{n}\sum_{j=1}^{m} P\left(x_i, y_j\right)\log_2\left(P\left(x_i, y_j\right)\right)</math>

(1)

Take note that <math>H(X,Y) = H(Y,X)</math>; this follows directly from the symmetry of the joint probabilities. From our previous discussion about the decision trees of a game, we used joint entropies to calculate the overall entropy.
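To make equation 1 concrete, here is a minimal sketch in Python that computes the joint entropy from a joint probability table. The function name and the probability values are made up for illustration only.

<syntaxhighlight lang="python">
import numpy as np

def joint_entropy(P_xy):
    """Joint entropy H(X,Y) in bits, given a table of joint probabilities P(x_i, y_j)."""
    P = np.asarray(P_xy, dtype=float)
    P = P[P > 0]                      # skip zero-probability entries (0 * log 0 is taken as 0)
    return -np.sum(P * np.log2(P))

# Illustrative joint distribution (rows: x_i, columns: y_j).
P_xy = [[0.25, 0.25],
        [0.25, 0.25]]
print(joint_entropy(P_xy))            # 2.0 bits for this uniform 2x2 table
</syntaxhighlight>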

Conditional Entropies

There are two steps to understanding conditional entropies. The first is how the uncertainty of a random variable changes when a single outcome occurs. Suppose we have the same random variables <math>X</math> and <math>Y</math> defined earlier in joint entropies. Let's denote <math>P\left(x_i \mid y_j\right)</math> as the conditional probability of <math>x_i</math> when event <math>y_j</math> happened. We define the entropy <math>H\left(X \mid Y = y_j\right)</math> as the entropy of the random variable <math>X</math> given that a single outcome <math>y_j</math> happened. Mathematically this is:

<math>H\left(X \mid Y = y_j\right) = -\sum_{i=1}^{n} P\left(x_i \mid y_j\right)\log_2\left(P\left(x_i \mid y_j\right)\right)</math>

(2)

Stated in a different way, <math>H\left(X \mid Y = y_j\right)</math> is our measure of uncertainty about <math>X</math> when we know <math>y_j</math> occurred. Take note that equation 2 pertains only to the uncertainty when a single event <math>y_j</math> happened. We'll extend this to what the uncertainty would be when any of the events in <math>Y</math> happens.
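As a quick illustration of equation 2, the sketch below computes <math>H\left(X \mid Y = y_j\right)</math> by normalizing one column of a joint probability table into the conditional distribution <math>P\left(x_i \mid y_j\right)</math>. The table values and function name are again assumptions made for the example.

<syntaxhighlight lang="python">
import numpy as np

def conditional_entropy_given_y(P_xy, j):
    """Entropy H(X | Y = y_j) in bits, given a joint table P(x_i, y_j) and a column index j."""
    P = np.asarray(P_xy, dtype=float)
    col = P[:, j]
    P_x_given_y = col / col.sum()                 # P(x_i | y_j) = P(x_i, y_j) / P(y_j)
    P_x_given_y = P_x_given_y[P_x_given_y > 0]    # skip zero-probability entries
    return -np.sum(P_x_given_y * np.log2(P_x_given_y))

# Illustrative joint table (rows: x_i, columns: y_j).
P_xy = [[0.30, 0.10],
        [0.20, 0.40]]
print(conditional_entropy_given_y(P_xy, 0))       # H(X | Y = y_1) is about 0.971 bits
</syntaxhighlight>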


Mutual Information

Graphical Interpretation