The Data Processing Inequality

Markovity

A Markov chain is a random process that describes a sequence of possible events, where the probability of each event depends only on the outcome of the previous event. Thus, we say that <math>X</math>, <math>Y</math>, and <math>Z</math> form a Markov chain in this order, denoted as:

{{NumBlk|::|<math>X \rightarrow Y \rightarrow Z</math>|{{EquationRef|1}}}}

if we can write:

{{NumBlk|::|<math>P\left(x, y, z\right) = P\left(z\mid y\right)\cdot P\left(y\mid x\right) \cdot P\left(x\right)</math>|{{EquationRef|2}}}}

Or in a more compact form:

{{NumBlk|::|<math>P\left(z \mid x, y\right) = P\left(z\mid y\right)</math>|{{EquationRef|3}}}}
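To make Equations 2 and 3 concrete, here is a minimal Python sketch (not part of the original notes; the binary alphabets and all probability values are assumed toy numbers) that builds the joint distribution of a Markov chain from the factorization in Equation 2, then checks the compact form in Equation 3 numerically.

<syntaxhighlight lang="python">
import numpy as np

# Assumed toy distributions for a binary Markov chain X -> Y -> Z.
P_x = np.array([0.6, 0.4])                # P(x)
P_y_given_x = np.array([[0.9, 0.1],       # P(y|x), rows indexed by x
                        [0.2, 0.8]])
P_z_given_y = np.array([[0.7, 0.3],       # P(z|y), rows indexed by y
                        [0.1, 0.9]])

# Equation 2: P(x, y, z) = P(z|y) * P(y|x) * P(x)
P_xyz = np.einsum('x,xy,yz->xyz', P_x, P_y_given_x, P_z_given_y)

# Equation 3: for a Markov chain, P(z|x, y) does not depend on x.
P_z_given_xy = P_xyz / P_xyz.sum(axis=2, keepdims=True)
for x in range(2):
    for y in range(2):
        assert np.allclose(P_z_given_xy[x, y], P_z_given_y[y])
print("P(z|x, y) = P(z|y) for all x, y")
</syntaxhighlight>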

We can use Markov chains to model how a signal is corrupted when passed through noisy channels. For example, if <math>X</math> is a binary signal, it can change, with a certain probability, to <math>Y</math>, and <math>Y</math> can again be corrupted to produce <math>Z</math>.
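As a sketch of this noisy-channel picture (an illustration, not from the original notes; the crossover probabilities are assumed values), the snippet below passes a binary source <math>X</math> through two cascaded binary symmetric channels to produce <math>Y</math> and <math>Z</math>, and estimates the end-to-end error probability.

<syntaxhighlight lang="python">
import random

def bsc(bit, p):
    """Binary symmetric channel: flip the bit with probability p."""
    return bit ^ (random.random() < p)

random.seed(0)
p1, p2 = 0.1, 0.2                    # assumed crossover probabilities
n = 100_000
errors = 0
for _ in range(n):
    x = random.random() < 0.5        # X: a fair binary source
    y = bsc(x, p1)                   # Y: X corrupted by the first channel
    z = bsc(y, p2)                   # Z: Y corrupted by the second channel
    errors += (x != z)

# Expected end-to-end crossover: p1*(1 - p2) + p2*(1 - p1) = 0.26 here.
print("P(X != Z) ~", errors / n)
</syntaxhighlight>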

Consider the joint probability of <math>X</math> and <math>Z</math> given <math>Y</math>, <math>P\left(x, z \mid y\right)</math>. We can express this as:

{{NumBlk|::|<math>P\left(x, z \mid y\right) = \frac{P\left(x, y, z\right)}{P\left(y\right)}</math>|{{EquationRef|4}}}}

And if <math>X \rightarrow Y \rightarrow Z</math>, we can substitute Equation 2 to get:

{{NumBlk|::|<math>P\left(x, z \mid y\right) = \frac{P\left(z\mid y\right)\cdot P\left(y\mid x\right) \cdot P\left(x\right)}{P\left(y\right)}</math>|{{EquationRef|5}}}}

Since <math>P\left(y\mid x\right)\cdot P\left(x\right) = P\left(x, y\right)</math>, we can write:

{{NumBlk|::|<math>P\left(x, z \mid y\right) = P\left(z\mid y\right)\cdot \frac{P\left(x, y\right)}{P\left(y\right)} = P\left(z\mid y\right)\cdot P\left(x\mid y\right)</math>|{{EquationRef|6}}}}

Thus, we can say that <math>X</math> and <math>Z</math> are conditionally independent given <math>Y</math>. If we think of <math>X</math> as some past event, and <math>Z</math> as some future event, then the past and future events are independent if we know the present event <math>Y</math>. Note that this property is a good definition of, as well as a useful tool for checking, Markovity.
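Because Equation 6 doubles as a test for Markovity, a small Python sketch (again with assumed toy numbers, not from the original notes) can check whether a candidate joint distribution satisfies <math>P\left(x, z \mid y\right) = P\left(x\mid y\right)\cdot P\left(z\mid y\right)</math>.

<syntaxhighlight lang="python">
import numpy as np

def is_markov(P_xyz, tol=1e-12):
    """Check X -> Y -> Z via Equation 6: P(x,z|y) = P(x|y) * P(z|y).
    Assumes P_xyz has shape (|X|, |Y|, |Z|) and every y has P(y) > 0."""
    P_y = P_xyz.sum(axis=(0, 2))                     # P(y)
    P_xz_given_y = P_xyz / P_y[None, :, None]        # P(x, z | y)
    P_x_given_y = P_xyz.sum(axis=2) / P_y[None, :]   # P(x | y)
    P_z_given_y = P_xyz.sum(axis=0) / P_y[:, None]   # P(z | y)
    product = np.einsum('xy,yz->xyz', P_x_given_y, P_z_given_y)
    return np.allclose(P_xz_given_y, product, atol=tol)

# A joint built from the factorization of Equation 2 passes the test:
P_xyz = np.einsum('x,xy,yz->xyz',
                  np.array([0.6, 0.4]),
                  np.array([[0.9, 0.1], [0.2, 0.8]]),
                  np.array([[0.7, 0.3], [0.1, 0.9]]))
print(is_markov(P_xyz))   # True
</syntaxhighlight>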

We can rewrite the joint probability <math>P\left(x, y, z\right)</math> as:

{{NumBlk|::|<math>P\left(x, y, z\right) = P\left(z\mid y\right)\cdot P\left(y\mid x\right) \cdot P\left(x\right)=\frac{P\left(z, y\right)}{P\left(y\right)}\cdot P\left(y, x\right)</math>|{{EquationRef|7}}}}

The Data Processing Inequality

Sufficient Statistics

Fano's Inequality