The Data Processing Inequality
Markovity
A Markov Chain is a random process that describes a sequence of possible events where the probability of each event depends only on the outcome of the previous event. Thus, we say that $X$, $Y$, and $Z$ form a Markov chain in this order, denoted as:
$X \rightarrow Y \rightarrow Z$    (1)
If $X \rightarrow Y \rightarrow Z$, we can write:
$P\left(x, y, z\right) = P\left(z \mid x, y\right)\cdot P\left(y \mid x\right)\cdot P\left(x\right) = P\left(z \mid y\right)\cdot P\left(y \mid x\right)\cdot P\left(x\right)$    (2)
Or in a more compact form:
$P\left(x, y, z\right) = P\left(z \mid y\right)\cdot P\left(y \mid x\right)\cdot P\left(x\right)$    (3)
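To make the factorization concrete, here is a minimal numerical sketch (the source distribution and channel matrices below are illustrative choices, not values from the text). It builds a joint distribution according to (3) and checks the Markov property $P\left(z \mid x, y\right) = P\left(z \mid y\right)$, i.e. that $Z$ depends on $X$ only through $Y$.

```python
# Illustrative sketch: build P(x,y,z) from the factorization in Eq. (3) and
# verify that P(z|x,y) = P(z|y). The numbers below are arbitrary examples.
import numpy as np

p_x = np.array([0.7, 0.3])                  # P(x): example source distribution
p_y_given_x = np.array([[0.9, 0.1],         # P(y|x): rows indexed by x
                        [0.2, 0.8]])
p_z_given_y = np.array([[0.85, 0.15],       # P(z|y): rows indexed by y
                        [0.05, 0.95]])

# Joint distribution from Eq. (3): P(x,y,z) = P(z|y) * P(y|x) * P(x)
p_xyz = np.einsum('x,xy,yz->xyz', p_x, p_y_given_x, p_z_given_y)

# P(z|x,y) computed from the joint must coincide with P(z|y) for every (x, y)
p_xy = p_xyz.sum(axis=2)                    # marginal P(x,y)
for x in range(2):
    for y in range(2):
        p_z_given_xy = p_xyz[x, y, :] / p_xy[x, y]
        assert np.allclose(p_z_given_xy, p_z_given_y[y])

print("P(z|x,y) = P(z|y): Z depends on X only through Y.")
```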
We can use Markov chains to model how a signal is corrupted when passed through noisy channels. For example, if $X$ is a binary signal, it can change with a certain probability $p$ into $Y$, and $Y$ can again be corrupted to produce $Z$.
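As a rough illustration of this noisy-channel picture, the sketch below (an assumed setup, not from the text: a fair binary source followed by two binary symmetric channels, each flipping a bit with probability $p$) simulates $X \rightarrow Y \rightarrow Z$ and compares the empirical end-to-end error rate with $2p\left(1 - p\right)$, the probability that exactly one of the two channels flips the bit.

```python
# Illustrative simulation of X -> Y -> Z through two binary symmetric channels.
# The source distribution and flip probability are assumed example values.
import numpy as np

rng = np.random.default_rng(0)
n, p = 100_000, 0.1                  # sample size and per-channel flip probability

x = rng.integers(0, 2, size=n)       # X: fair binary source
y = x ^ (rng.random(n) < p)          # Y: X corrupted by the first channel
z = y ^ (rng.random(n) < p)          # Z: Y corrupted by the second channel

# Z differs from X only when exactly one of the two channels flips the bit,
# so the end-to-end error rate should be close to 2p(1-p).
print("empirical  P(Z != X):", np.mean(x != z))
print("theoretical 2p(1-p) :", 2 * p * (1 - p))
```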
Consider the joint probability of $x$ and $z$ given $y$, $P\left(x, z \mid y\right)$. We can express this as:
$P\left(x, z \mid y\right) = \dfrac{P\left(x, y, z\right)}{P\left(y\right)}$    (4)
And if $X \rightarrow Y \rightarrow Z$, we get:
$P\left(x, z \mid y\right) = \dfrac{P\left(z \mid y\right)\cdot P\left(y \mid x\right)\cdot P\left(x\right)}{P\left(y\right)}$    (5)
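As a quick numerical check (again with illustrative binary-symmetric-channel values rather than anything from the text), the sketch below computes $P\left(x, z \mid y\right)$ once through (4), from the full joint, and once directly through (5), and confirms that the two agree when the joint is built from the Markov factorization in (3).

```python
# Illustrative check that Eq. (4) and Eq. (5) give the same P(x,z|y) for a
# Markov chain X -> Y -> Z. The source and channel values are assumed examples.
import numpy as np

p = 0.1                                        # flip probability of each channel
p_x = np.array([0.5, 0.5])                     # fair binary source
bsc = np.array([[1 - p, p],                    # P(y|x) and P(z|y) for a binary
                [p, 1 - p]])                   # symmetric channel

# Joint from Eq. (3): P(x,y,z) = P(z|y) * P(y|x) * P(x)
p_xyz = np.einsum('x,xy,yz->xyz', p_x, bsc, bsc)
p_y = p_xyz.sum(axis=(0, 2))                   # marginal P(y)

for y in range(2):
    via_eq4 = p_xyz[:, y, :] / p_y[y]                          # Eq. (4)
    via_eq5 = np.outer(bsc[:, y] * p_x, bsc[y, :]) / p_y[y]    # Eq. (5)
    assert np.allclose(via_eq4, via_eq5)

print("Eq. (4) and Eq. (5) agree on P(x,z|y).")
```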