Revision as of 05:18, 7 May 2021
== Synthetic channels ==

=== Binary erasure channel ===
Let <math>p</math> be the erasure probability of a binary erasure channel (BEC). Sending <math>X_1 = U_1 \oplus U_2</math> and <math>X_2 = U_2</math> over two independent copies of the BEC gives the following transition probabilities from <math>(U_1, U_2)</math> to <math>(Y_1, Y_2)</math>:

{| class="wikitable"
|-
|
| 00
| 0?
| 01
| ?0
| ??
| ?1
| 10
| 1?
| 11
|-
| 00
| <math>(1-p)^2</math>
| <math>p(1-p)</math>
| 0
| <math>p(1-p)</math>
| <math>p^2</math>
| 0
| 0
| 0
| 0
|-
| 01
| 0
| 0
| 0
| 0
| <math>p^2</math>
| <math>p(1-p)</math>
| 0
| <math>p(1-p)</math>
| <math>(1-p)^2</math>
|-
| 10
| 0
| 0
| 0
| <math>p(1-p)</math>
| <math>p^2</math>
| 0
| <math>(1-p)^2</math>
| <math>p(1-p)</math>
| 0
|-
| 11
| 0
| <math>p(1-p)</math>
| <math>(1-p)^2</math>
| 0
| <math>p^2</math>
| <math>p(1-p)</math>
| 0
| 0
| 0
|}
Averaging over a uniform <math>U_2</math> yields the synthetic channel <math>W^-</math> from <math>U_1</math> to <math>(Y_1, Y_2)</math>:

{| class="wikitable"
|-
|
| 00
| 0?
| 01
| ?0
| ??
| ?1
| 10
| 1?
| 11
|-
| 0
| <math>0.5(1-p)^2</math>
| <math>0.5p(1-p)</math>
| 0
| <math>0.5p(1-p)</math>
| <math>p^2</math>
| <math>0.5p(1-p)</math>
| 0
| <math>0.5p(1-p)</math>
| <math>0.5(1-p)^2</math>
|-
| 1
| 0
| <math>0.5p(1-p)</math>
| <math>0.5(1-p)^2</math>
| <math>0.5p(1-p)</math>
| <math>p^2</math>
| <math>0.5p(1-p)</math>
| <math>0.5(1-p)^2</math>
| <math>0.5p(1-p)</math>
| 0
|}
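The table above can be checked by brute-force enumeration. The following Python sketch (the function name and the <code>Fraction</code>-based bookkeeping are my own choices, not from the text) enumerates the uniform <math>U_2</math> and the two independent erasure events:

```python
from fractions import Fraction
from itertools import product

def bec_w_minus(p):
    """Transition probabilities {(u1, 'y1y2'): prob} of W^- over a BEC(p)."""
    table = {}
    for u1, u2 in product([0, 1], repeat=2):
        x1, x2 = u1 ^ u2, u2                       # polar transform
        for e1, e2 in product([0, 1], repeat=2):   # 1 marks an erasure
            y1 = '?' if e1 else str(x1)
            y2 = '?' if e2 else str(x2)
            pr = Fraction(1, 2)                    # uniform U2, conditioned on u1
            pr *= p if e1 else (1 - p)
            pr *= p if e2 else (1 - p)
            key = (u1, y1 + y2)
            table[key] = table.get(key, Fraction(0)) + pr
    return table

p = Fraction(1, 4)
t = bec_w_minus(p)
print(t[(0, '00')])   # 0.5(1-p)^2 = 9/32
print(t[(0, '??')])   # p^2 = 1/16
```

Exact rational arithmetic makes the comparison with the symbolic entries unambiguous.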
=== Binary symmetric channel ===
Let <math>p</math> be the crossover probability of a binary symmetric channel (BSC). Sending <math>X_1 = U_1 \oplus U_2</math> and <math>X_2 = U_2</math> over two independent copies of the BSC gives the following transition probabilities from <math>(U_1, U_2)</math> to <math>(Y_1, Y_2)</math>:

{| class="wikitable"
|-
|
| 00
| 01
| 10
| 11
|-
| 00
| <math>(1-p)^2</math>
| <math>p(1-p)</math>
| <math>p(1-p)</math>
| <math>p^2</math>
|-
| 01
| <math>p^2</math>
| <math>p(1-p)</math>
| <math>p(1-p)</math>
| <math>(1-p)^2</math>
|-
| 10
| <math>p(1-p)</math>
| <math>p^2</math>
| <math>(1-p)^2</math>
| <math>p(1-p)</math>
|-
| 11
| <math>p(1-p)</math>
| <math>(1-p)^2</math>
| <math>p^2</math>
| <math>p(1-p)</math>
|}
Averaging over a uniform <math>U_2</math> yields the synthetic channel <math>W^-</math> from <math>U_1</math> to <math>(Y_1, Y_2)</math>:

{| class="wikitable"
|-
|
| 00
| 01
| 10
| 11
|-
| 0
| <math>0.5[(1-p)^2 + p^2]</math>
| <math>p(1-p)</math>
| <math>p(1-p)</math>
| <math>0.5[(1-p)^2 + p^2]</math>
|-
| 1
| <math>p(1-p)</math>
| <math>0.5[(1-p)^2 + p^2]</math>
| <math>0.5[(1-p)^2 + p^2]</math>
| <math>p(1-p)</math>
|}
It turns out that the channel reduces to another BSC. To see this, consider the case where <math>(Y_1, Y_2) = (0, 0)</math>. To minimize the error probability, we must decide the value of <math>U_1</math> that has the greater likelihood. For <math>p < 0.5</math>, <math>0.5[(1-p)^2 + p^2] > p(1-p)</math>, so that the maximum-likelihood (ML) decision is <math>\hat{U}_1 = 0</math>. Using the same argument, we see that the ML decision is <math>\hat{U}_1 = 1</math> for <math>(Y_1, Y_2) = (0, 1)</math>. More generally, the receiver decision is to set <math>\hat{U}_1 = Y_1 \oplus Y_2</math>. Indeed, if the crossover probability is low, it is very likely that <math>Y_1 = X_1</math> and <math>Y_2 = X_2</math>. Solving <math>X_1 = U_1 \oplus U_2</math> for <math>U_1 = X_1 \oplus X_2</math> and plugging in the received values produces the desired result.
We just saw that despite having a quaternary output alphabet, <math>W^-</math> is equivalent to a BSC. To get the effective crossover probability of this synthetic channel, we just determine the probability that the ML decision is not the same as <math>U_1</math>. Since <math>Y_1 \oplus Y_2 = U_1</math> exactly when an even number of crossovers occurs, the decision fails when exactly one of the two channel uses flips its bit. This will happen with probability <math>2p(1-p)</math>. Intuitively, <math>W^-</math> should have less mutual information compared to the original channel since an independent source <math>U_2</math> interferes with <math>U_1</math> on top of the effects of the BSC.
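The effective crossover probability can be confirmed by exhaustive enumeration. A minimal sketch (the function name is my own; exact rationals via <code>Fraction</code> avoid floating-point noise):

```python
from fractions import Fraction
from itertools import product

def w_minus_error(p):
    """P(Y1 xor Y2 != U1) over the BSC-based W^- with uniform (U1, U2)."""
    err = Fraction(0)
    for u1, u2, e1, e2 in product([0, 1], repeat=4):   # e's are crossovers
        pr = Fraction(1, 4)           # uniform (U1, U2)
        pr *= p if e1 else (1 - p)
        pr *= p if e2 else (1 - p)
        y1 = (u1 ^ u2) ^ e1           # X1 = U1 xor U2
        y2 = u2 ^ e2                  # X2 = U2
        if (y1 ^ y2) != u1:           # ML decision U1-hat = Y1 xor Y2 fails
            err += pr
    return err

p = Fraction(1, 10)
assert w_minus_error(p) == 2 * p * (1 - p)   # effective crossover 2p(1-p)
```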
'''Checkpoint''': Show that for <math>0 < p < 0.5</math>, <math>2p(1-p) > p</math>.
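Numerically, the degradation can also be seen in capacity terms: a BSC with crossover <math>q</math> and uniform inputs has mutual information <math>1 - h(q)</math>, and <math>2p(1-p)</math> is closer to <math>0.5</math> than <math>p</math> is. A small Python check (my own sketch, not from the text):

```python
from math import log2

def h(q):
    """Binary entropy function in bits."""
    return 0.0 if q in (0, 1) else -q * log2(q) - (1 - q) * log2(1 - q)

# Capacity of a BSC(q) with uniform inputs is 1 - h(q).
# W^- is a BSC with crossover 2p(1-p) > p, so its capacity is no larger.
for p in (0.01, 0.1, 0.25, 0.4):
    assert 2 * p * (1 - p) > p                     # checkpoint inequality
    assert 1 - h(2 * p * (1 - p)) <= 1 - h(p)      # W^- is worse than W
```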
Now, let us consider the other synthetic channel <math>W^+</math>. This synthetic channel has a greater mutual information compared to the original BSC due to the "stolen" information about <math>U_1</math>. As with the previous synthetic channel, we can produce the transition probability matrix from <math>U_2</math> to <math>(U_1, Y_1, Y_2)</math>, which now has an eight-element output alphabet. To facilitate the discussion, the columns of the table below have been grouped according to their entries:
{| class="wikitable"
|-
|
| 000
| 110
| 001
| 010
| 100
| 111
| 011
| 101
|-
| 0
| <math>0.5(1-p)^2</math>
| <math>0.5(1-p)^2</math>
| <math>0.5p(1-p)</math>
| <math>0.5p(1-p)</math>
| <math>0.5p(1-p)</math>
| <math>0.5p(1-p)</math>
| <math>0.5p^2</math>
| <math>0.5p^2</math>
|-
| 1
| <math>0.5p^2</math>
| <math>0.5p^2</math>
| <math>0.5p(1-p)</math>
| <math>0.5p(1-p)</math>
| <math>0.5p(1-p)</math>
| <math>0.5p(1-p)</math>
| <math>0.5(1-p)^2</math>
| <math>0.5(1-p)^2</math>
|}
In the transition probability matrix, we see that there are four columns where the receiver will not be able to tell whether <math>U_2 = 0</math> or <math>U_2 = 1</math>. We can call such a scenario an ''erasure'', and say that the transmitted bit is erased with probability <math>4\times 0.5p(1-p) = 2p(1-p)</math>. Clearly, we cannot reduce this synthetic channel to a BSC since there are no erasures in a BSC. Regardless, we can still come up with the following more manageable reduction:
{| class="wikitable"
|-
|
| 000,110
| 001,010,100,111
| 011,101
|-
| 0
| <math>(1-p)^2</math>
| <math>2p(1-p)</math>
| <math>p^2</math>
|-
| 1
| <math>p^2</math>
| <math>2p(1-p)</math>
| <math>(1-p)^2</math>
|}
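The reduction can be verified by enumeration. In the sketch below (function and label names are my own), the two looks <math>Y_1 \oplus U_1</math> and <math>Y_2</math> act as independent estimates of <math>U_2</math>; agreement on the correct value, disagreement (the erasure), and agreement on the wrong value correspond to the three column groups:

```python
from fractions import Fraction
from itertools import product

def w_plus_reduced(p):
    """Reduced W^+ transition probabilities {(u2, group): prob} for a BSC(p)."""
    table = {(u2, g): Fraction(0) for u2 in (0, 1)
             for g in ('agree-correct', 'disagree', 'agree-wrong')}
    for u1, u2, e1, e2 in product([0, 1], repeat=4):
        pr = Fraction(1, 2)           # uniform U1, conditioned on u2
        pr *= p if e1 else (1 - p)
        pr *= p if e2 else (1 - p)
        y1 = (u1 ^ u2) ^ e1           # X1 = U1 xor U2
        y2 = u2 ^ e2                  # X2 = U2
        a, b = y1 ^ u1, y2            # two independent estimates of U2
        if a == b == u2:
            g = 'agree-correct'
        elif a != b:
            g = 'disagree'            # the erasure event
        else:
            g = 'agree-wrong'
        table[(u2, g)] += pr
    return table

p = Fraction(1, 5)
t = w_plus_reduced(p)
assert t[(0, 'agree-correct')] == (1 - p) ** 2   # column 000,110
assert t[(0, 'disagree')] == 2 * p * (1 - p)     # the four erasure columns
assert t[(0, 'agree-wrong')] == p ** 2           # column 011,101
```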