== Synthetic channels ==

=== Binary erasure channel ===
Let <math>\epsilon</math> be the erasure probability of a binary erasure channel (BEC). The combined channel sends <math>x_1 = u_1 \oplus u_2</math> and <math>x_2 = u_2</math> through two independent copies of the BEC, so the transition probabilities from <math>(U_1, U_2)</math> to <math>(Y_1, Y_2)</math> are as follows, where <math>?</math> denotes an erasure:
{| class="wikitable"
|-
|
| 00
| 0?
| 01
| ?0
| ??
| ?1
| 10
| 1?
| 11
|-
| 00
| <math>(1-\epsilon)^2</math>
| <math>(1-\epsilon)\epsilon</math>
| 0
| <math>\epsilon(1-\epsilon)</math>
| <math>\epsilon^2</math>
| 0
| 0
| 0
| 0
|-
| 01
| 0
| 0
| 0
| 0
| <math>\epsilon^2</math>
| <math>\epsilon(1-\epsilon)</math>
| 0
| <math>(1-\epsilon)\epsilon</math>
| <math>(1-\epsilon)^2</math>
|-
| 10
| 0
| 0
| 0
| <math>\epsilon(1-\epsilon)</math>
| <math>\epsilon^2</math>
| 0
| <math>(1-\epsilon)^2</math>
| <math>(1-\epsilon)\epsilon</math>
| 0
|-
| 11
| 0
| <math>(1-\epsilon)\epsilon</math>
| <math>(1-\epsilon)^2</math>
| 0
| <math>\epsilon^2</math>
| <math>\epsilon(1-\epsilon)</math>
| 0
| 0
| 0
|}

Averaging over a uniform <math>U_2</math> yields the synthetic channel <math>W^{-}: U_1 \rightarrow (Y_1, Y_2)</math>:

{| class="wikitable"
|-
|
| 00
| 0?
| 01
| ?0
| ??
| ?1
| 10
| 1?
| 11
|-
| 0
| <math>\tfrac{(1-\epsilon)^2}{2}</math>
| <math>\tfrac{(1-\epsilon)\epsilon}{2}</math>
| 0
| <math>\tfrac{\epsilon(1-\epsilon)}{2}</math>
| <math>\epsilon^2</math>
| <math>\tfrac{\epsilon(1-\epsilon)}{2}</math>
| 0
| <math>\tfrac{(1-\epsilon)\epsilon}{2}</math>
| <math>\tfrac{(1-\epsilon)^2}{2}</math>
|-
| 1
| 0
| <math>\tfrac{(1-\epsilon)\epsilon}{2}</math>
| <math>\tfrac{(1-\epsilon)^2}{2}</math>
| <math>\tfrac{\epsilon(1-\epsilon)}{2}</math>
| <math>\epsilon^2</math>
| <math>\tfrac{\epsilon(1-\epsilon)}{2}</math>
| <math>\tfrac{(1-\epsilon)^2}{2}</math>
| <math>\tfrac{(1-\epsilon)\epsilon}{2}</math>
| 0
|}
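The entries above follow mechanically from the transform <math>x_1 = u_1 \oplus u_2</math>, <math>x_2 = u_2</math>. As a sanity check, here is a minimal Python sketch (our own illustration; the names <code>bec</code>, <code>w2</code>, and <code>w_minus</code> are not from this article) that rebuilds the <math>W^{-}</math> table numerically for an example value of <math>\epsilon</math>:

<syntaxhighlight lang="python">
eps = 0.25  # example erasure probability (any value in [0, 1] works)

def bec(y, x, eps):
    """P(y | x) for a BEC: the input is received intact or erased ('?')."""
    if y == '?':
        return eps
    return (1 - eps) if y == x else 0.0

def w2(y1, y2, u1, u2, eps):
    """Combined channel: x1 = u1 XOR u2, x2 = u2, over two independent BECs."""
    x1, x2 = u1 ^ u2, u2
    return bec(y1, str(x1), eps) * bec(y2, str(x2), eps)

# Output pairs in the same column order as the table above.
outputs = [a + b for a in '0?1' for b in '0?1']

# W-: U1 -> (Y1, Y2), averaging over a uniform U2.
w_minus = {u1: {y: 0.5 * sum(w2(y[0], y[1], u1, u2, eps) for u2 in (0, 1))
                for y in outputs}
           for u1 in (0, 1)}

for u1, row in w_minus.items():
    assert abs(sum(row.values()) - 1) < 1e-12  # each row is a distribution
    print(u1, {y: round(q, 4) for y, q in row.items() if q > 0})
</syntaxhighlight>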
=== Binary symmetric channel ===
Let <math>p</math> be the crossover probability of a binary symmetric channel (BSC). As before, <math>x_1 = u_1 \oplus u_2</math> and <math>x_2 = u_2</math> are sent through two independent copies of the channel, giving the following transition probabilities from <math>(U_1, U_2)</math> to <math>(Y_1, Y_2)</math>:
{| class="wikitable"
|-
|
| 00
| 01
| 10
| 11
|-
| 00
| <math>(1-p)^2</math>
| <math>(1-p)p</math>
| <math>p(1-p)</math>
| <math>p^2</math>
|-
| 01
| <math>p^2</math>
| <math>p(1-p)</math>
| <math>(1-p)p</math>
| <math>(1-p)^2</math>
|-
| 10
| <math>p(1-p)</math>
| <math>p^2</math>
| <math>(1-p)^2</math>
| <math>(1-p)p</math>
|-
| 11
| <math>(1-p)p</math>
| <math>(1-p)^2</math>
| <math>p^2</math>
| <math>p(1-p)</math>
|}

Averaging over a uniform <math>U_2</math> yields the synthetic channel <math>W^{-}: U_1 \rightarrow (Y_1, Y_2)</math>:

{| class="wikitable"
|-
|
| 00
| 01
| 10
| 11
|-
| 0
| <math>\tfrac{(1-p)^2 + p^2}{2}</math>
| <math>p(1-p)</math>
| <math>p(1-p)</math>
| <math>\tfrac{(1-p)^2 + p^2}{2}</math>
|-
| 1
| <math>p(1-p)</math>
| <math>\tfrac{(1-p)^2 + p^2}{2}</math>
| <math>\tfrac{(1-p)^2 + p^2}{2}</math>
| <math>p(1-p)</math>
|}
It turns out that the channel <math>W^{-}: U_1 \rightarrow (Y_1, Y_2)</math> reduces to another BSC. To see this, consider the case where <math>p < 1/2</math>. To minimize the error probability, we must decide the value of <math>u_1</math> that has the greater likelihood. For <math>(y_1, y_2) = (0,0)</math>, <math>W^{-}(0,0 \mid 0) = \tfrac{(1-p)^2 + p^2}{2} > p(1-p) = W^{-}(0,0 \mid 1)</math>, so that the maximum-likelihood (ML) decision is <math>\hat{u}_1 = 0</math>. Using the same argument, we see that the ML decision is <math>\hat{u}_1 = 1</math> for <math>(y_1, y_2) = (0,1)</math>. More generally, the receiver decision is to set <math>\hat{u}_1 = y_1 \oplus y_2</math>. Indeed, if the crossover probability is low, it is very likely that <math>y_1 = x_1</math> and <math>y_2 = x_2</math>. Solving <math>x_1 = u_1 \oplus u_2</math> and <math>x_2 = u_2</math> for <math>u_1</math> gives <math>u_1 = x_1 \oplus x_2</math>, and plugging the observations back in produces the desired result.
We just saw that despite having a quaternary output alphabet, <math>W^{-}</math> is equivalent to a BSC. To get the effective crossover probability of this synthetic channel, we just determine the probability that the ML decision <math>\hat{u}_1 = y_1 \oplus y_2</math> is not the same as <math>u_1</math>. This will happen with probability <math>2p(1-p)</math>, the probability that exactly one of the two channel uses flips its input. Intuitively, <math>W^{-}</math> should have less mutual information compared to the original channel since an independent source <math>U_2</math> interferes with <math>U_1</math> on top of the effects of the BSC.
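As a quick illustration (our own, with an assumed <math>p = 0.1</math>), the Monte Carlo sketch below estimates the error rate of the decision rule <math>\hat{u}_1 = y_1 \oplus y_2</math> and compares it against <math>2p(1-p)</math>:

<syntaxhighlight lang="python">
import random

p = 0.1           # example crossover probability (assumed, not from the text)
trials = 200_000  # Monte Carlo sample size

def bsc(x, p):
    """Flip the bit with probability p."""
    return x ^ (random.random() < p)

errors = 0
for _ in range(trials):
    u1, u2 = random.randint(0, 1), random.randint(0, 1)
    x1, x2 = u1 ^ u2, u2          # polar transform
    y1, y2 = bsc(x1, p), bsc(x2, p)
    u1_hat = y1 ^ y2              # ML decision for W-
    errors += (u1_hat != u1)

print(errors / trials)       # empirical error rate
print(2 * p * (1 - p))       # predicted effective crossover probability
</syntaxhighlight>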
Checkpoint: Show that for <math>0 \le p \le 1/2</math>, <math>p \le 2p(1-p) \le 1/2</math>.
Now, let us consider the other synthetic channel <math>W^{+}: U_2 \rightarrow (U_1, Y_1, Y_2)</math>. This synthetic channel has a greater mutual information compared to the original BSC due to the "stolen" information about <math>U_1</math>. As with the previous synthetic channel, we can produce the transition probability matrix from <math>U_2</math> to <math>(U_1, Y_1, Y_2)</math>, which now has an eight-element output alphabet. To facilitate the discussion, the columns of the table below have been grouped according to their entries:
{| class="wikitable"
|-
|
| 000
| 110
| 001
| 010
| 100
| 111
| 011
| 101
|-
| 0
| <math>\tfrac{(1-p)^2}{2}</math>
| <math>\tfrac{(1-p)^2}{2}</math>
| <math>\tfrac{p(1-p)}{2}</math>
| <math>\tfrac{p(1-p)}{2}</math>
| <math>\tfrac{p(1-p)}{2}</math>
| <math>\tfrac{p(1-p)}{2}</math>
| <math>\tfrac{p^2}{2}</math>
| <math>\tfrac{p^2}{2}</math>
|-
| 1
| <math>\tfrac{p^2}{2}</math>
| <math>\tfrac{p^2}{2}</math>
| <math>\tfrac{p(1-p)}{2}</math>
| <math>\tfrac{p(1-p)}{2}</math>
| <math>\tfrac{p(1-p)}{2}</math>
| <math>\tfrac{p(1-p)}{2}</math>
| <math>\tfrac{(1-p)^2}{2}</math>
| <math>\tfrac{(1-p)^2}{2}</math>
|}
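To make the comparison concrete, the following sketch (our own; it assumes uniform inputs and an example <math>p = 0.1</math>) computes the mutual informations directly from the grouped entries above. It confirms the ordering <math>I(W^{-}) \le I(W) \le I(W^{+})</math> and, as a known property of the polar transform, the conservation <math>I(W^{-}) + I(W^{+}) = 2I(W)</math>:

<syntaxhighlight lang="python">
from math import log2

p = 0.1  # example crossover probability (an assumption for illustration)

def h2(q):
    """Binary entropy function in bits."""
    return 0.0 if q in (0.0, 1.0) else -q * log2(q) - (1 - q) * log2(1 - q)

# Symmetric capacities with uniform inputs:
i_w       = 1 - h2(p)                # original BSC
i_w_minus = 1 - h2(2 * p * (1 - p))  # W-: a BSC with crossover 2p(1-p)

# I(W+) from the table above: I(U2; U1, Y1, Y2) = H(output) - H(output | U2).
# Columns are ordered 000, 110 | 001, 010, 100, 111 | 011, 101 as in the table.
row0 = [(1-p)**2/2]*2 + [p*(1-p)/2]*4 + [p**2/2]*2
row1 = [p**2/2]*2     + [p*(1-p)/2]*4 + [(1-p)**2/2]*2
marg = [(a + b) / 2 for a, b in zip(row0, row1)]  # marginal under uniform U2
H_out      = -sum(q * log2(q) for q in marg if q > 0)
H_out_cond = -sum(q * log2(q) for q in row0 if q > 0)  # both rows have equal entropy
i_w_plus   = H_out - H_out_cond

print(i_w_minus, i_w, i_w_plus)       # I(W-) <= I(W) <= I(W+)
print(i_w_minus + i_w_plus, 2 * i_w)  # conservation: the sums match
</syntaxhighlight>

For <math>p = 0.1</math>, this prints roughly <math>0.320</math>, <math>0.531</math>, and <math>0.742</math> bits, with the two sums agreeing at about <math>1.062</math>.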