- Activity: Mutual Information and Channel Capacity
- Instructions: In this activity, you are tasked to:
  - Walk through the examples.
  - Calculate the channel capacity of different channel models.
- Should you have any questions, clarifications, or issues, please contact your instructor as soon as possible.
Example 1: Mutual Information
Given the following probabilities:
$X$: Blood Type, $Y$: Chance for Skin Cancer

| $p(x, y)$ | A | B | AB | O |
| --- | --- | --- | --- | --- |
| Very Low | 1/8 | 1/16 | 1/32 | 1/32 |
| Low | 1/16 | 1/8 | 1/32 | 1/32 |
| Medium | 1/16 | 1/16 | 1/16 | 1/16 |
| High | 1/4 | 0 | 0 | 0 |
To get the entropies of $X$ and $Y$, we need to calculate the marginal probabilities by summing the joint table along each axis:

$$p(x) = \sum_{y} p(x, y) \;\Rightarrow\; p(X) = \left(\tfrac{1}{2}, \tfrac{1}{4}, \tfrac{1}{8}, \tfrac{1}{8}\right) \quad (1)$$

$$p(y) = \sum_{x} p(x, y) \;\Rightarrow\; p(Y) = \left(\tfrac{1}{4}, \tfrac{1}{4}, \tfrac{1}{4}, \tfrac{1}{4}\right) \quad (2)$$
And since:

$$H(X) = -\sum_{x} p(x) \log_2 p(x) \quad (3)$$

We get:

$$H(X) = \tfrac{1}{2}\log_2 2 + \tfrac{1}{4}\log_2 4 + \tfrac{1}{8}\log_2 8 + \tfrac{1}{8}\log_2 8 = \tfrac{7}{4} \text{ bits} \quad (4)$$

$$H(Y) = 4 \cdot \tfrac{1}{4}\log_2 4 = 2 \text{ bits} \quad (5)$$
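The marginals and entropies above can be checked numerically. Below is a minimal Python sketch (the variable and function names are mine, not part of the activity) that builds the joint table with exact fractions, sums out each axis, and evaluates the entropies in bits:

```python
from fractions import Fraction
from math import log2

F = Fraction
# Joint distribution p(x, y): rows = Y (cancer risk), columns = X = (A, B, AB, O)
joint = [
    [F(1, 8), F(1, 16), F(1, 32), F(1, 32)],   # Very Low
    [F(1, 16), F(1, 8), F(1, 32), F(1, 32)],   # Low
    [F(1, 16), F(1, 16), F(1, 16), F(1, 16)],  # Medium
    [F(1, 4), F(0), F(0), F(0)],               # High
]

def entropy(probs):
    """Shannon entropy in bits, skipping zero-probability terms."""
    return -sum(float(p) * log2(float(p)) for p in probs if p > 0)

# Marginals: p(x) sums over the rows, p(y) sums over the columns (equations (1), (2))
p_x = [sum(row[i] for row in joint) for i in range(4)]
p_y = [sum(row) for row in joint]

print(p_x)           # (1/2, 1/4, 1/8, 1/8)
print(p_y)           # (1/4, 1/4, 1/4, 1/4)
print(entropy(p_x))  # H(X) = 1.75 bits
print(entropy(p_y))  # H(Y) = 2.0 bits
```

Using `Fraction` for the table keeps the marginals exact; only the final entropies are evaluated in floating point.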
Calculating the conditional entropies using $H(X \mid Y) = \sum_{y} p(y)\, H(X \mid Y = y)$ and $H(Y \mid X) = \sum_{x} p(x)\, H(Y \mid X = x)$:

$$H(X \mid Y) = \tfrac{1}{4}\left(\tfrac{7}{4} + \tfrac{7}{4} + 2 + 0\right) = \tfrac{11}{8} \text{ bits} \quad (6)$$

$$H(Y \mid X) = \tfrac{1}{2}\cdot\tfrac{7}{4} + \tfrac{1}{4}\cdot\tfrac{3}{2} + \tfrac{1}{8}\cdot\tfrac{3}{2} + \tfrac{1}{8}\cdot\tfrac{3}{2} = \tfrac{13}{8} \text{ bits} \quad (7)$$
Note that $H(X \mid Y) \neq H(Y \mid X)$. Calculating the mutual information, we get:

$$I(X; Y) = H(X) - H(X \mid Y) = \tfrac{7}{4} - \tfrac{11}{8} = \tfrac{3}{8} \text{ bits} \quad (8)$$

Or equivalently:

$$I(X; Y) = H(Y) - H(Y \mid X) = 2 - \tfrac{13}{8} = \tfrac{3}{8} \text{ bits} \quad (9)$$
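The conditional entropies and the mutual information can also be verified in code. This is a self-contained Python sketch (names are my own): it conditions each row of the joint table to get $H(X \mid Y)$, uses the chain rule $H(Y \mid X) = H(X, Y) - H(X)$ for the other direction, and confirms both routes to $I(X; Y)$ agree:

```python
from fractions import Fraction
from math import log2

F = Fraction
# Joint p(x, y): rows indexed by Y, columns by X = (A, B, AB, O)
joint = [
    [F(1, 8), F(1, 16), F(1, 32), F(1, 32)],   # Very Low
    [F(1, 16), F(1, 8), F(1, 32), F(1, 32)],   # Low
    [F(1, 16), F(1, 16), F(1, 16), F(1, 16)],  # Medium
    [F(1, 4), F(0), F(0), F(0)],               # High
]

def entropy(probs):
    """Shannon entropy in bits, skipping zero-probability terms."""
    return -sum(float(p) * log2(float(p)) for p in probs if p > 0)

p_y = [sum(row) for row in joint]
p_x = [sum(row[i] for row in joint) for i in range(4)]

# H(X|Y) = sum_y p(y) * H(X | Y=y): condition each row, then average
h_x_given_y = sum(float(py) * entropy([pxy / py for pxy in row])
                  for py, row in zip(p_y, joint) if py > 0)

# H(Y|X) via the chain rule: H(X, Y) = H(X) + H(Y|X)
h_joint = entropy([p for row in joint for p in row])
h_y_given_x = h_joint - entropy(p_x)

# I(X; Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)
mi = entropy(p_x) - h_x_given_y

print(h_x_given_y)  # 1.375 = 11/8 bits
print(h_y_given_x)  # 1.625 = 13/8 bits
print(mi)           # 0.375 = 3/8 bits
```

The chain-rule route for $H(Y \mid X)$ avoids conditioning the columns separately; either way gives the same 13/8 bits.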
Example 2: A Noiseless Binary Channel
Example 3: A Noisy Channel with Non-Overlapping Outputs
Example 4: The Binary Symmetric Channel (BSC)
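The worked solutions for Examples 2–4 are not included here, but the standard results can be sketched: a noiseless binary channel and a noisy channel with non-overlapping outputs both have capacity $C = 1$ bit, while the BSC with crossover probability $p$ has $C = 1 - H(p)$ bits, where $H(p)$ is the binary entropy function. A minimal Python check (function names are my own, not from the activity):

```python
from math import log2

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1 - h2(p)

print(bsc_capacity(0.0))  # 1.0: with no crossovers the BSC is the noiseless channel
print(bsc_capacity(0.5))  # 0.0: the output is independent of the input
print(bsc_capacity(0.11))
```

Note that $C$ is symmetric about $p = 1/2$: a BSC that flips every bit ($p = 1$) is as useful as a noiseless one, since the flips can be undone.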
Sources
- Yao Xie's slides on Entropy and Mutual Information