161-A3.1
Revision as of 09:39, 29 September 2020
Activity: Mutual Information and Channel Capacity

Instructions: In this activity, you are tasked to:
- Walk through the examples.
- Calculate the channel capacity of different channel models.

Should you have any questions, clarifications, or issues, please contact your instructor as soon as possible.
Example 1: Mutual Information
Given the following probabilities:
X: Blood Type, Y: Chance for Skin Cancer

|          | A    | B    | AB   | O    |
|----------|------|------|------|------|
| Very Low | 1/8  | 1/16 | 1/32 | 1/32 |
| Low      | 1/16 | 1/8  | 1/32 | 1/32 |
| Medium   | 1/16 | 1/16 | 1/16 | 1/16 |
| High     | 1/4  | 0    | 0    | 0    |
To get the entropies of X and Y, we need to calculate the marginal probabilities:

P_X = {1/2, 1/4, 1/8, 1/8}        (1)

P_Y = {1/4, 1/4, 1/4, 1/4}        (2)

Thus,

H(X) = (1/2)log2(2) + (1/4)log2(4) + (1/8)log2(8) + (1/8)log2(8) = 7/4 bits        (3)

Since P_Y is uniform over four outcomes, H(Y) = log2(4) = 2 bits.
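As a check, the marginals and entropies above can be computed directly from the joint table. The sketch below (variable names are mine, not from the activity) also evaluates the mutual information via the identity I(X; Y) = H(X) + H(Y) − H(X, Y), which for this table comes out to 3/8 bit:

```python
from math import log2

# Joint distribution P(X, Y) from the table above:
# rows = Y (chance for skin cancer), columns = X (blood type A, B, AB, O).
joint = {
    "Very Low": [1/8, 1/16, 1/32, 1/32],
    "Low":      [1/16, 1/8, 1/32, 1/32],
    "Medium":   [1/16, 1/16, 1/16, 1/16],
    "High":     [1/4, 0, 0, 0],
}

def entropy(probs):
    """Shannon entropy in bits; terms with p = 0 contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Marginals: P_X sums each column, P_Y sums each row.
p_x = [sum(col) for col in zip(*joint.values())]   # [1/2, 1/4, 1/8, 1/8]
p_y = [sum(row) for row in joint.values()]         # [1/4, 1/4, 1/4, 1/4]

h_x = entropy(p_x)                                          # H(X) = 7/4
h_y = entropy(p_y)                                          # H(Y) = 2
h_xy = entropy([p for row in joint.values() for p in row])  # H(X, Y) = 27/8

# Mutual information: I(X; Y) = H(X) + H(Y) - H(X, Y)
i_xy = h_x + h_y - h_xy

print(h_x, h_y, i_xy)  # 1.75 2.0 0.375
```

All probabilities here are dyadic rationals, so the floating-point sums happen to be exact; in general, compare entropies with a tolerance rather than `==`.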
Example 2: A Noiseless Binary Channel
Example 3: A Noisy Channel with Non-Overlapping Outputs
Example 4: The Binary Symmetric Channel (BSC)