161-A3.1
- Activity: Mutual Information and Channel Capacity
- Instructions: In this activity, you are tasked to:
- Walk through the examples.
- Calculate the channel capacity of different channel models.
- Should you have any questions, clarifications, or issues, please contact your instructor as soon as possible.
Example 1: Mutual Information
Given the following joint probabilities P(X, Y), where X is the blood type with values {A, B, AB, O} (columns) and Y takes the values {Very Low, Low, Medium, High} (rows):

| P(X, Y) | A | B | AB | O |
|---|---|---|---|---|
| Very Low | 1/8 | 1/16 | 1/32 | 1/32 |
| Low | 1/16 | 1/8 | 1/32 | 1/32 |
| Medium | 1/16 | 1/16 | 1/16 | 1/16 |
| High | 1/4 | 0 | 0 | 0 |
To get the entropies of X and Y, we need to calculate the marginal probabilities:
<math>P_X = \left( P(A), P(B), P(AB), P(O) \right) = \left( \frac{1}{2}, \frac{1}{4}, \frac{1}{8}, \frac{1}{8} \right)</math>
(1)
<math>P_Y = \left( P(\text{Very Low}), P(\text{Low}), P(\text{Medium}), P(\text{High}) \right) = \left( \frac{1}{4}, \frac{1}{4}, \frac{1}{4}, \frac{1}{4} \right)</math>
(2)
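The marginal distributions in Equations (1) and (2) can be checked numerically. The following is a minimal sketch, not part of the original activity, which assumes NumPy is available; the array name P_XY is illustrative and simply holds the joint probability table above, with rows indexed by Y and columns by X.

```python
import numpy as np

# Joint probability table from Example 1.
# Rows: Y = Very Low, Low, Medium, High; columns: X = A, B, AB, O.
P_XY = np.array([
    [1/8,  1/16, 1/32, 1/32],
    [1/16, 1/8,  1/32, 1/32],
    [1/16, 1/16, 1/16, 1/16],
    [1/4,  0,    0,    0   ],
])

P_X = P_XY.sum(axis=0)  # sum over rows (Y) -> marginal of X
P_Y = P_XY.sum(axis=1)  # sum over columns (X) -> marginal of Y

print(P_X)  # [0.5   0.25  0.125 0.125], matching Eq. (1)
print(P_Y)  # [0.25 0.25 0.25 0.25], matching Eq. (2)
```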
And since the entropy of a discrete random variable X is defined as:
<math>H(X) = -\sum_{x} p(x)\log_2 p(x)</math>
(3)
We get:
<math>H(X) = \frac{1}{2}\log_2 2 + \frac{1}{4}\log_2 4 + \frac{1}{8}\log_2 8 + \frac{1}{8}\log_2 8 = \frac{7}{4}\ \text{bits}</math>
(4)
<math>H(Y) = 4\cdot\frac{1}{4}\log_2 4 = 2\ \text{bits}</math>
(5)
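As a quick check of Equations (4) and (5), the entropies can be computed directly from the marginals. This sketch assumes NumPy; the helper entropy() is hypothetical and not defined elsewhere in this activity.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits; zero-probability terms contribute nothing."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

print(entropy([1/2, 1/4, 1/8, 1/8]))  # 1.75 bits = 7/4, matching Eq. (4)
print(entropy([1/4, 1/4, 1/4, 1/4]))  # 2.0 bits, matching Eq. (5)
```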
Calculating the conditional entropies using:
<math>H(X\mid Y) = -\sum_{x,y} p(x,y)\log_2 p(x\mid y) = \sum_{x,y} p(x,y)\log_2\frac{p(y)}{p(x,y)}</math>
(6)
We get:
<math>\begin{align}
H\left(X\mid Y\right)
= &\ \frac{1}{8}\log_2\frac{1/4}{1/8} + \frac{1}{16}\log_2\frac{1/4}{1/16} + \frac{1}{16}\log_2\frac{1/4}{1/16} + \frac{1}{4}\log_2\frac{1/4}{1/4}\\
& + \frac{1}{16}\log_2\frac{1/4}{1/16} + \frac{1}{8}\log_2\frac{1/4}{1/8} + \frac{1}{16}\log_2\frac{1/4}{1/16} + 0\log_2\frac{1/4}{0}\\
& + \frac{1}{32}\log_2\frac{1/4}{1/32} + \frac{1}{32}\log_2\frac{1/4}{1/32} + \frac{1}{16}\log_2\frac{1/4}{1/16} + 0\log_2\frac{1/4}{0}\\
& + \frac{1}{32}\log_2\frac{1/4}{1/32} + \frac{1}{32}\log_2\frac{1/4}{1/32} + \frac{1}{16}\log_2\frac{1/4}{1/16} + 0\log_2\frac{1/4}{0}\\
= &\ \frac{11}{8}\ \text{bits}
\end{align}</math>
(7)
Here we use the convention that <math>0\log_2\frac{1/4}{0} = 0</math>, since zero-probability outcomes contribute nothing to the entropy.
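The sum in Equation (7) can likewise be verified term by term. The sketch below is illustrative, assumes NumPy, and uses the same row/column layout as the table above; joint probabilities equal to zero are skipped, which implements the 0·log convention mentioned above.

```python
import numpy as np

# Same joint table: rows are Y, columns are X.
P_XY = np.array([
    [1/8,  1/16, 1/32, 1/32],
    [1/16, 1/8,  1/32, 1/32],
    [1/16, 1/16, 1/16, 1/16],
    [1/4,  0,    0,    0   ],
])
P_Y = P_XY.sum(axis=1)  # row marginals, all 1/4

# H(X|Y) = sum over x, y of p(x, y) * log2( p(y) / p(x, y) )
H_X_given_Y = sum(
    p_xy * np.log2(P_Y[i] / p_xy)
    for i, row in enumerate(P_XY)
    for p_xy in row
    if p_xy > 0
)
print(H_X_given_Y)  # 1.375 bits = 11/8, matching Eq. (7)
```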
Example 2: A Noiseless Binary Channel
Example 3: A Noisy Channel with Non-Overlapping Outputs
Example 4: The Binary Symmetric Channel (BSC)
Sources
- Yao Xie's slides on Entropy and Mutual Information