161-A3.1
Revision as of 10:36, 29 September 2020

  • Activity: Mutual Information and Channel Capacity
  • Instructions: In this activity, you are tasked to
    • Walk through the examples.
    • Calculate the channel capacity of different channel models.
  • Should you have any questions or issues, or need any clarifications, please contact your instructor as soon as possible.

Example 1: Mutual Information

Given the following joint probabilities:

X: Blood Type, Y: Chance for Skin Cancer

 Y \ X        A      B      AB     O
 Very Low    1/8    1/16   1/32   1/32
 Low         1/16   1/8    1/32   1/32
 Medium      1/16   1/16   1/16   1/16
 High        1/4     0      0      0

To get the entropies of X and Y, we need to calculate the marginal probabilities:

P(X = A) = 1/8 + 1/16 + 1/16 + 1/4 = 1/2
P(X = B) = 1/16 + 1/8 + 1/16 + 0 = 1/4
P(X = AB) = 1/32 + 1/32 + 1/16 + 0 = 1/8
P(X = O) = 1/32 + 1/32 + 1/16 + 0 = 1/8

(1)

P(Y = Very Low) = 1/8 + 1/16 + 1/32 + 1/32 = 1/4
P(Y = Low) = 1/16 + 1/8 + 1/32 + 1/32 = 1/4
P(Y = Medium) = 1/16 + 1/16 + 1/16 + 1/16 = 1/4
P(Y = High) = 1/4 + 0 + 0 + 0 = 1/4

(2)
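
As a quick sanity check on these marginals, here is a minimal Python sketch using exact fractions; the table encoding and the names joint, p_x, and p_y are our own, not part of the activity:

from fractions import Fraction as F

# Joint probability table P(X, Y): rows are Y (Very Low, Low, Medium, High),
# columns are X (A, B, AB, O), copied from the table above.
joint = [
    [F(1, 8),  F(1, 16), F(1, 32), F(1, 32)],  # Very Low
    [F(1, 16), F(1, 8),  F(1, 32), F(1, 32)],  # Low
    [F(1, 16), F(1, 16), F(1, 16), F(1, 16)],  # Medium
    [F(1, 4),  F(0),     F(0),     F(0)],      # High
]

p_x = [sum(col) for col in zip(*joint)]  # marginal P(X): sum down each column
p_y = [sum(row) for row in joint]        # marginal P(Y): sum across each row

print([str(p) for p in p_x])  # ['1/2', '1/4', '1/8', '1/8']
print([str(p) for p in p_y])  # ['1/4', '1/4', '1/4', '1/4']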

And since:

H(X) = -\sum_{i} P(x_i)\log_2 P(x_i)

(3)

We get:

H(X) = \frac{1}{2}\log_2 2 + \frac{1}{4}\log_2 4 + \frac{1}{8}\log_2 8 + \frac{1}{8}\log_2 8 = \frac{7}{4} \text{ bits}

(4)

H(Y) = 4 \cdot \frac{1}{4}\log_2 4 = 2 \text{ bits}

(5)
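
Equations (4) and (5) can be verified numerically. The helper below is a straightforward implementation of Equation (3), with the marginals taken from Equations (1) and (2); the function and variable names are our own:

import math
from fractions import Fraction as F

def entropy(probs):
    # H = -sum p*log2(p) in bits, skipping zero-probability terms
    return sum(-p * math.log2(p) for p in probs if p > 0)

p_x = [F(1, 2), F(1, 4), F(1, 8), F(1, 8)]  # P(X) from Equation (1)
p_y = [F(1, 4)] * 4                         # P(Y) from Equation (2)

print(entropy(p_x))  # 1.75 bits, matching Equation (4)
print(entropy(p_y))  # 2.0 bits, matching Equation (5)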

Calculating the conditional entropies using:

H(X \mid Y) = -\sum_{x,y} P(x, y)\log_2 P(x \mid y) = \sum_{x,y} P(x, y)\log_2\frac{P(y)}{P(x, y)}

(6)

We get:

H(X \mid Y) = 2 \cdot \frac{1}{8}\log_2\frac{1/4}{1/8} + 6 \cdot \frac{1}{16}\log_2\frac{1/4}{1/16} + 4 \cdot \frac{1}{32}\log_2\frac{1/4}{1/32} + \frac{1}{4}\log_2\frac{1/4}{1/4} = \frac{1}{4} + \frac{3}{4} + \frac{3}{8} + 0 = \frac{11}{8} \text{ bits}

(7)

Since H(Y \mid X) = H(X \mid Y) + H(Y) - H(X), we also get H(Y \mid X) = 11/8 + 2 - 7/4 = 13/8 bits, and the mutual information is I(X;Y) = H(X) - H(X \mid Y) = 7/4 - 11/8 = 3/8 bits.
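
The same style of check works for Equations (6) and (7) and for the mutual information; as before, the encoding of the joint table and the variable names are our own:

import math
from fractions import Fraction as F

joint = [
    [F(1, 8),  F(1, 16), F(1, 32), F(1, 32)],  # Very Low
    [F(1, 16), F(1, 8),  F(1, 32), F(1, 32)],  # Low
    [F(1, 16), F(1, 16), F(1, 16), F(1, 16)],  # Medium
    [F(1, 4),  F(0),     F(0),     F(0)],      # High
]
p_y = [sum(row) for row in joint]

# H(X|Y) = sum over x, y of P(x, y) * log2(P(y) / P(x, y)), as in Equation (6)
h_x_given_y = sum(p_xy * math.log2(py / p_xy)
                  for row, py in zip(joint, p_y)
                  for p_xy in row if p_xy > 0)
print(h_x_given_y)        # 1.375 bits = 11/8, matching Equation (7)

h_x = 7 / 4               # H(X) from Equation (4)
print(h_x - h_x_given_y)  # I(X;Y) = 0.375 bits = 3/8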


Example 2: A Noiseless Binary Channel

Example 3: A Noisy Channel with Non-Overlapping Outputs

Example 4: The Binary Symmetric Channel (BSC)
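
The worked solution for this example is not shown in this revision. As a starting point, the sketch below evaluates the textbook closed-form capacity of a binary symmetric channel, C = 1 - H(p), where p is the crossover probability; this is the standard result, not something quoted from this page:

import math

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # Capacity of a binary symmetric channel with crossover probability p
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0 bit: no errors, behaves like the noiseless channel
print(bsc_capacity(0.11))  # ~0.5 bits
print(bsc_capacity(0.5))   # 0.0 bits: the output tells us nothing about the input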

Sources

  • Yao Xie's slides on Entropy and Mutual Information