* '''Activity:''' Mutual Information and Channel Capacity
* '''Instructions:''' In this activity, you are tasked to:
** Walk through the examples.
** Calculate the channel capacity of different channel models.
* Should you have any questions, clarifications, or issues, please contact your instructor as soon as possible.

== Example 1: Mutual Information ==

Given the following probabilities:

{| class="wikitable" style="text-align: center;"
|+ <math>X</math>: Blood Type, <math>Y</math>: Chance for Skin Cancer
|-
!
! A !! B !! AB !! O
|-
! Very Low
| <math>\tfrac{1}{8}</math> || <math>\tfrac{1}{16}</math> || <math>\tfrac{1}{32}</math> || <math>\tfrac{1}{32}</math>
|-
! Low
| <math>\tfrac{1}{16}</math> || <math>\tfrac{1}{8}</math> || <math>\tfrac{1}{32}</math> || <math>\tfrac{1}{32}</math>
|-
! Medium
| <math>\tfrac{1}{16}</math> || <math>\tfrac{1}{16}</math> || <math>\tfrac{1}{16}</math> || <math>\tfrac{1}{16}</math>
|-
! High
| <math>\tfrac{1}{4}</math> || <math>0</math> || <math>0</math> || <math>0</math>
|}

To get the entropies of <math>X</math> and <math>Y</math>, we need to calculate the marginal probabilities by summing the joint probabilities along each column and row of the table. For example, <math>P\left(X=\mathrm{A}\right)=\tfrac{1}{8}+\tfrac{1}{16}+\tfrac{1}{16}+\tfrac{1}{4}=\tfrac{1}{2}</math>. This gives us:

{{NumBlk|::|<math>P_X = \{\tfrac{1}{2}, \tfrac{1}{4}, \tfrac{1}{8}, \tfrac{1}{8}\}</math>|{{EquationRef|1}}}}

{{NumBlk|::|<math>P_Y = \{\tfrac{1}{4}, \tfrac{1}{4}, \tfrac{1}{4}, \tfrac{1}{4}\}</math>|{{EquationRef|2}}}}
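As a quick numerical check, we can reproduce these marginals by summing the joint probability table along each axis. A minimal Python sketch (the variable names here are just an illustrative choice):

<syntaxhighlight lang="python">
import numpy as np

# Joint probability table P(X, Y): rows are Y (Very Low, Low, Medium, High),
# columns are X (blood types A, B, AB, O).
P_XY = np.array([
    [1/8,  1/16, 1/32, 1/32],   # Very Low
    [1/16, 1/8,  1/32, 1/32],   # Low
    [1/16, 1/16, 1/16, 1/16],   # Medium
    [1/4,  0,    0,    0   ],   # High
])

P_X = P_XY.sum(axis=0)   # marginalize over Y -> [0.5, 0.25, 0.125, 0.125]
P_Y = P_XY.sum(axis=1)   # marginalize over X -> [0.25, 0.25, 0.25, 0.25]
</syntaxhighlight>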

And since the entropy of a random variable <math>A</math> with outcomes <math>a_i</math> is defined as:

{{NumBlk|::|<math>H\left(A\right)=\sum_{i=1}^n P\left(a_i\right)\cdot\log_2\left(\frac{1}{P\left(a_i\right)}\right)</math>|{{EquationRef|3}}}}

Thus,

{{NumBlk|::|<math>H\left(X\right) = \frac{1}{2}\log_2 2 + \frac{1}{4}\log_2 4 +\frac{1}{8}\log_2 8 +\frac{1}{8}\log_2 8 = \frac{7}{4}\,\mathrm{bits}</math>|{{EquationRef|4}}}}

{{NumBlk|::|<math>H\left(Y\right) = \frac{1}{4}\log_2 4 + \frac{1}{4}\log_2 4 +\frac{1}{4}\log_2 4 +\frac{1}{4}\log_2 4 = 2\,\mathrm{bits}</math>|{{EquationRef|5}}}}
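We can verify these numbers with a short Python sketch of the formula in (3), adopting the usual convention that <math>0\cdot\log_2\left(\tfrac{1}{0}\right)=0</math> (the <code>entropy</code> helper name is illustrative):

<syntaxhighlight lang="python">
import numpy as np

def entropy(p):
    """H(A) as in Eq. (3): the sum of P(a_i) * log2(1 / P(a_i))."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                  # convention: 0 * log2(1/0) = 0
    return float(np.sum(p * np.log2(1 / p)))

print(entropy([1/2, 1/4, 1/8, 1/8]))   # H(X) = 1.75 bits, matching Eq. (4)
print(entropy([1/4, 1/4, 1/4, 1/4]))   # H(Y) = 2.0 bits, matching Eq. (5)
</syntaxhighlight>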

We can also calculate the conditional entropies <math>H\left(X \mid Y\right)</math> and <math>H\left(Y \mid X\right)</math>, since the joint probability table gives us everything we need.
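One convenient route is the chain rule, <math>H\left(X \mid Y\right)=H\left(X,Y\right)-H\left(Y\right)</math>. The Python sketch below applies it to the joint table (again, helper and variable names are illustrative):

<syntaxhighlight lang="python">
import numpy as np

def entropy(p):
    """H as in Eq. (3), with the convention 0 * log2(1/0) = 0."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(np.sum(p * np.log2(1 / p)))

# Joint table P(X, Y): rows are Y (risk levels), columns are X (blood types).
P_XY = np.array([
    [1/8,  1/16, 1/32, 1/32],
    [1/16, 1/8,  1/32, 1/32],
    [1/16, 1/16, 1/16, 1/16],
    [1/4,  0,    0,    0   ],
])

H_XY = entropy(P_XY)              # joint entropy H(X,Y) = 3.375 bits
H_X  = entropy(P_XY.sum(axis=0))  # H(X) = 1.75 bits, as in Eq. (4)
H_Y  = entropy(P_XY.sum(axis=1))  # H(Y) = 2.0 bits, as in Eq. (5)

H_X_given_Y = H_XY - H_Y          # H(X|Y) = 11/8 bits (chain rule)
H_Y_given_X = H_XY - H_X          # H(Y|X) = 13/8 bits
</syntaxhighlight>

This also gives the mutual information <math>I\left(X;Y\right)=H\left(X\right)-H\left(X \mid Y\right)=\tfrac{7}{4}-\tfrac{11}{8}=\tfrac{3}{8}\,\mathrm{bits}</math>.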

== Example 2: A Noiseless Binary Channel ==

== Example 3: A Noisy Channel with Non-Overlapping Outputs ==

== Example 4: The Binary Symmetric Channel (BSC) ==

== Sources ==

* Yao Xie's slides on Entropy and Mutual Information