
* '''Activity:''' Mutual Information and Channel Capacity
* '''Instructions:''' In this activity, you are tasked to:
** Walk through the examples.
** Calculate the channel capacity of different channel models.
* Should you have any questions, clarifications, or issues, please contact your instructor as soon as possible.

== Example 1: Mutual Information ==

Given the following probabilities:

{| class="wikitable" style="text-align: center;"
|+ <math>X</math>: Blood Type, <math>Y</math>: Chance for Skin Cancer
! !! A !! B !! AB !! O
|-
! Very Low
| 1/8 || 1/16 || 1/32 || 1/32
|-
! Low
| 1/16 || 1/8 || 1/32 || 1/32
|-
! Medium
| 1/16 || 1/16 || 1/16 || 1/16
|-
! High
| 1/4 || 0 || 0 || 0
|}

To get the entropies of <math>X</math> and <math>Y</math>, we need to calculate the marginal probabilities:

{{NumBlk|::|<math> P\left(x_i\right)=\sum_{j=1}^4 P\left(x_i, y_j\right)\quad\Rightarrow\quad P\left(X\right)=\left\{\frac{1}{2},\frac{1}{4},\frac{1}{8},\frac{1}{8}\right\}</math>|{{EquationRef|1}}}}

{{NumBlk|::|<math> P\left(y_j\right)=\sum_{i=1}^4 P\left(x_i, y_j\right)\quad\Rightarrow\quad P\left(Y\right)=\left\{\frac{1}{4},\frac{1}{4},\frac{1}{4},\frac{1}{4}\right\}</math>|{{EquationRef|2}}}}
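
These marginals are easy to check numerically. The short Python sketch below is an illustration, not part of the original activity; the list <code>joint</code> and the names <code>P_X</code> and <code>P_Y</code> are ours:

<syntaxhighlight lang="python">
# Joint probability table P(X, Y): rows correspond to Y
# (Very Low, Low, Medium, High) and columns to X (A, B, AB, O).
joint = [
    [1/8,  1/16, 1/32, 1/32],  # Very Low
    [1/16, 1/8,  1/32, 1/32],  # Low
    [1/16, 1/16, 1/16, 1/16],  # Medium
    [1/4,  0,    0,    0],     # High
]

P_X = [sum(col) for col in zip(*joint)]  # column sums -> [0.5, 0.25, 0.125, 0.125]
P_Y = [sum(row) for row in joint]        # row sums    -> [0.25, 0.25, 0.25, 0.25]
print(P_X, P_Y)
</syntaxhighlight>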

And since:

{{NumBlk|::|<math> H\left(X\right)=\sum_{i=1}^n P\left(x_i\right)\cdot\log_2\left(\frac{1}{P\left(x_i\right)}\right)</math>|{{EquationRef|3}}}}

We get:

{{NumBlk|::|<math> H\left(X\right)=\frac{1}{2}\log_2\left(2\right)+\frac{1}{4}\log_2\left(4\right)+\frac{1}{8}\log_2\left(8\right)+\frac{1}{8}\log_2\left(8\right)=\frac{7}{4}\,\mathrm{bits}=1.75\,\mathrm{bits}</math>|{{EquationRef|4}}}}

{{NumBlk|::|<math> H\left(Y\right)=4\cdot\frac{1}{4}\log_2\left(4\right)=2\,\mathrm{bits}</math>|{{EquationRef|5}}}}
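
As a quick check of (4) and (5), here is a small Python sketch (again an illustration, with the helper name <code>entropy_bits</code> ours) that evaluates the definition in (3) on the marginals above:

<syntaxhighlight lang="python">
from math import log2

def entropy_bits(p):
    """Entropy in bits: sum of P * log2(1/P) over nonzero probabilities."""
    return sum(pi * log2(1 / pi) for pi in p if pi > 0)

print(entropy_bits([1/2, 1/4, 1/8, 1/8]))  # H(X) = 1.75 bits, matching (4)
print(entropy_bits([1/4, 1/4, 1/4, 1/4]))  # H(Y) = 2.0 bits, matching (5)
</syntaxhighlight>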

Calculating the conditional entropies using:

{{NumBlk|::|<math> H\left(X\mid Y\right)=\sum_{i=1}^4 \sum_{j=1}^4 P\left(x_i, y_j\right)\cdot\log_2\left(\frac{P\left(y_j\right)}{P\left(x_i, y_j\right)}\right)=\frac{11}{8}\,\mathrm{bits}=1.375\,\mathrm{bits}</math>|{{EquationRef|6}}}}

{{NumBlk|::|<math> H\left(Y\mid X\right)=\sum_{i=1}^4 \sum_{j=1}^4 P\left(x_i, y_j\right)\cdot\log_2\left(\frac{P\left(x_i\right)}{P\left(x_i, y_j\right)}\right)=\frac{13}{8}\,\mathrm{bits}=1.625\,\mathrm{bits}</math>|{{EquationRef|7}}}}

Note that <math>H\left(X\mid Y\right)\ne H\left(Y\mid X\right)</math>.
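
To check (6) and (7), and to recover the mutual information that gives this example its name, here is a similar sketch under the same assumed table layout as above:

<syntaxhighlight lang="python">
from math import log2

joint = [
    [1/8,  1/16, 1/32, 1/32],
    [1/16, 1/8,  1/32, 1/32],
    [1/16, 1/16, 1/16, 1/16],
    [1/4,  0,    0,    0],
]
P_X = [sum(col) for col in zip(*joint)]  # P(x_i)
P_Y = [sum(row) for row in joint]        # P(y_j)

# Equations (6) and (7); terms with P(x_i, y_j) = 0 contribute nothing.
H_X_given_Y = sum(joint[j][i] * log2(P_Y[j] / joint[j][i])
                  for j in range(4) for i in range(4) if joint[j][i] > 0)
H_Y_given_X = sum(joint[j][i] * log2(P_X[i] / joint[j][i])
                  for j in range(4) for i in range(4) if joint[j][i] > 0)

print(H_X_given_Y, H_Y_given_X)  # 1.375 bits and 1.625 bits

# Mutual information: I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)
print(1.75 - H_X_given_Y, 2.0 - H_Y_given_X)  # 0.375 bits either way
</syntaxhighlight>

Since every probability in the table is a power of two, the floating-point results here are exact.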

== Example 2: A Noiseless Binary Channel ==

== Example 3: A Noisy Channel with Non-Overlapping Outputs ==

== Example 4: The Binary Symmetric Channel (BSC) ==

== Sources ==

* Yao Xie's slides on Entropy and Mutual Information