161-A3.1

* Activity: Mutual Information and Channel Capacity
* Instructions: In this activity, you are tasked to:
** Walk through the examples.
** Calculate the channel capacity of different channel models.
* Should you have any questions, clarifications, or issues, please contact your instructor as soon as possible.

== Example 1: Mutual Information ==

Given the following joint probabilities:

<math>X</math>: Blood Type, <math>Y</math>: Chance for Skin Cancer

{| class="wikitable"
! <math>P\left(x_i, y_j\right)</math> !! A !! B !! AB !! O
|-
! Very Low
| 1/8 || 1/16 || 1/32 || 1/32
|-
! Low
| 1/16 || 1/8 || 1/32 || 1/32
|-
! Medium
| 1/16 || 1/16 || 1/16 || 1/16
|-
! High
| 1/4 || 0 || 0 || 0
|}

To get the entropies of <math>X</math> and <math>Y</math>, we need to calculate the marginal probabilities:

{{NumBlk|::|<math>P\left(X=A\right)=\frac{1}{2},\quad P\left(X=B\right)=\frac{1}{4},\quad P\left(X=AB\right)=\frac{1}{8},\quad P\left(X=O\right)=\frac{1}{8}</math>|{{EquationRef|1}}}}

{{NumBlk|::|<math>P\left(Y=\text{Very Low}\right)=\frac{1}{4},\quad P\left(Y=\text{Low}\right)=\frac{1}{4},\quad P\left(Y=\text{Medium}\right)=\frac{1}{4},\quad P\left(Y=\text{High}\right)=\frac{1}{4}</math>|{{EquationRef|2}}}}
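These marginals are just the column and row sums of the joint probability table. As a quick numerical check, here is a minimal Python sketch (our own verification, not part of the original activity; the array layout and variable names are one possible choice):

<syntaxhighlight lang="python">
import numpy as np

# Joint probability table P(x_i, y_j): rows j are Y (chance for skin cancer,
# Very Low..High), columns i are X (blood type, A/B/AB/O).
P = np.array([
    [1/8,  1/16, 1/32, 1/32],   # Very Low
    [1/16, 1/8,  1/32, 1/32],   # Low
    [1/16, 1/16, 1/16, 1/16],   # Medium
    [1/4,  0,    0,    0   ],   # High
])

Px = P.sum(axis=0)   # marginal of X: [1/2, 1/4, 1/8, 1/8], Equation 1
Py = P.sum(axis=1)   # marginal of Y: [1/4, 1/4, 1/4, 1/4], Equation 2
print(Px, Py)
</syntaxhighlight>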

And since:

{{NumBlk|::|<math>H\left(X\right)=-\sum_{i=1}^n P\left(x_i\right)\log_2 P\left(x_i\right) = \sum_{i=1}^n P\left(x_i\right)\log_2\left(\frac{1}{P\left(x_i\right)}\right)</math>|{{EquationRef|3}}}}

We get:

{{NumBlk|::|<math>H\left(X\right) = \frac{1}{2}\log_2 2 + \frac{1}{4}\log_2 4 +\frac{1}{8}\log_2 8 +\frac{1}{8}\log_2 8 = \frac{7}{4}\,\mathrm{bits}=1.75\,\mathrm{bits}</math>|{{EquationRef|4}}}}

{{NumBlk|::|<math>H\left(Y\right) = \frac{1}{4}\log_2 4 + \frac{1}{4}\log_2 4 +\frac{1}{4}\log_2 4 +\frac{1}{4}\log_2 4 = 2\,\mathrm{bits}</math>|{{EquationRef|5}}}}
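As a sketch of the same arithmetic, Equation 3 applied to the marginals reproduces these values (again our own check; the `entropy` helper is hypothetical, not something defined in the activity):

<syntaxhighlight lang="python">
import numpy as np

# Marginals from Equations 1 and 2.
Px = np.array([1/2, 1/4, 1/8, 1/8])
Py = np.array([1/4, 1/4, 1/4, 1/4])

def entropy(p):
    """Shannon entropy in bits (Equation 3), with 0*log(0) taken as 0."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

print(entropy(Px))   # 1.75 bits, matching Equation 4
print(entropy(Py))   # 2.0  bits, matching Equation 5
</syntaxhighlight>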

Calculating the conditional entropies using:

{{NumBlk|::|<math>H\left(X\mid Y\right)=\sum_{j=1}^4 P\left(y_j\right)\cdot H\left(X\mid Y=y_j\right)=\sum_{i=1}^4 \sum_{j=1}^4 P\left(x_i, y_j\right)\cdot\log_2\left(\frac{P\left(y_j\right)}{P\left(x_i, y_j\right)}\right)=\frac{11}{8}\,\mathrm{bits}=1.375\,\mathrm{bits}</math>|{{EquationRef|6}}}}

{{NumBlk|::|<math>H\left(Y\mid X\right)=\sum_{i=1}^4 P\left(x_i\right)\cdot H\left(Y\mid X=x_i\right)=\sum_{i=1}^4 \sum_{j=1}^4 P\left(x_i, y_j\right)\cdot\log_2\left(\frac{P\left(x_i\right)}{P\left(x_i, y_j\right)}\right)=\frac{13}{8}\,\mathrm{bits}=1.625\,\mathrm{bits}</math>|{{EquationRef|7}}}}
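The double sums in Equations 6 and 7 can likewise be checked numerically. A minimal sketch, assuming the same joint-table layout as before and skipping the zero-probability cells (since <math>0\log 0=0</math>):

<syntaxhighlight lang="python">
import numpy as np

# Joint table again: rows j index Y, columns i index X.
P = np.array([
    [1/8,  1/16, 1/32, 1/32],
    [1/16, 1/8,  1/32, 1/32],
    [1/16, 1/16, 1/16, 1/16],
    [1/4,  0,    0,    0   ],
])
Px, Py = P.sum(axis=0), P.sum(axis=1)

# Double sums of Equations 6 and 7; zero cells contribute nothing.
H_X_given_Y = sum(P[j, i] * np.log2(Py[j] / P[j, i])
                  for j in range(4) for i in range(4) if P[j, i] > 0)
H_Y_given_X = sum(P[j, i] * np.log2(Px[i] / P[j, i])
                  for j in range(4) for i in range(4) if P[j, i] > 0)

print(H_X_given_Y)   # 1.375 bits = 11/8, Equation 6
print(H_Y_given_X)   # 1.625 bits = 13/8, Equation 7
</syntaxhighlight>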

Note that <math>H\left(X\mid Y\right)\ne H\left(Y\mid X\right)</math>. Calculating the mutual information, we get:

{{NumBlk|::|<math>I\left(X;Y\right)=H\left(X\right)-H\left(X\mid Y\right)=\frac{7}{4}-\frac{11}{8}=0.375\,\mathrm{bits}</math>|{{EquationRef|8}}}}

Or equivalently:

{{NumBlk|::|<math>I\left(X;Y\right)=H\left(Y\right)-H\left(Y\mid X\right)=2-\frac{13}{8}=0.375\,\mathrm{bits}</math>|{{EquationRef|9}}}}
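Both decompositions agree because each equals the symmetric definition <math>I\left(X;Y\right)=\sum_{i,j} P\left(x_i,y_j\right)\log_2\frac{P\left(x_i,y_j\right)}{P\left(x_i\right)P\left(y_j\right)}</math>. A final sketch checking this directly from the joint table (again our own verification, not part of the original page):

<syntaxhighlight lang="python">
import numpy as np

# Joint table: rows j index Y, columns i index X.
P = np.array([
    [1/8,  1/16, 1/32, 1/32],
    [1/16, 1/8,  1/32, 1/32],
    [1/16, 1/16, 1/16, 1/16],
    [1/4,  0,    0,    0   ],
])
Px, Py = P.sum(axis=0), P.sum(axis=1)

# I(X;Y) from its symmetric definition, with 0*log(0) taken as 0.
I = sum(P[j, i] * np.log2(P[j, i] / (Px[i] * Py[j]))
        for j in range(4) for i in range(4) if P[j, i] > 0)
print(I)   # 0.375 bits, matching Equations 8 and 9
</syntaxhighlight>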

== Example 2: A Noiseless Binary Channel ==

== Example 3: A Noisy Channel with Non-Overlapping Outputs ==

== Example 4: The Binary Symmetric Channel (BSC) ==

== Sources ==

* Yao Xie's slides on Entropy and Mutual Information