Difference between revisions of "161-A2.1"

From Microlab Classes

Revision as of 22:27, 17 September 2020

  • Activity: Source Coding
  • Instructions: In this activity, you are tasked to
    • Walk through the examples.
    • Write a short program to compress and decompress a redundant file.
  • Should you have any questions, clarifications, or issues, please contact your instructor as soon as possible.

Uniquely Decodable Codes

Let us try to encode a source with just four symbols in its alphabet, i.e. <math>S=\left\{a_1, a_2, a_3, a_4\right\}</math>, with probability distribution <math>P=\left\{0.5, 0.25, 0.125, 0.125\right\}</math>. We can calculate the entropy of this source as:

<math>H\left(S\right) = -\sum_{i=1}^{4} p_i\cdot\log_2 p_i = -\left(0.5\log_2 0.5 + 0.25\log_2 0.25 + 0.125\log_2 0.125 + 0.125\log_2 0.125\right) = 1.75\,\mathrm{bits}</math>

(1)
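The entropy in Eq. 1 is straightforward to verify numerically; the following sketch computes it for the probability distribution above:

```python
import math

# Source probabilities from the example: P = {0.5, 0.25, 0.125, 0.125}
probs = [0.5, 0.25, 0.125, 0.125]

# Entropy H(S) = -sum(p * log2(p)), in bits per symbol
entropy = -sum(p * math.log2(p) for p in probs)
print(entropy)  # 1.75
```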

Let us look at a few bit sequence assignments and see if they are uniquely decodable or not.

{| class="wikitable"
! Symbol
! Probability
! Code 1
! Code 2
! Code 3
! Code 4
|-
| <math>a_1</math>
| 0.5
| 0
| 0
| 0
| 0
|-
| <math>a_2</math>
| 0.25
| 0
| 1
| 10
| 01
|-
| <math>a_3</math>
| 0.125
| 1
| 00
| 110
| 011
|-
| <math>a_4</math>
| 0.125
| 10
| 11
| 111
| 0111
|-
| colspan="2"|Average Code Length:
| 1.125
| 1.25
| 1.75
| 1.875
|}
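The average code lengths in the last row can be reproduced directly from the table. A minimal sketch (codes keyed by their column number, symbols in table order):

```python
# Symbol probabilities and the four candidate codes from the table above
probs = [0.5, 0.25, 0.125, 0.125]
codes = {
    1: ["0", "0", "1", "10"],
    2: ["0", "1", "00", "11"],
    3: ["0", "10", "110", "111"],
    4: ["0", "01", "011", "0111"],
}

# Average (expected) code length: sum of p_i * l_i over all symbols
avg_len = {name: sum(p * len(w) for p, w in zip(probs, words))
           for name, words in codes.items()}
print(avg_len)  # {1: 1.125, 2: 1.25, 3: 1.75, 4: 1.875}
```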

Recall that Shannon's Noiseless Coding Theorem states that <math>\bar{L}\geq H\left(S\right)</math> for any uniquely decodable code. Thus, codes 1 and 2 are not uniquely decodable since their average code lengths are less than <math>H\left(S\right)</math>.

For code 1, two symbols share the same codeword (<math>a_1</math> and <math>a_2</math> are both encoded as 0), so the code is not distinct, and therefore not uniquely decodable. We can check code 2 by taking a sequence of symbols encoded using this code, e.g. the bit sequence <math>00</math>. This message can be decoded in two different ways: either as <math>a_1 a_1</math>, or as <math>a_3</math>.
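The ambiguity of code 2 can also be checked mechanically by enumerating every way a bit string splits into codewords. A sketch, using a hypothetical helper `parses` (not part of the activity):

```python
# Code 2 from the table: a1 -> 0, a2 -> 1, a3 -> 00, a4 -> 11
code2 = {"a1": "0", "a2": "1", "a3": "00", "a4": "11"}

def parses(bits, code):
    """Return every way `bits` can be split into codewords of `code`."""
    if not bits:
        return [[]]
    results = []
    for sym, word in code.items():
        if bits.startswith(word):
            for rest in parses(bits[len(word):], code):
                results.append([sym] + rest)
    return results

# The bit string "00" has two valid decodings, so code 2 is ambiguous
print(parses("00", code2))  # [['a1', 'a1'], ['a3']]
```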

Code 3 has one of the most desirable properties of a code: its average length equals the entropy of the source, and according to Shannon, no uniquely decodable code can have a shorter average length. We can also see that for a sequence of symbols encoded using code 3, there is only one way to decode the message; thus it is uniquely decodable.
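Because no codeword of code 3 is a prefix of another, a decoder can emit each symbol the instant its last bit arrives. A minimal sketch of this instantaneous decoding (the function name is illustrative):

```python
# Code 3 from the table, keyed by codeword: no word is a prefix of another
code3 = {"0": "a1", "10": "a2", "110": "a3", "111": "a4"}

def decode_prefix(bits, table):
    out, cur = [], ""
    for b in bits:
        cur += b
        if cur in table:          # codeword complete -> emit immediately
            out.append(table[cur])
            cur = ""
    return out

# "0101100111" splits uniquely as 0 | 10 | 110 | 0 | 111
print(decode_prefix("0101100111", code3))  # ['a1', 'a2', 'a3', 'a1', 'a4']
```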

Code 4 is also uniquely decodable; however, it is not instantaneous, since the decoder needs to see the start of the next symbol to determine the end of the current symbol. Also note that the average length of code 4 is longer than <math>H\left(S\right)</math>.
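In code 4 every codeword begins with a 0, so a codeword only ends when the next 0 arrives (or the stream ends). The one-symbol lookahead is visible in this sketch, written just for this code:

```python
# Code 4 from the table: every codeword starts with '0'
code4 = {"0": "a1", "01": "a2", "011": "a3", "0111": "a4"}

def decode_code4(bits):
    out, cur = [], ""
    for b in bits:
        if b == "0" and cur:       # a new codeword starts: only NOW can we
            out.append(code4[cur]) # emit the previous symbol (not instantaneous)
            cur = ""
        cur += b
    if cur:
        out.append(code4[cur])     # flush the final codeword at end of stream
    return out

# "001101" splits uniquely as 0 | 011 | 01
print(decode_code4("001101"))  # ['a1', 'a3', 'a2']
```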

We can also calculate <math>K=\sum_{i=1}^n \frac{1}{r^{l_i}}</math> for the codes above, where <math>r</math> is the size of the code alphabet (here <math>r=2</math>) and <math>l_i</math> is the length of the <math>i</math>-th codeword. Tabulating the values, we get:

{| class="wikitable"
! Code
! <math>K</math>
|-
| 1
| 1.75
|-
| 2
| 1.5
|-
| 3
| 1.0
|-
| 4
| 0.9375
|}
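The tabulated values of <math>K</math> follow directly from the codeword lengths. A short sketch, with <math>r=2</math> for these binary codes:

```python
# The four codes from the earlier table, as lists of codewords
codes = {1: ["0", "0", "1", "10"],
         2: ["0", "1", "00", "11"],
         3: ["0", "10", "110", "111"],
         4: ["0", "01", "011", "0111"]}

# Kraft sum K = sum over codewords of r^(-l_i), with r = 2 for binary
kraft = {name: sum(2.0 ** -len(w) for w in words)
         for name, words in codes.items()}
print(kraft)  # {1: 1.75, 2: 1.5, 3: 1.0, 4: 0.9375}
```

Note that <math>K\leq 1</math> is a necessary condition for unique decodability (the Kraft–McMillan inequality), and codes 1 and 2, with <math>K>1</math>, fail it.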