161-A2.1

From Microlab Classes
* Activity: '''Source Coding'''
* '''Instructions:''' In this activity, you are tasked to:
** Walk through the examples.
** Write a short program to compress and decompress a redundant file (a rough sketch is given after this list).
* Should you have any questions, clarifications, or issues, please contact your instructor as soon as possible.
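As an illustration of the kind of program the activity asks for (one possible approach, not the required one), the sketch below uses Python's built-in <code>zlib</code> module; the file names <code>redundant.txt</code>, <code>redundant.txt.z</code>, and <code>recovered.txt</code> are placeholders.

<syntaxhighlight lang="python">
import zlib

def compress_file(src_path, dst_path):
    """Read src_path, compress its bytes with zlib (DEFLATE), and write the result to dst_path."""
    with open(src_path, "rb") as f:
        data = f.read()
    compressed = zlib.compress(data, level=9)
    with open(dst_path, "wb") as f:
        f.write(compressed)
    print(f"{len(data)} bytes -> {len(compressed)} bytes")

def decompress_file(src_path, dst_path):
    """Read the compressed bytes from src_path, decompress them, and write the original bytes to dst_path."""
    with open(src_path, "rb") as f:
        compressed = f.read()
    with open(dst_path, "wb") as f:
        f.write(zlib.decompress(compressed))

if __name__ == "__main__":
    # A redundant file (e.g. repeated text) should shrink significantly when compressed.
    compress_file("redundant.txt", "redundant.txt.z")
    decompress_file("redundant.txt.z", "recovered.txt")
</syntaxhighlight>

A redundant file compresses well precisely because its symbol statistics are far from uniform, which is what the entropy calculation in the next section quantifies.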
== Uniquely Decodable Codes ==
Let us try to encode a source with just four symbols in its alphabet, i.e. <math>A=\{a_1, a_2, a_3, a_4\}</math>, with probability distribution <math>P = \{0.5, 0.25, 0.125, 0.125\}</math>. We can calculate the entropy of this source as:
{{NumBlk|::|<math>H\left(S\right) = 0.5\log_2\left(\frac{1}{0.5}\right) + 0.25\log_2\left(\frac{1}{0.25}\right) + 0.125\log_2\left(\frac{1}{0.125}\right) + 0.125\log_2\left(\frac{1}{0.125}\right) = 1.75\,\mathrm{bits}</math>|{{EquationRef|1}}}}
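As a quick numerical check of Equation 1, the entropy of this distribution can be computed directly; the following short Python snippet is a sketch, with illustrative variable names.

<syntaxhighlight lang="python">
import math

# Probability distribution of the four-symbol source alphabet A = {a1, a2, a3, a4}
P = [0.5, 0.25, 0.125, 0.125]

# H(S) = sum over all symbols of p * log2(1/p)
H = sum(p * math.log2(1.0 / p) for p in P)

print(f"H(S) = {H} bits")  # prints H(S) = 1.75 bits
</syntaxhighlight>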
