Let's look at a few applications of the concept of information and entropy.
  
== Surprise! The Unexpected Observation ==
Information can be thought of as the amount of ''surprise'' at seeing an event: the less probable an outcome, the more surprising it is. Observing an outcome with probability <math>p</math> thus conveys <math>\log_2\left(\tfrac{1}{p}\right)</math> bits of information. Consider the following events:
  
{| class="wikitable"
|-
! Event
! Probability
! Information (Surprise)
|-
| Someone tells you <math>1=1</math>.
| 1
| <math>\log_2\left(1\right)=0\,\mathrm{bits}</math>
|-
| You got the wrong answer on a 4-choice multiple-choice question.
| <math>\frac{3}{4}</math>
| <math>\log_2\left(\frac{4}{3}\right)=0.415\,\mathrm{bits}</math>
|-
| You got the correct answer on a True or False question.
| <math>\frac{1}{2}</math>
| <math>\log_2\left(2\right)=1\,\mathrm{bit}</math>
|}
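As a quick check on the table, here is a minimal Python sketch that evaluates the surprise <math>\log_2\left(\tfrac{1}{p}\right)</math> of each event (the <code>surprise</code> helper is just an illustrative name, not part of the module):

<syntaxhighlight lang="python">
import math

def surprise(p: float) -> float:
    """Information, in bits, gained from observing an outcome of probability p."""
    return math.log2(1.0 / p)

# The three events from the table above.
events = {
    "Someone tells you 1 = 1": 1.0,
    "Wrong answer on a 4-choice question": 3.0 / 4.0,
    "Correct answer on a True/False question": 1.0 / 2.0,
}

for name, p in events.items():
    print(f"{name}: {surprise(p):.3f} bits")
</syntaxhighlight>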
 
  
== Student Grading ==
How much information can we get from a single grade? Note that the average information is maximized when all the grades are equally probable.
* For Pass/Fail grades, the possible outcomes are <math>\{\mathrm{P}, \mathrm{F}\}</math> with probabilities <math>\{\tfrac{1}{2}, \tfrac{1}{2}\}</math>. Thus,

{{NumBlk|::|<math>H\left(P\right)=\sum_{i=1}^n p_i\cdot \log_2\left(\frac{1}{p_i}\right) = \frac{1}{2}\cdot \log_2\left(2\right) + \frac{1}{2}\cdot \log_2\left(2\right) = 1\,\mathrm{bit}</math>|{{EquationRef|1}}}}

* For grades = <math>\{1.00, 2.00, 3.00, 4.00, 5.00\}</math> with probabilities <math>\{\tfrac{1}{5}, \tfrac{1}{5}, \tfrac{1}{5}, \tfrac{1}{5}, \tfrac{1}{5}\}</math>, we get:

{{NumBlk|::|<math>H\left(P\right)=\sum_{i=1}^n p_i\cdot \log_2\left(\frac{1}{p_i}\right) = 5\cdot \frac{1}{5}\cdot \log_2\left(5\right) = 2.32\,\mathrm{bits}</math>|{{EquationRef|2}}}}

* For grades = <math>\{1.00, 1.50, 2.00, 2.50, 3.00, 4.00, 5.00\}</math> with probabilities <math>\{\tfrac{1}{7}, \tfrac{1}{7}, \tfrac{1}{7}, \tfrac{1}{7}, \tfrac{1}{7}, \tfrac{1}{7}, \tfrac{1}{7}\}</math>, we have:

{{NumBlk|::|<math>H\left(P\right)=\sum_{i=1}^n p_i\cdot \log_2\left(\frac{1}{p_i}\right) = 7\cdot \frac{1}{7}\cdot \log_2\left(7\right) = 2.81\,\mathrm{bits}</math>|{{EquationRef|3}}}}

* If we have all the possible grades <math>\{1.00, 1.25, 1.50, 1.75, 2.00, 2.25, 2.50, 2.75, 3.00, 4.00, 5.00, \mathrm{INC}, \mathrm{DRP}, \mathrm{LOA}\}</math>, each with probability <math>\tfrac{1}{14}</math>, we have:

{{NumBlk|::|<math>H\left(P\right)=\sum_{i=1}^n p_i\cdot \log_2\left(\frac{1}{p_i}\right) = 14\cdot \frac{1}{14}\cdot \log_2\left(14\right) = 3.81\,\mathrm{bits}</math>|{{EquationRef|4}}}}
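Equations (1) through (4) all have the same uniform-distribution form, so they are easy to verify numerically. A minimal Python sketch (the <code>entropy</code> helper is an illustrative name, not from the module):

<syntaxhighlight lang="python">
import math

def entropy(probs: list[float]) -> float:
    """Average information H(P) = sum of p * log2(1/p), in bits."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Uniform grade distributions matching equations (1) through (4).
for n in (2, 5, 7, 14):
    uniform = [1.0 / n] * n
    print(f"{n:>2} equally likely grades: H = {entropy(uniform):.2f} bits")
</syntaxhighlight>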
 

== Activity: Who is Claude Shannon? ==
* '''Instructions:''' In this activity, you are tasked to
** Read two articles on Claude Shannon.
** Write a short (1-page) report.
* Should you have any questions, clarifications, or issues, please contact your instructor as soon as possible.

== Articles ==

* John Horgan, ''Claude Shannon: Tinkerer, Prankster, and Father of Information Theory'', IEEE Spectrum, 2016 ([https://spectrum.ieee.org/tech-history/cyberspace/claude-shannon-tinkerer-prankster-and-father-of-information-theory link])
* Graham P. Collins, ''Claude E. Shannon: Founder of Information Theory'', Scientific American, October 14, 2002 ([https://www.scientificamerican.com/article/claude-e-shannon-founder/ link])

== Report Guide ==

# Based on the papers you have read (not limited to the two articles above) and any other resources available to you, write a short (1-page) report on one possible application of information theory in the field of computer engineering that '''you''' are interested in.
# Answer yet another ball and urn problem. Given an urn with 4 red balls, 3 green balls, and 2 yellow balls:
#* How much information (in bits) do you get when you choose (i) a red ball, (ii) a green ball, or (iii) a yellow ball?
#* What is the entropy (or average information, in bits) when you choose a ball from the urn? A sketch of the general computation follows this list.
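The urn probabilities are not uniform, so each color carries a different amount of surprise. As a sketch of the general recipe (deliberately using a biased coin rather than the urn, so the exercise stays yours to solve), one possible Python outline:

<syntaxhighlight lang="python">
import math

def information(p: float) -> float:
    """Surprise, in bits, of one outcome with probability p."""
    return math.log2(1.0 / p)

def entropy(probs: list[float]) -> float:
    """Average information of the whole distribution, in bits."""
    return sum(p * information(p) for p in probs if p > 0)

# Worked stand-in: a coin that lands heads 3/4 of the time.
# Swap in the urn's own probabilities to answer the questions above.
coin = {"heads": 3 / 4, "tails": 1 / 4}
for outcome, p in coin.items():
    print(f"{outcome}: {information(p):.3f} bits")
print(f"H = {entropy(list(coin.values())):.3f} bits")
</syntaxhighlight>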

=== Submission ===

Submit your report via email before proceeding to Module 2.