2S2122 Activity 2.1

Instructions

  • Answer the following problems individually and truthfully.
  • Be sure to show your solutions and please box your final answers.
  • Please write your complete name, student number, and section on the upper left corner of your answer sheet. No name, student number, or section means no grade.
  • Save your answers as a PDF file with the filename format "section_lastname_firstname_studentnumber.pdf", all in lowercase. For example: "abc_wayne_bruce_201101474.pdf".
  • Submit your files in the respective submission bin in UVLE. Be sure to submit in the correct class!
  • Have fun doing these exercises :) even though they may seem boring.
  • You only have two weeks to work on the activities. We will post the hard deadline on UVLE.
  • You might want to create your own programs that calculate information and entropy (see the sketch after this list).
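In case you want a starting point for such a program, here is a minimal Python sketch (entirely optional; the function names information and entropy are just illustrative) that computes the self-information of an event and the entropy of a discrete distribution, both in bits:

```python
import math

def information(p_event):
    """Self-information (surprise) of an event with probability p_event, in bits."""
    return -math.log2(p_event)

def entropy(probs):
    """Entropy, in bits, of a discrete distribution given as an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Sanity check: a fair coin flip carries exactly 1 bit.
print(information(0.5))      # 1.0
print(entropy([0.5, 0.5]))   # 1.0
```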

Grading Rubrics

  • If you have a good solution and a correct answer, you get full points.
  • If you have a good solution but the answer is not boxed (or highlighted), you get a 5% deduction of the total points for that problem.
  • If you have a good solution but the answer is wrong, you get a 20% deduction of the total points for that problem.
  • If your solution is somewhat OK but incomplete, you only get 40% of the total points for that problem.
  • If you have a bad solution but a correct answer, that sounds suspicious. You get 0% for that problem. A bad solution may be one where:
    • You just wrote the given.
    • You just dumped equations with no explanation of where they are used.
    • You attempted to put a messy flow to distract us. That's definitely bad.
  • No attempt at all means no points at all.
  • Some solutions here may be too long to write out in full. You are allowed to write the general equation instead, but be sure to indicate what it means.
  • Make sure to use bits as the unit of information.

Problem 1 (1 pt.)

A word in a code consists of five binary digits (e.g., 01011 is one code word). Each digit is chosen independently of the others, and the probability of any particular digit being a 1 is some given value $p$. Find the information associated with the following events:

(a) At least three 1s. (0.2 pts.)

(b) At most four 1s. (0.2 pts.)

(c) Exactly two 0s. (0.2 pts.)

(d) Let $X$ be a random variable representing the sum of 1s for the code with five binary digits. Calculate $H(X)$. (0.2 pts.)

(e) Interpret your results in (d) relative to (a), (b), and (c). Philosophize about what you think of it. (0.2 pts.)
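If you wrote a program as suggested in the instructions, a sketch like the one below can be used to double-check the hand computations for this problem. The digit probability p is only a placeholder for the value given in the problem, and enumerating all 2^5 code words is just one convenient way to obtain the event probabilities:

```python
import math
from itertools import product
from collections import Counter

p = 0.5  # placeholder -- replace with the probability of a 1 given in the problem

def info_bits(prob):
    # Self-information of an event, in bits.
    return -math.log2(prob)

def word_prob(w):
    ones = sum(w)
    return p**ones * (1 - p)**(5 - ones)

# Enumerate all 2^5 five-digit binary code words.
words = list(product([0, 1], repeat=5))

p_a = sum(word_prob(w) for w in words if sum(w) >= 3)      # (a) at least three 1s
p_b = sum(word_prob(w) for w in words if sum(w) <= 4)      # (b) at most four 1s
p_c = sum(word_prob(w) for w in words if w.count(0) == 2)  # (c) exactly two 0s
print(info_bits(p_a), info_bits(p_b), info_bits(p_c))

# (d) entropy of X = number of 1s, i.e. a Binomial(5, p) random variable
dist = Counter()
for w in words:
    dist[sum(w)] += word_prob(w)
H_X = -sum(q * math.log2(q) for q in dist.values() if q > 0)
print(H_X)
```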

Problem 2 (2 pts.)

The popular tabletop game Dungeons and Dragons uses 6 types of dice. We denote each die as $d_n$, where $d_n$ is an $n$-sided die. For example, a $d_4$ is a 4-sided die. Each die is fair (i.e., none of the faces are biased). Calculate and answer the following:

(a) $H(d_4)$ (0.2 pts.)

(b) $H(d_6)$ (0.2 pts.)

(c) $H(d_8)$ (0.2 pts.)

(d) $H(d_{10})$ (0.2 pts.)

(e) $H(d_{12})$ (0.2 pts.)

(f) $H(d_{20})$ (0.2 pts.)

(g) Comment on or interpret how the information varies per die. Is it consistent with your notion of uncertainty? (0.4 pts.)

(h) A potion of greater healing heals a character with the formula $4d_4 + 4$. That means we need to roll a $d_4$ four times and add a $4$ to the sum of the rolls. How surprising is it to get a particular value of $HP$? HP stands for hit-points, or the life of a character. (Hint: Careful! 🙂) (0.4 pts.)

(i) (Bonus) Which is better: to perceive events as chance or as uncertainty? If we like your answer, you get +1 for the entire exercise.
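A sketch for numerically checking the die entropies and the healing-potion roll. The fair-die assumption comes from the problem statement; the target HP value below is a placeholder for whatever total the problem asks about, and the enumeration simply lists all 4^4 ways the four $d_4$ rolls can come up:

```python
import math
from itertools import product
from collections import Counter

# Entropy of a fair n-sided die is log2(n) bits.
for n in (4, 6, 8, 10, 12, 20):
    print(f"H(d{n}) = {math.log2(n):.4f} bits")

# Distribution of 4d4 + 4: enumerate all 4^4 outcomes of the four d4 rolls.
totals = Counter()
for rolls in product(range(1, 5), repeat=4):
    totals[sum(rolls) + 4] += 1
n_outcomes = 4**4

target_hp = 20  # placeholder -- substitute the HP value the problem asks about
p_target = totals.get(target_hp, 0) / n_outcomes
if p_target > 0:
    print(f"P(HP = {target_hp}) = {p_target:.6f}, "
          f"I = {-math.log2(p_target):.4f} bits")
else:
    print(f"HP = {target_hp} is impossible; its probability is 0.")
```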

Problem 3 (2 pts.)

$X$ and $Y$ are Bernoulli (or binary) random variables, with the distribution of $X$ having $P(X = 1) = p$ for some given $p$ (i.e., $X$ has a $p$ chance of being a 1). We are also given the conditional probabilities $P(Y = y \mid X = x)$ for every $x$ and $y$.

Calculate the following:

(a) (0.4 pts.)

(b) (0.4 pts.)

(c) (0.4 pts.)

(d) (0.4 pts.)

(e) (0.4 pts.)
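A sketch of how the chain rule and the identity $I(X;Y) = H(X) + H(Y) - H(X,Y)$ can be used to check your numbers for this problem. The values pX1 and pY1_given_X below are purely illustrative placeholders; substitute the $P(X = 1)$ and $P(Y \mid X)$ actually given:

```python
import math

def H(probs):
    # Entropy, in bits, of a discrete distribution.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative placeholders -- replace with the values given in the problem.
pX1 = 0.3                       # P(X = 1)
pY1_given_X = {0: 0.2, 1: 0.9}  # P(Y = 1 | X = x)

pX = {0: 1 - pX1, 1: pX1}
# Joint distribution p(x, y) = p(x) * p(y | x).
joint = {(x, y): pX[x] * (pY1_given_X[x] if y == 1 else 1 - pY1_given_X[x])
         for x in (0, 1) for y in (0, 1)}
pY = {y: sum(joint[(x, y)] for x in (0, 1)) for y in (0, 1)}

H_X, H_Y, H_XY = H(pX.values()), H(pY.values()), H(joint.values())
print("H(X)   =", H_X)
print("H(Y)   =", H_Y)
print("H(Y|X) =", H_XY - H_X)   # chain rule: H(X,Y) = H(X) + H(Y|X)
print("H(X|Y) =", H_XY - H_Y)
print("I(X;Y) =", H_X + H_Y - H_XY)
```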

Problem 4 (2 pts.)

We have three urns $u_1$, $u_2$, and $u_3$, which contain colored balls as follows:

  • $u_1$ has three red and five green balls.
  • $u_2$ has one red and two green balls.
  • $u_3$ has seven red and six green balls.

Let $X$ be the random variable representing the three urns $\{u_1, u_2, u_3\}$. Let $Y$ be the random variable representing the color of the ball drawn, $\{R, G\}$, where $R$ represents "a red ball was chosen" and $G$ represents "a green ball was chosen". Determine the following:

(a) (1/3 pts.)

(b) (1/3 pts.)

(c) (1/3 pts.)

(d) (1/3 pts.)

(e) (1/3 pts.)

(f) (1/3 pts.)
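A sketch for checking this problem numerically. The ball counts are taken from the list above; the assumption that each urn is equally likely to be chosen is ours (replace the prior if the problem states otherwise), and u1, u2, u3 are just the placeholder urn labels used in the restated problem:

```python
import math

def H(probs):
    # Entropy, in bits, of a discrete distribution.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Ball counts (red, green) per urn, straight from the problem.
urns = {"u1": (3, 5), "u2": (1, 2), "u3": (7, 6)}

# Assumption: each urn is chosen with equal probability 1/3.
pX = {u: 1 / 3 for u in urns}

# Joint distribution p(x, y) = p(urn) * p(color | urn).
joint = {}
for u, (r, g) in urns.items():
    total = r + g
    joint[(u, "R")] = pX[u] * r / total
    joint[(u, "G")] = pX[u] * g / total
pY = {c: sum(joint[(u, c)] for u in urns) for c in ("R", "G")}

H_X, H_Y, H_XY = H(pX.values()), H(pY.values()), H(joint.values())
print("H(X)   =", H_X)
print("H(Y)   =", H_Y)
print("H(Y|X) =", H_XY - H_X)
print("H(X|Y) =", H_XY - H_Y)
print("I(X;Y) =", H_X + H_Y - H_XY)
```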

Problem 5 (3 pts.)

Process writing is where we re-write or re-derive concepts that were taught to us. It is another form of improving our understanding of a concept: "reteaching what was taught ..." [1]. For this problem, re-derive the bounds of entropy. Suppose we are given some random variable $X$ that can take $N$ possible values; we were taught that $0 \le H(X) \le \log_2 N$. Show and explain in your own words how:

  • $H(X) \ge 0$ (0.5 pts.)
  • $H(X) \le \log_2 N$ (2.5 pts.)
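The sketch below is not a substitute for the derivation this problem asks for, but it is a quick numerical sanity check: the entropy of any randomly generated distribution over $N$ outcomes should land inside the claimed bounds $[0, \log_2 N]$:

```python
import math
import random

def entropy(probs):
    # Entropy, in bits, of a discrete distribution.
    return -sum(p * math.log2(p) for p in probs if p > 0)

random.seed(0)
N = 8  # any alphabet size works here
for _ in range(5):
    # Build a random distribution over N outcomes and check the bounds.
    weights = [random.random() for _ in range(N)]
    probs = [w / sum(weights) for w in weights]
    print(f"0 <= {entropy(probs):.4f} <= {math.log2(N):.4f}")
```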

References

  1. Sousa, David A. 2006. How the Brain Learns. Thousand Oaks, CA: Corwin Press.