Let's look at a few applications of the concept of information and entropy.
Surprise! The Unexpected Observation
Information can be thought of as the amount of surprise at seeing an event: an event with probability <math>p</math> carries <math>\log_2\frac{1}{p} = -\log_2 p</math> bits of information. Note that a highly probable outcome is not surprising. Consider the following events:
Event | Probability | Information (Surprise)
---|---|---
Someone tells you <math>1=1</math>. | <math>1</math> | <math>\log_2\left(1\right) = 0</math> bits
You got the wrong answer on a 4-choice multiple choice question. | <math>\tfrac{3}{4}</math> | <math>\log_2\left(\tfrac{4}{3}\right) \approx 0.415</math> bits
You got the correct answer in a True or False question. | <math>\tfrac{1}{2}</math> | <math>\log_2\left(2\right) = 1</math> bit
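
As a quick numerical check, here is a minimal Python sketch of the surprise calculation above (the `surprise` helper and the event list are our own illustration, not part of the original page):

```python
import math

def surprise(p: float) -> float:
    """Self-information, in bits, of an event with probability p."""
    return math.log2(1 / p)

# The three events from the table above.
events = {
    "Someone tells you 1 = 1": 1.0,                # certain, so zero surprise
    "Wrong answer on a 4-choice question": 3 / 4,  # 3 of the 4 choices are wrong
    "Correct answer on a True/False question": 1 / 2,
}

for event, p in events.items():
    # The more probable the outcome, the less information it carries.
    print(f"{event}: p = {p}, surprise = {surprise(p):.3f} bits")
```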
Student Grading
How much information can we get from a single grade? The average information of a grade distribution is its entropy, <math>H = -\sum_i p_i \log_2 p_i</math>. Note that the maximum information occurs when all the grades have equal probability. Consider the following grading schemes (a numerical check follows the list):
- For Pass/Fail grades, the possible outcomes are <math>\{\text{Pass}, \text{Fail}\}</math> with probabilities <math>\left\{\tfrac{1}{2}, \tfrac{1}{2}\right\}</math>. Thus,

  <math>H = -\sum_i p_i \log_2 p_i = -\left(\tfrac{1}{2}\log_2\tfrac{1}{2} + \tfrac{1}{2}\log_2\tfrac{1}{2}\right) = \log_2 2 = 1 \text{ bit}</math>   (1)

- For grades = <math>\{A, B, C, D\}</math> with probabilities <math>\left\{\tfrac{1}{4}, \tfrac{1}{4}, \tfrac{1}{4}, \tfrac{1}{4}\right\}</math>, we get:

  <math>H = \log_2 4 = 2 \text{ bits}</math>   (2)

- For grades = <math>\{A, B, C, D, F\}</math> with probabilities <math>\left\{\tfrac{1}{5}, \tfrac{1}{5}, \tfrac{1}{5}, \tfrac{1}{5}, \tfrac{1}{5}\right\}</math>, we have:

  <math>H = \log_2 5 \approx 2.32 \text{ bits}</math>   (3)

- If we have all <math>N</math> possible grades with equal probabilities <math>\left\{\tfrac{1}{N}, \dots, \tfrac{1}{N}\right\}</math>, we have:

  <math>H = \log_2 N \text{ bits}</math>   (4)
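
The values in equations (1)–(4) are easy to verify numerically. Below is a minimal sketch (the `entropy` helper and the scheme names are our own) that computes <math>H = -\sum_i p_i \log_2 p_i</math> for each grading scheme, and also shows that an unequal distribution carries less information per grade:

```python
import math

def entropy(probs: list[float]) -> float:
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

schemes = {
    "Pass/Fail":     [1 / 2] * 2,  # eq. (1): log2(2) = 1 bit
    "A, B, C, D":    [1 / 4] * 4,  # eq. (2): log2(4) = 2 bits
    "A, B, C, D, F": [1 / 5] * 5,  # eq. (3): log2(5) ~ 2.32 bits
}

for name, probs in schemes.items():
    print(f"{name}: H = {entropy(probs):.3f} bits")

# For N equally likely grades, H = log2(N), which is eq. (4).
# Any unequal distribution yields less information per grade:
skewed = [0.7, 0.1, 0.1, 0.1]  # e.g. a class that gives mostly A's
print(f"Skewed 4-grade scheme: H = {entropy(skewed):.3f} bits (< 2 bits)")
```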