Let's look at a few applications of the concept of information and entropy.
== Surprise! The Unexpected Observation ==
Information can be thought of as the amount of surprise at seeing an event. Note that a highly probable outcome is not surprising. An event that occurs with probability <math>p</math> carries <math>I=\log_2\frac{1}{p}</math> bits of information, so the rarer the event, the larger the surprise. Consider the following events:
{| class="wikitable"
! Event !! Probability !! Information (Surprise)
|-
| Someone tells you <math>1+1=2</math>.
|<math>1</math>
|<math>\log_2\left(1\right)=0\,\mathrm{bits}</math>
|-
| You got the wrong answer on a 4-choice multiple choice question.
|<math>\frac{3}{4}</math>
|<math>\log_2\left(\frac{4}{3}\right)=0.42\,\mathrm{bits}</math>
|-
| You guessed correctly on a 4-choice multiple choice question.
|<math>\frac{1}{4}</math>
|<math>\log_2\left(4\right)=2\,\mathrm{bits}</math>
|-
| You got the correct answer in a True or False question.
|<math>\frac{1}{2}</math>
|<math>\log_2\left(2\right)=1\,\mathrm{bit}</math>
|-
| You rolled a seven on rolling a pair of dice.
|<math>\frac{6}{36}</math>
|<math>\log_2\left(6\right)=2.58\,\mathrm{bits}</math>
|-
| Winning the Ultra Lotto 6/58 jackpot.
|<math>\frac{1}{40400000}</math>
|<math>\log_2\left(40400000\right)=25.27\,\mathrm{bits}</math>
|}
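To see where these numbers come from, here is a minimal Python sketch that evaluates <math>I=\log_2\frac{1}{p}</math> for each row of the table. The <code>surprisal</code> helper and the event list are our own illustration, not part of the original activity:

<syntaxhighlight lang="python">
import math

def surprisal(p):
    """Information content (surprise) of an event with probability p, in bits."""
    return -math.log2(p)

# Probabilities taken from the table above.
events = {
    "Wrong answer on a 4-choice multiple choice question": 3 / 4,
    "Correct guess on a 4-choice multiple choice question": 1 / 4,
    "Correct answer on a True or False question": 1 / 2,
    "A seven on a roll of a pair of dice": 6 / 36,
    "Ultra Lotto 6/58 jackpot": 1 / 40_400_000,
}

for event, p in events.items():
    print(f"{event}: {surprisal(p):.2f} bits")
</syntaxhighlight>

Each printed value matches the last column of the table; note that halving the probability of an event adds exactly one bit of surprise.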
== Student Grading ==
How much information can we get from a single grade? Note that the maximum entropy occurs when all the grades have equal probability: for <math>N</math> equally likely grades, <math>H=\log_2 N</math> bits.
* For Pass/Fail grades, the possible outcomes are <math>\{\mathrm{Pass},\mathrm{Fail}\}</math> with probabilities <math>\left\{\frac{1}{2},\frac{1}{2}\right\}</math>. Thus,

<math>H=\log_2\left(2\right)=1\,\mathrm{bit}</math> (1)
* For grades <math>\{1.0, 2.0, 3.0, 4.0, 5.0\}</math> with probabilities <math>\frac{1}{5}</math> each, we get:

<math>H=\log_2\left(5\right)=2.32\,\mathrm{bits}</math> (2)

* For grades <math>\{1.0, 1.5, 2.0, 2.5, 3.0, 4.0, 5.0\}</math> with probabilities <math>\frac{1}{7}</math> each, we have:

<math>H=\log_2\left(7\right)=2.81\,\mathrm{bits}</math> (3)

* If we have all eleven possible grades <math>\{1.0, 1.25, 1.5, \ldots, 3.0, 4.0, 5.0\}</math> with probabilities <math>\frac{1}{11}</math> each, we have:

<math>H=\log_2\left(11\right)=3.46\,\mathrm{bits}</math> (4)
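As a quick numerical check of equations (1)–(4), the Python sketch below computes the entropy of a uniform distribution over <math>N</math> grades for the scale sizes used in the list above, and contrasts it with a non-uniform distribution. The <code>entropy</code> helper and the skewed example distribution are our own illustration, not part of the course material:

<syntaxhighlight lang="python">
import math

def entropy(probabilities):
    """Shannon entropy H = -sum(p * log2 p), in bits; zero-probability terms are skipped."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A uniform distribution over N grades reaches the maximum entropy log2(N),
# matching equations (1)-(4) for N = 2, 5, 7, and 11.
for n in (2, 5, 7, 11):
    print(f"N = {n:2d}: H = {entropy([1 / n] * n):.2f} bits")

# A skewed grade distribution carries less information per grade.
skewed = [0.7, 0.1, 0.1, 0.05, 0.05]
print(f"Skewed 5-grade distribution: H = {entropy(skewed):.2f} bits < {math.log2(5):.2f} bits")
</syntaxhighlight>

Because the uniform distribution maximizes entropy, <math>\log_2 N</math> is an upper bound: a class where most students receive the same grade conveys fewer bits per grade.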