Let's look at a few applications of the concepts of information and entropy.
Surprise! The Unexpected Observation
Information can be thought of as the amount of surprise at observing an event: the less probable an event, the more surprising, and hence more informative, its occurrence. A highly probable outcome is not surprising at all. Quantitatively, the information (in bits) of an event with probability $p$ is $I = \log_2 \frac{1}{p} = -\log_2 p$. Consider the following events:
| Event | Probability $p$ | Information (Surprise) $-\log_2 p$ |
| --- | --- | --- |
| Someone tells you something that is certain. | $1$ | $0$ bits |
| You got the wrong answer on a 4-choice multiple-choice question (guessing at random). | $3/4$ | $\log_2 \frac{4}{3} \approx 0.415$ bits |
| You got the correct answer on a True or False question (guessing at random). | $1/2$ | $\log_2 2 = 1$ bit |
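To make the arithmetic concrete, here is a minimal Python sketch (the helper name `surprise_bits` is our own) that computes the surprise values in the table from $I = -\log_2 p$:

```python
import math

def surprise_bits(p: float) -> float:
    """Information (surprise) of an event with probability p, in bits."""
    if not 0.0 < p <= 1.0:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

events = {
    "certain statement": 1.0,                       # no surprise at all
    "wrong answer on 4-choice question": 3 / 4,     # fairly likely when guessing
    "correct answer on True/False question": 1 / 2, # a fair coin flip
}

for name, p in events.items():
    print(f"{name}: p = {p:.3f}, I = {surprise_bits(p):.3f} bits")
```

Running it prints $0$, $\approx 0.415$, and $1$ bits, matching the table.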
Student Grading
How much information can we get from a single grade? The answer is the entropy of the grade distribution, $H = -\sum_i p_i \log_2 p_i$. Note that the maximum information occurs when all the grades have equal probability, as the following cases illustrate (a sketch verifying them follows the list).
- For Pass/Fail grades, the possible outcomes are $\{P, F\}$ with probabilities $\{\frac{1}{2}, \frac{1}{2}\}$. Thus,

  $$H = -\sum_{i=1}^{2} \frac{1}{2} \log_2 \frac{1}{2} = \log_2 2 = 1 \text{ bit} \tag{1}$$
- For grades $= \{A, B, C, D\}$ with probabilities $\{\frac{1}{4}, \frac{1}{4}, \frac{1}{4}, \frac{1}{4}\}$, we get:

  $$H = -\sum_{i=1}^{4} \frac{1}{4} \log_2 \frac{1}{4} = \log_2 4 = 2 \text{ bits} \tag{2}$$
- For grades $= \{A, B, C, D, F\}$ with probabilities $\{\frac{1}{5}, \frac{1}{5}, \frac{1}{5}, \frac{1}{5}, \frac{1}{5}\}$, we have:

  $$H = -\sum_{i=1}^{5} \frac{1}{5} \log_2 \frac{1}{5} = \log_2 5 \approx 2.32 \text{ bits} \tag{3}$$
- If we have all $N$ possible grades with equal probabilities $\frac{1}{N}$, we have:

  $$H = -\sum_{i=1}^{N} \frac{1}{N} \log_2 \frac{1}{N} = \log_2 N \text{ bits} \tag{4}$$
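As a check on equations (1)–(4), the following Python sketch computes the entropy of each equal-probability grade distribution (the helper name `entropy_bits` and the choice $N = 12$ in the last case are ours, for illustration only):

```python
import math

def entropy_bits(probs) -> float:
    """Shannon entropy H = -sum(p * log2 p), in bits; 0 * log 0 is taken as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Equal-probability grade distributions from equations (1)-(4)
cases = {
    "Pass/Fail":     [1 / 2] * 2,    # (1) -> 1 bit
    "A, B, C, D":    [1 / 4] * 4,    # (2) -> 2 bits
    "A, B, C, D, F": [1 / 5] * 5,    # (3) -> log2(5) ~ 2.32 bits
    "N = 12 grades": [1 / 12] * 12,  # (4) -> log2(N), here ~ 3.58 bits
}

for name, dist in cases.items():
    print(f"{name}: H = {entropy_bits(dist):.3f} bits")
```

Any unequal distribution over the same grades would yield a strictly smaller $H$, which is why equal probabilities give the maximum information per grade.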