Introduction to CoE 161
Revision as of 15:17, 9 September 2020
Welcome to CoE 161 / CoE 197!
Since we are offering this class remotely, there will be many changes to our normal course delivery:
- There will be no face-to-face lecture classes. All the material will be made available via this site.
- There will be more emphasis on student-centric activities, e.g. analysis, design, and simulations. Thus, you will be mostly "learning by doing". In this context, we will set aside an hour every week for consultations and questions via video-conferencing.
- Grades will be based on the submitted deliverables from the activities. Though we will not be very strict regarding the deadlines, it is a good idea to keep up with the class schedule and avoid cramming later in the semester.
Let's get started!
Measuring Complexity
For any system we are studying, designing, or building, an interesting and important question is usually: "How complex is this system?" or "How complex should this system be?". In general, we want to be able to compare two systems, and be able to say that one system is more complex than the other. In this case, having a numerical metric would be very convenient.
Possible ways of measuring complexity (obviously not a complete list!):
- Human observation, which makes any such rating subjective.
- Complexity based on the number of parts, components, or distinct elements. However, this depends on what counts as a "part", which in turn depends on the observer's scale, from functional components down to individual atoms in the extreme case, as shown in Fig. 1.
- Number of dimensions. This is sometimes hard to measure, e.g. degrees of freedom, etc.
- Number of parameters controlling the system.
- The minimal description needed. However, we need to figure out which language to use for the description.
- Information content. We then need to figure out how to define and measure information.
- Minimum generator or constructor (to build the system). We need to define what tools and methods we need to use to build the system.
- Minimum energy and/or time to construct the system. However, some systems evolve with time, so defining the beginning and end points can be difficult.
All of these measures are associated with a model of the system in question. Thus, different observers could use different models, and therefore arrive at different notions of the complexity of the same system. We do not expect to come up with a single universal measure of complexity, but we can explore a framework that is optimal for (a) a particular observer, (b) a particular context, and (c) a particular purpose. Let us focus on the measures related to how unexpected an observation is, an approach known as Information Theory.
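To get a feel for the "minimal description" and "information content" measures above, here is a small sketch that uses compressed size as a crude, observer-dependent proxy for description length. The function names and the use of zlib are our own illustrative choices, not part of the course material: a highly regular string should compress much better than an irregular one of the same length, because it admits a shorter description.

```python
import random
import zlib

def compressed_size(s: str) -> int:
    """Length in bytes of the zlib-compressed string: a rough
    stand-in for the 'minimal description length' of s."""
    return len(zlib.compress(s.encode("utf-8"), 9))

# A highly regular string vs. a pseudo-random one of the same length.
regular = "ab" * 500
random.seed(42)
messy = "".join(random.choice("ab") for _ in range(1000))

# The regular string compresses to far fewer bytes than the messy one,
# even though both are 1000 characters long.
print(compressed_size(regular))
print(compressed_size(messy))
```

Note that the result depends on the compressor (the "language" of the description), which is exactly the caveat raised in the list above.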
Some of the ideas and concepts that follow are based on basic probability ideas. You can review them here: Probability Review I.
Basics of Information Theory
We want to create a metric for measuring the information we get from observing the occurrence of an event that has a probability <math>p</math>. Let us ignore all other features of the event, except whether the event occurred or not. We can then think of the event as seeing if a symbol, whose probability of occurring is <math>p</math>, appears. Thus, our definition of information is in terms of the probability <math>p</math>.
An Axiomatic Approach
Let us define our information measure as <math>I\left(p\right)</math>. We want it to have several properties:
| Axiom | Property |
| --- | --- |
| Information is a non-negative quantity. | <math>I\left(p\right) \geq 0</math> |
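As a quick sanity check of this axiom, the sketch below assumes the standard Shannon definition <math>I\left(p\right) = \log_2 \left(1/p\right)</math> (which the axiomatic development is building toward; it is an assumption at this point in the text) and verifies that it is indeed non-negative for valid probabilities:

```python
import math

def information(p: float) -> float:
    """Self-information, in bits, of an event with probability p,
    assuming the standard definition I(p) = log2(1/p)."""
    if not 0 < p <= 1:
        raise ValueError("p must be in (0, 1]")
    return math.log2(1 / p)

for p in [1.0, 0.5, 0.25, 0.001]:
    # Since 0 < p <= 1, 1/p >= 1 and log2(1/p) >= 0: the axiom holds.
    assert information(p) >= 0
    print(f"I({p}) = {information(p):.3f} bits")
```

A certain event (<math>p = 1</math>) carries zero information, and the information grows without bound as <math>p</math> approaches zero, matching the intuition that rarer events are more surprising.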