CoE 161

Latest revision as of 11:11, 2 March 2021

  • Introduction to Information and Complexity (2018 Curriculum)
    • Advanced course on information theory and computational complexity, starting from Shannon's information theory and Turing's theory of computation, leading to the theory of Kolmogorov complexity.
  • Semester Offered: 2nd semester
  • Course Credit: Lecture: 3 units

Prerequisites

  • EEE 111 (Introduction to Programming and Computation)
  • EEE 137 (Probability, Statistics and Random Processes in Electrical and Electronics Engineering)

Course Goal

  • Introduce fundamental tools and frameworks to understand information and complexity in the design of computer systems.

Specific Goals

  • Introduce fundamental tools for determining the minimum amount of computational resources needed to algorithmically solve a problem.
    • Information Theory
    • Computational Complexity Theory

Content

This course covers information theory and computational complexity in a unified way. It develops the subject from first principles, building up from the basic premise of information to Shannon's information theory, and from the basic premise of computation to Turing's theory of computation. The duality between the two theories leads naturally to the theory of Kolmogorov complexity. The technical topics covered include source coding, channel coding, rate-distortion theory, Turing machines, computability, computational complexity, and algorithmic entropy, as well as specialized topics and projects.

We want to answer the question: How good is my solution (e.g. algorithm, architecture, system, etc.) to a computer engineering problem?

  • Information Theory: data representation efficiency
    • What is information?
    • How do we measure information?
  • Computational Complexity: complexity in time and space
    • Complexity of algorithms
    • Complexity of objects/data

Syllabus

Module 1

  Topics: Introduction to CoE 161
    • Basics of Information Theory
    • Information Measures
    • Law of Large Numbers

  Outcomes:
    • Understand the basis of Information Theory.
    • Appreciate the breadth and implications of Information Theory.
    • Derive an expression for Entropy and its bounds.

  Resources:
    • Probability Review I

  Activities:
    • 161-A1.1: Who is Claude Shannon?
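As a taste of the Module 1 outcome on deriving entropy and its bounds, here is a minimal Python sketch (illustrative only, not part of the official course materials; the function name `entropy` and the example distributions are our own) that computes the Shannon entropy H(X) = −Σ p·log₂ p and checks the upper bound H(X) ≤ log₂|X|:

```python
import math

def entropy(p):
    """Shannon entropy, in bits, of a discrete distribution p."""
    assert abs(sum(p) - 1.0) < 1e-9, "probabilities must sum to 1"
    # Terms with p_i = 0 contribute nothing (0 * log 0 is taken as 0).
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

fair = [0.25] * 4              # uniform over 4 symbols
skewed = [0.7, 0.1, 0.1, 0.1]  # same alphabet, redundant source

print(entropy(fair))    # 2.0 bits, the maximum log2(4)
print(entropy(skewed))  # strictly less than 2 bits
assert entropy(fair) <= math.log2(4) + 1e-9
```

The uniform distribution attains the bound; any skew lowers the entropy, which is exactly the slack that source coding exploits.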

Module 2

  Topics: Shannon's Communication Theory
    • McMillan/Kraft Inequality
    • Shannon's Noiseless Coding Theorem

  Outcomes:
    • Optimally encode redundant messages based on the entropy of the source.

  Activities:
    • 161-A2.1: Source Coding
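To make the Module 2 outcome concrete, the sketch below (our own illustration, not course-issued code; `huffman_lengths` and the example distribution are assumptions) builds binary Huffman codeword lengths and verifies both the Kraft inequality and the noiseless coding theorem bound H(X) ≤ L < H(X) + 1:

```python
import heapq
import math

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given probabilities."""
    # Heap entries: (probability, unique id, symbol indices in this subtree).
    # The id breaks ties so the heap never compares the lists.
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # every merge adds one bit to these symbols
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]          # dyadic, so the code is exactly optimal
L = huffman_lengths(probs)
avg = sum(p * l for p, l in zip(probs, L))
H = -sum(p * math.log2(p) for p in probs)
assert sum(2.0 ** -l for l in L) <= 1.0 + 1e-9  # Kraft inequality holds
assert H - 1e-9 <= avg < H + 1                  # noiseless coding theorem bound
```

For dyadic probabilities the average length equals the entropy (1.75 bits here); in general Huffman coding lands within one bit of it.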
Module 3

  Topics: Mutual Information
    • Conditional Entropy
    • Joint Entropy
    • Channel Capacity

  Outcomes:
    • Understand the role of mutual information in noisy channels.

  Activities:
    • 161-A3.1: Mutual Information and Channel Capacity
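The Module 3 outcome can be previewed with a small sketch for the binary symmetric channel (an illustration we add here, not course-issued code; the crossover probability 0.1 is an arbitrary choice). It uses I(X;Y) = H(Y) − H(Y|X) and checks that the uniform input achieves the capacity C = 1 − h₂(ε):

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_information(pi, eps):
    """I(X;Y) for a binary symmetric channel with P(X=1)=pi, crossover eps."""
    py1 = pi * (1 - eps) + (1 - pi) * eps  # P(Y=1)
    return h2(py1) - h2(eps)               # I(X;Y) = H(Y) - H(Y|X)

eps = 0.1
capacity = bsc_mutual_information(0.5, eps)       # uniform input maximizes I(X;Y)
assert abs(capacity - (1 - h2(eps))) < 1e-9       # C = 1 - h2(eps)
assert bsc_mutual_information(0.3, eps) < capacity
```

Sweeping `pi` over (0, 1) shows a single maximum at 1/2, which is why the BSC capacity has the closed form 1 − h₂(ε).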
Module 4

  Topics: The Data Processing Inequality
    • Chain Rules
    • Markov Chains
    • Sufficient Statistics
    • Fano's Inequality

  Outcomes:
    • Understand the implications of modeling a system using Markov chains, and, from Fano's inequality, determine the bounds on the probability of error in these systems.

  Activities:
    • 161-A4.1: The Data Processing Inequality and Fano's Inequality
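As a preview of Module 4, the sketch below (our own illustration; the two crossover probabilities 0.1 and 0.2 are arbitrary choices) builds a Markov chain X → Y → Z from two cascaded binary symmetric channels and verifies the data processing inequality I(X;Z) ≤ I(X;Y) by exact enumeration:

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint pmf given as a dict {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Markov chain X -> Y -> Z: X is a uniform bit, Y is X through a BSC(0.1),
# and Z is Y through a second BSC(0.2).  Enumerate the joint pmfs exactly.
joint_xy, joint_xz = {}, {}
for x in (0, 1):
    for y in (0, 1):
        p_xy = 0.5 * (0.9 if y == x else 0.1)
        joint_xy[(x, y)] = joint_xy.get((x, y), 0.0) + p_xy
        for z in (0, 1):
            p_xyz = p_xy * (0.8 if z == y else 0.2)
            joint_xz[(x, z)] = joint_xz.get((x, z), 0.0) + p_xyz

# Data processing inequality: processing can only destroy information.
assert mutual_information(joint_xz) <= mutual_information(joint_xy) + 1e-12
```

No deterministic or random post-processing of Y can recover information about X that the first channel already lost, which is the content of the inequality.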
Module 5

  Topics: Asymptotic Equipartition Property (AEP)
    • Law of Large Numbers
    • The Typical Set
    • Consequences of AEP
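The AEP of Module 5 can be seen numerically with a short simulation (our own illustration; the source parameter 0.3 and the seed are arbitrary choices): for i.i.d. draws, −(1/n) log₂ p(X₁…Xₙ) concentrates around H(X) as n grows, which is what makes the typical set "typical":

```python
import math
import random

random.seed(161)
p_heads = 0.3
# H(X) = h2(0.3) ~ 0.8813 bits for a Bernoulli(0.3) source.
H = -(p_heads * math.log2(p_heads) + (1 - p_heads) * math.log2(1 - p_heads))

def normalized_log_prob(n):
    """-(1/n) log2 p(X1..Xn) for n i.i.d. Bernoulli(p_heads) draws."""
    heads = sum(random.random() < p_heads for _ in range(n))
    return -(heads * math.log2(p_heads)
             + (n - heads) * math.log2(1 - p_heads)) / n

# By the AEP (via the law of large numbers), these values approach H(X).
for n in (100, 10_000, 200_000):
    print(n, normalized_log_prob(n))
assert abs(normalized_log_prob(200_000) - H) < 0.01
```

Equivalently, a long sequence from this source has probability close to 2^(−nH), and roughly 2^(nH) such sequences carry almost all of the probability mass.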

References

  • Cover, T. M., and Thomas, J. A., Elements of Information Theory, 2nd ed., Wiley-Interscience, 2006.
  • Michael Sipser, Introduction to the Theory of Computation, 3rd edition, Cengage Learning, 2013.
  • Cristopher Moore and Stephan Mertens, The Nature of Computation, Oxford University Press, Inc., 2011, USA.

Additional Reading Materials

  • Sanjeev Arora and Boaz Barak. (2009), Computational Complexity: A Modern Approach (1st ed.), Cambridge University Press, New York, NY, USA.
  • Jones, Neil D., Computability and Complexity: From a Programming Perspective, The MIT Press, Cambridge, Massachusetts, 1997.
  • Jon Kleinberg and Christos Papadimitriou, Computability and Complexity, Computer Science: Reflections on the Field, Reflections from the Field, Natl. Academies Press, 2004.
  • Robert M. Gray, Entropy and Information Theory, 1st ed. (corrected), Springer-Verlag, New York, 2013.