CoE 161 S2 AY 2020-2021

  • Introduction to Information and Complexity (2018 Curriculum)
    • Advanced course on information theory and computational complexity, starting from Shannon's information theory and Turing's theory of computation, leading to the theory of Kolmogorov complexity.
  • Semester Offered: 2nd semester
  • Course Credit: Lecture: 3 units

Prerequisites

  • EEE 111 (Introduction to Programming and Computation)
  • EEE 137 (Probability, Statistics and Random Processes in Electrical and Electronics Engineering)

Course Goal

  • Introduce fundamental tools and frameworks to understand information and complexity in the design of computer systems.

Specific Goals

  • Introduce fundamental tools for determining the minimum amount of computational resources needed to algorithmically solve a problem.
    • Information Theory
    • Computational Complexity Theory

Content

This course covers information theory and computational complexity in a unified way. It develops the subject from first principles, building up from the basic premise of information to Shannon's information theory, and from the basic premise of computation to Turing's theory of computation. The duality between the two theories leads naturally to the theory of Kolmogorov complexity. The technical topics covered include source coding, channel coding, rate-distortion theory, Turing machines, computability, computational complexity, and algorithmic entropy, as well as specialized topics and projects.

We want to answer the question: how good is my solution (e.g., an algorithm, an architecture, or a system) to a computer engineering problem?

  • Information Theory: data representation efficiency
    • What is information?
    • How do we measure information?
  • Computational Complexity: complexity in time and space
    • Complexity of algorithms
    • Complexity of objects/data

Syllabus

Each module lists its topics, intended outcomes, and, where available, resources and activities.

Module 1: Information measures

  Topics:
    • What is information?
    • Entropy
    • Conditional Entropy and Mutual Information
  Outcomes:
    • Understand the basis of Information Theory.
    • Appreciate the breadth and implications of Information Theory.
    • Derive an expression for Entropy and its bounds.
  Resources:
    • Probability Review I
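
A short numerical sketch (ours, not part of the course materials) of the Module 1 outcomes: compute the entropy H(X) = -sum_x p(x) log2 p(x) of a discrete source, and check that the uniform distribution maximizes it.

    import math

    def entropy(p):
        """Shannon entropy, in bits, of a probability mass function p."""
        return -sum(px * math.log2(px) for px in p if px > 0)

    # A fair coin carries 1 bit per toss; a biased coin carries less.
    print(entropy([0.5, 0.5]))    # 1.0
    print(entropy([0.9, 0.1]))    # ~0.469
    # Entropy is maximized by the uniform distribution: H(X) <= log2(|X|).
    print(entropy([0.25] * 4))    # 2.0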

Module 2: The source coding problem

  Topics:
    • Source coding
    • Uniquely Decodable Codes
    • Huffman codes
  Outcomes:
    • Decode a message compressed using a prefix-free code.
    • Optimally encode redundant messages based on the entropy of the source.
  Activities:
    • 161-A2.2: Prefix-free codes
    • 161-A2.3: Huffman codes
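
Both Module 2 outcomes can be demonstrated in a few lines of Python (an illustrative sketch, not the official 161-A2.2/161-A2.3 solutions): build a Huffman code for a small source, compare its average length to the source entropy, and decode a prefix-free bit stream.

    import heapq, math

    def huffman(pmf):
        """Return a dict symbol -> codeword for the pmf {symbol: probability}."""
        heap = [(p, i, {s: ''}) for i, (s, p) in enumerate(pmf.items())]
        heapq.heapify(heap)
        count = len(heap)               # tie-breaker so dicts are never compared
        while len(heap) > 1:
            p0, _, c0 = heapq.heappop(heap)
            p1, _, c1 = heapq.heappop(heap)
            merged = {s: '0' + w for s, w in c0.items()}
            merged.update({s: '1' + w for s, w in c1.items()})
            heapq.heappush(heap, (p0 + p1, count, merged))
            count += 1
        return heap[0][2]

    pmf = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}
    code = huffman(pmf)
    avg = sum(p * len(code[s]) for s, p in pmf.items())
    H = -sum(p * math.log2(p) for p in pmf.values())
    print(code, avg, H)              # for this dyadic pmf, avg == H == 1.75

    # Prefix-free decoding: extend the buffer until it matches a codeword.
    inv = {w: s for s, w in code.items()}
    out, buf = [], ''
    for bit in ''.join(code[s] for s in 'abad'):
        buf += bit
        if buf in inv:
            out.append(inv[buf])
            buf = ''
    print(''.join(out))              # 'abad'
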
Module 3: The data-processing inequality

  Topics:
    • Independence and Markov Chains
    • Nonnegativity of Information Measures
    • The Data-Processing Inequality
  Outcomes:
    • Understand the role of mutual information in noisy channels.
    • Understand the implications of modeling a system as a Markov chain and, using Fano's inequality, bound the probability of error in such systems.
  Activities:
    • 161-A3.2: Data-processing inequality
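
A small numerical check of the data-processing inequality (an illustrative sketch; the crossover probabilities below are arbitrary): for a Markov chain X -> Y -> Z formed by cascading two binary symmetric channels, I(X;Z) never exceeds I(X;Y).

    import math

    def H2(p):
        """Binary entropy function, in bits."""
        return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_mi(q):
        """I(X;Y) for a uniform input X through a BSC with crossover q."""
        return 1.0 - H2(q)

    a, b = 0.1, 0.2                   # crossover probabilities of the two BSCs
    q_xz = a * (1 - b) + (1 - a) * b  # effective crossover of the cascade: 0.26
    print(bsc_mi(a))                  # I(X;Y) ~ 0.531
    print(bsc_mi(q_xz))               # I(X;Z) ~ 0.173 <= I(X;Y)
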
Module 4: The Asymptotic Equipartition Property (AEP)

  Topics:
    • Asymptotic Equipartition Property
    • Jointly typical sequences
    • Channel coding theorem
  Outcomes:
    • Understand the proof of Shannon's noisy channel coding theorem through joint typicality and the AEP.
  Activities:
    • 161-A4.2: Typical set decoding
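
The AEP becomes concrete with a short computation (a sketch; the Bernoulli(0.3) source is chosen arbitrarily): the typical set is a tiny fraction of all length-n sequences, yet it captures almost all of the probability.

    import math

    p, n, eps = 0.3, 100, 0.1
    H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)   # ~0.881 bits/symbol

    count, mass = 0, 0.0
    for k in range(n + 1):
        # all C(n, k) sequences with k ones share the same probability
        prob = (p ** k) * ((1 - p) ** (n - k))
        # a sequence is typical if its per-symbol surprisal is within eps of H
        if abs(-math.log2(prob) / n - H) <= eps:
            count += math.comb(n, k)
            mass += math.comb(n, k) * prob
    print(count / 2 ** n, mass)   # ~1% of all sequences, ~94% of the probability
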
Module 5: Polar codes

  Topics:
    • Achieving capacity
    • Channel polarization
  Activities:
    • 161-A5.1: Polar codes
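
Channel polarization has a closed form for the binary erasure channel, which makes a compact illustration (a sketch of that special case only, not the general construction covered in class): one polarization step turns two copies of BEC(e) into a worse channel BEC(2e - e^2) and a better channel BEC(e^2), and recursing drives the capacities toward 0 or 1.

    def polarize(e):
        """One polarization step for a BEC with erasure probability e."""
        return 2 * e - e * e, e * e   # (worse, better) erasure probabilities

    chans = [0.5]                     # start from BEC(0.5), capacity 0.5
    for _ in range(4):
        chans = [q for e in chans for q in polarize(e)]
    # After a few steps the capacities I = 1 - e cluster near 0 and 1.
    print(sorted(round(1 - e, 3) for e in chans))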

Module 6: Turing Machines
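
A minimal Turing-machine simulator (an illustrative sketch, not the formal definition used in class): the machine is a transition table mapping (state, symbol) to (write, move, next state), run on a tape that grows on demand. The example machine increments a binary number stored least-significant-bit first.

    def run_tm(delta, tape, state='q0', accept='qa', max_steps=1000):
        """Run a deterministic single-tape Turing machine; return (state, tape)."""
        cells, head = dict(enumerate(tape)), 0
        for _ in range(max_steps):
            if state == accept:
                break
            sym = cells.get(head, '_')           # '_' is the blank symbol
            write, move, state = delta[(state, sym)]
            cells[head] = write
            head += 1 if move == 'R' else -1
        return state, ''.join(cells[i] for i in sorted(cells))

    # Increment: flip 1s to 0s while the carry propagates, then write the carry.
    delta = {
        ('q0', '1'): ('0', 'R', 'q0'),
        ('q0', '0'): ('1', 'R', 'qa'),
        ('q0', '_'): ('1', 'R', 'qa'),           # all bits carried: extend tape
    }
    print(run_tm(delta, '110'))                  # ('qa', '001'): 3 + 1 = 4, LSB-first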

Modules 7-14: (no topics listed)

References

  • Cover, T. M. and Thomas, J. A., Elements of Information Theory, 2nd ed., Wiley-Interscience, 2006.
  • Yeung, R., Information Theory and Network Coding, Springer, 2008.
  • Sipser, M., Introduction to the Theory of Computation, 3rd ed., Cengage Learning, 2013.
  • Moore, C. and Mertens, S., The Nature of Computation, Oxford University Press, 2011.

Additional Reading Materials

  • Arora, S. and Barak, B., Computational Complexity: A Modern Approach, 1st ed., Cambridge University Press, 2009.
  • Jones, N. D., Computability and Complexity: From a Programming Perspective, The MIT Press, 1997.
  • Kleinberg, J. and Papadimitriou, C., "Computability and Complexity," in Computer Science: Reflections on the Field, Reflections from the Field, National Academies Press, 2004.
  • Gray, R. M., Entropy and Information Theory, 1st ed. (corrected), Springer-Verlag, 2013.