CoE 161 S2 AY 2020-2021
- Introduction to Information and Complexity (2018 Curriculum)
- Advanced course on information theory and computational complexity, starting from Shannon's information theory and Turing's theory of computation, leading to the theory of Kolmogorov complexity.
- Semester Offered: 2nd semester
- Course Credit: Lecture: 3 units
Prerequisites
- EEE 111 (Introduction to Programming and Computation)
- EEE 137 (Probability, Statistics and Random Processes in Electrical and Electronics Engineering)
Course Goal
- Introduce fundamental tools and frameworks to understand information and complexity in the design of computer systems.
Specific Goals
- Introduce fundamental tools for determining the minimum amount of computational resources needed to algorithmically solve a problem.
- Information Theory
- Computational Complexity Theory
Content
This course covers information theory and computational complexity in a unified way. It develops the subject from first principles, building up from the basic premise of information to Shannon's information theory, and from the basic premise of computation to Turing's theory of computation. The duality between the two theories leads naturally to the theory of Kolmogorov complexity. The technical topics covered include source coding, channel coding, rate-distortion theory, Turing machines, computability, computational complexity, and algorithmic entropy, as well as specialized topics and projects.
We want to answer the question: How good is my solution (e.g., an algorithm, architecture, or system) to a computer engineering problem?
- Information Theory: data representation efficiency
- What is information?
- How do we measure information? (see the entropy sketch after this list)
- Computational Complexity: complexity in time and space
- Complexity of algorithms
- Complexity of objects/data
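To make "How do we measure information?" concrete, here is a minimal sketch (illustrative only; the entropy helper is our own, not course code) of Shannon entropy, H(X) = -sum_x p(x) log2 p(x), the average number of bits needed to describe one outcome of X:

```python
# Minimal sketch: measuring information as Shannon entropy,
# H(X) = -sum_x p(x) * log2(p(x)), in bits per symbol.
from math import log2

def entropy(pmf):
    """Shannon entropy (in bits) of a probability mass function."""
    return -sum(p * log2(p) for p in pmf if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less,
# because its outcomes are more predictable.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # ~0.469
```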
Syllabus
| Module | Topics | Outcomes | Resources | Activities |
|---|---|---|---|---|
| 1 | Information measures | | | |
| 2 | The source coding problem | | | |
| 3 | The data-processing inequality: Independence and Markov Chains; Nonnegativity of Information Measures; The Data-Processing Inequality | Understand the role of mutual information in noisy channels. | | 161-A3.2: Data-processing inequality |
| 4 | The Asymptotic Equipartition Property (AEP): Asymptotic Equipartition Property; Jointly typical sequences; Channel coding theorem | Understand the proof of Shannon's noisy channel coding theorem through joint typicality and the AEP (see the sketch after this table). | | 161-A4.2: Typical set decoding |
| 5 | Polar codes: Achieving capacity; Channel polarization | | | 161-A5.1: Polar codes |
| 6 | Turing Machines | | | |
| 7 | | | | |
| 8 | | | | |
| 9 | | | | |
| 10 | | | | |
| 11 | | | | |
| 12 | | | | |
| 13 | | | | |
| 14 | | | | |
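As a companion to Module 4, the sketch below (our own illustration with an assumed biased binary source; not part of the course materials) samples i.i.d. sequences and shows that -(1/n) log2 p(X1, ..., Xn) concentrates around the entropy H(X) as n grows. This concentration is the Asymptotic Equipartition Property, and the sequences falling within H ± ε of it form the typical set exploited by typical-set decoding in 161-A4.2.

```python
# Illustrative sketch of the Asymptotic Equipartition Property (AEP):
# for X_1, ..., X_n i.i.d. with distribution p, the per-symbol value
# -(1/n) log2 p(X_1, ..., X_n) converges to the entropy H(X).
import random
from math import log2

p = 0.9                                      # P(X = 0) for a biased binary source
H = -(p * log2(p) + (1 - p) * log2(1 - p))   # entropy, about 0.469 bits

def empirical_rate(n):
    """Sample x^n from the source and return -(1/n) log2 p(x^n)."""
    xs = [0 if random.random() < p else 1 for _ in range(n)]
    log_prob = sum(log2(p) if x == 0 else log2(1 - p) for x in xs)
    return -log_prob / n

# The empirical rate approaches H as the block length n grows.
for n in (10, 100, 10_000):
    print(n, round(empirical_rate(n), 3), "vs H =", round(H, 3))
```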
References
- Cover, T. M., and Thomas, J. A., Elements of Information Theory, 2nd ed., Wiley-Interscience, 2006.
- Yeung, R., Information Theory and Network Coding, Springer, 2008.
- Sipser, M., Introduction to the Theory of Computation, 3rd ed., Cengage Learning, 2013.
- Moore, C., and Mertens, S., The Nature of Computation, Oxford University Press, 2011.
Additional Reading Materials
- Arora, S., and Barak, B., Computational Complexity: A Modern Approach, 1st ed., Cambridge University Press, 2009.
- Jones, N. D., Computability and Complexity: From a Programming Perspective, The MIT Press, 1997.
- Kleinberg, J., and Papadimitriou, C., "Computability and Complexity," in Computer Science: Reflections on the Field, Reflections from the Field, National Academies Press, 2004.
- Gray, R. M., Entropy and Information Theory, 1st ed. (corrected), Springer, 2013.