CoE 161 S2 AY 2022-2023

* '''Introduction to Information and Complexity'''
** Introductory course on information theory and computational complexity. We'll start from Shannon's information theory and Turing's theory of computation, then move to the theory of Kolmogorov complexity.
** Semester Offered: 2nd semester
** Course Credit: Lecture: 3 units

== Prerequisites ==
* EEE 111 (Introduction to Programming and Computation)
* EEE 137 (Probability, Statistics and Random Processes in Electrical and Electronics Engineering)

== Course Goal ==
* Introduce fundamental tools and frameworks to understand information and complexity in the design of computer systems.
* Introduce fundamental tools for determining the minimum amount of computational resources needed to algorithmically solve a problem.
** Information Theory
** Computational Complexity Theory

== Content ==
This course covers information theory and computational complexity in a unified way. It develops the subject from first principles, building up from the basic premise of information to Shannon's information theory, and from the basic premise of computation to Turing's theory of computation. The duality between the two theories leads naturally to the theory of Kolmogorov complexity. The technical topics covered include source coding, channel coding, rate-distortion theory, Turing machines, computability, computational complexity, and algorithmic entropy, as well as specialized topics and projects.

We want to answer the question: how good is my solution (e.g. algorithm, architecture, system) to a computer engineering problem?
* Information Theory: data representation efficiency
** What is information?
** How do we measure information?
* Computational Complexity: complexity in time and space
** Complexity of algorithms
** Complexity of objects/data
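To make the question "How do we measure information?" concrete: Shannon's entropy quantifies the average information content of a random outcome, in bits. The snippet below is an illustrative sketch using only the Python standard library; it is not part of the official course materials.

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)) in bits.

    Outcomes with zero probability contribute nothing,
    consistent with the convention 0 * log2(0) = 0.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per toss:
print(entropy([0.5, 0.5]))    # 1.0
# A biased coin is less surprising on average:
print(entropy([0.9, 0.1]))    # ~0.469
# Four equally likely outcomes need 2 bits:
print(entropy([0.25] * 4))    # 2.0
```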
+ | |||
+ | == Instructor Details == | ||
+ | * Louis Alarcon, Ph.D. | ||
+ | ** Email: louis.alarcon@eee.upd.edu.ph | ||
+ | |||
+ | == Class Schedule == | ||
+ | * Section WFQ - Wednesdays and Fridays, 7:00am - 8:30am | ||
+ | * Section WFR - Wednesdays and Fridays, 8:30am - 10:00am | ||
+ | |||
== Syllabus ==
{| class="wikitable"
|-
! scope="col"| Module
! scope="col"| Topics
! scope="col"| Outcomes
! scope="col"| Resources
! scope="col"| Activities
|-
| style="text-align:center;" | 1
|
'''Introduction to Information Theory'''
* [[Engaging introduction to information theory]]
* [[Probability review for warm up]]
|
* Understand the basis of information theory.
* Appreciate the breadth and implications of information theory.
* Review basic probability.
|
|
|-
| style="text-align:center;" | 2
|
'''Mathematical Fundamentals of Information Theory'''
* [[Information and Entropy]]
* [[Joint Entropy, Conditional Entropy, and Mutual Information]]
|
* Understand the basis of information theory.
* Appreciate the breadth and implications of information theory.
* Derive an expression for entropy and its bounds.
* Apply these concepts to theoretical exercises.
|
* [https://www.youtube.com/watch?v=v68zYyaEmEA&t=1631s&ab_channel=3Blue1Brown Information theory and Wordle]
|
|-
| style="text-align:center;" | 3
|
'''Channels and Channel Capacity (2 weeks)'''
* [[Channeling Your Inner Capacity]]
* [[Coding teaser: Source Coding]] ([https://uvle.upd.edu.ph/mod/resource/view.php?id=221489 pdf backup])
|
* Understand the role of mutual information in noisy channels.
* Observe how conditional entropy and mutual information measure noise.
* Learn about Huffman coding.
|
|
* [[CoE161 2S2223 Programming Activity 1| Programming Activity 1]]
* [[CoE161 2S2223 Programming Activity 2| Programming Activity 2]]
|-
| style="text-align:center;" | 4
|
'''Coding Theory'''
* [https://uvle.upd.edu.ph/mod/resource/view.php?id=225137 Coding Theories]
* [https://uvle.upd.edu.ph/mod/resource/view.php?id=225138 Error Correcting Codes]
|
* Learn the basic definitions of coding theory.
* Learn about the Kraft-McMillan inequality.
* Learn about error-correcting codes (ECC).
|
|
|-
| style="text-align:center;" | 5
|
'''Hyperdimensional Computing'''
* [https://uvle.upd.edu.ph/mod/resource/view.php?id=229252 Data Compression]
* [https://uvle.upd.edu.ph/mod/resource/view.php?id=229253 Hyperdimensional Computing]
|
* Learn more about data compression.
* Learn how hyperdimensional computing "compresses" a data set.
|
|
|-
| style="text-align:center;" | 6
|
'''Turing Machines (2 weeks)'''
* What are Turing machines?
* Decidability
* Reducibility
|
* Understand the fundamental mathematical properties of computer hardware and software.
* Determine what can and cannot be computed.
|
|
|-
| style="text-align:center;" | 7
|
'''Kolmogorov Theory of Complexity (2 weeks)'''
* Time complexity
* Space complexity
* Intractability
|
* Investigate whether time, memory, and other computing resources suffice to solve computational problems.
|
|
|}
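As a small taste of Module 4: the Kraft-McMillan inequality says that the codeword lengths <math>l_i</math> of any uniquely decodable code over an <math>r</math>-symbol alphabet must satisfy <math>\sum_i r^{-l_i} \le 1</math>. This is easy to check numerically; the sketch below, assuming binary codewords, is illustrative only and not part of the official course materials.

```python
def kraft_sum(lengths, r=2):
    """Sum of r**(-l) over codeword lengths l.

    By the Kraft-McMillan inequality, a uniquely decodable code
    over an r-symbol alphabet must have kraft_sum(lengths, r) <= 1.
    """
    return sum(r ** (-l) for l in lengths)

# Lengths of the prefix code {0, 10, 110, 111}:
print(kraft_sum([1, 2, 3, 3]))   # 1.0 -> achievable, and the code is complete
# Lengths {1, 1, 2} violate the inequality, so no uniquely
# decodable binary code with these lengths can exist:
print(kraft_sum([1, 1, 2]))      # 1.25
```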
+ | |||
+ | == References == | ||
+ | * Stone, J.V. , ''Information Theory: A Tutorial Introduction'', Sebtel Press, 2015. | ||
+ | * Michael Sipser, ''Introduction to the Theory of Computation'', 3rd edition, Cengage Learning, 2013. | ||
+ | * Cristopher Moore and Stephan Mertens, ''The Nature of Computation'', Oxford University Press, Inc., 2011, USA. | ||
+ | * Applebaum, D. , ''Probability and Information: An Integrated Approach'', Cambridge University Press, 2008. | ||
+ | * Yeung, R., ''Information Theory and Network Coding.'', Springer, 2008. | ||
+ | * Cover, T. M, Thomas, J. A., ''Elements of Information Theory, 2ed.'', Wiley-Interscience, 2006. | ||
+ | * Hankerson, D.R., Harris, G.A., Johnson, P.D. , ''Introduction to Information Theory and Data Compression'', CRC Press, 2003. | ||
+ | * MacKay, D. , ''Information Theory, Inference, and Learning Algorithms'', Cambridge University Press, 2003. | ||
+ | * Shannon, C. E., & Weaver, W., ''The mathematical theory of communication''. Urbana: University of Illinois Press. 1949. | ||
+ | |||
+ | == Additional Reading Materials == | ||
+ | * Robert M. Gray, ''Entropy and Information Theory 1st ed. (corrected)'', Springer-Verlag New York 2013. | ||
+ | * Sanjeev Arora and Boaz Barak. (2009), ''Computational Complexity: A Modern Approach (1st ed.)'', Cambridge University Press, New York, NY, USA. | ||
+ | * Jon Kleinberg and Christos Papadimitriou, ''Computability and Complexity'', Computer Science: Reflections on the Field, Reflections from the Field, Natl. Academies Press, 2004. | ||
+ | * Jones, Neil D., ''Computability and Complexity: From a Programming Perspective'', 1997, The MIT Press, Cambridge, Massachusetts. |