CoE 161 S2 AY 2021-2022
* '''Introduction to Information and Complexity'''
** Introductory course on information theory and computational complexity. We'll start from Shannon's information theory and Turing's theory of computation, then move to the theory of Kolmogorov complexity.
  
 
* Semester Offered: 2nd semester
* Course Credit: Lecture: 3 units

== Prerequisites ==

* EEE 111 (Introduction to Programming and Computation)
* EEE 137 (Probability, Statistics and Random Processes in Electrical and Electronics Engineering)
 
== Course Goal ==
 
* Introduce fundamental tools and frameworks to understand information and complexity in the design of computer systems.
 
=== Specific Goals ===
 
 
* Introduce fundamental tools for determining the minimum amount of computational resources needed to algorithmically solve a problem.
** Information Theory
** Computational Complexity Theory

== Content ==

This course covers information theory and computational complexity in a unified way. It develops the subject from first principles, building up from the basic premise of information to Shannon's information theory, and from the basic premise of computation to Turing's theory of computation. The duality between the two theories leads naturally to the theory of Kolmogorov complexity. The technical topics covered include source coding, channel coding, rate-distortion theory, Turing machines, computability, computational complexity, and algorithmic entropy, as well as specialized topics and projects.

We want to answer the question: How good is my solution (e.g. algorithm, architecture, system) to a computer engineering problem?

* Information Theory: data representation efficiency
** What is information?
** How do we measure information? (see the short sketch after this list)
* Computational Complexity: complexity in time and space
** Complexity of algorithms
** Complexity of objects/data
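
To make "How do we measure information?" concrete, here is a minimal Python sketch (illustrative only, not part of the official course materials) of Shannon's measure: the entropy H(X) = -&Sigma; p(x) log2 p(x) of a source, in bits. The example distributions are made up.

<syntaxhighlight lang="python">
import math

def entropy(probabilities):
    """Shannon entropy in bits: H(X) = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))  # 1.0 -- a fair coin carries exactly 1 bit per toss
print(entropy([0.9, 0.1]))  # ~0.469 -- a biased coin is more predictable, so it carries less
print(entropy([1.0]))       # 0.0 -- a certain outcome carries no information
</syntaxhighlight>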
== General Guidelines for AY 2021-2022 ==
Since we are offering this class remotely, there will be several changes to our normal course delivery:

* There will be no face-to-face lecture classes. All materials will be made available via this site and on our UVLE page.
* Please email the instructors for access to the UVLE page.
* There will be more emphasis on student-centric activities, e.g. analysis, design, and simulations, so you will mostly be "learning by doing". To support this, we will set aside an hour every week for consultations and questions via video-conferencing.
* Grades will be based on the deliverables submitted for the activities. Although we will not be very strict about deadlines, it is a good idea to keep up with the class schedule and avoid cramming later in the semester.


== Instructor Details ==

* Louis Alarcon, Ph.D.
** Email: louis.alarcon@eee.upd.edu.ph
** Consultation time: TBD
* Ryan Antonio
** Email: ryan.albert.antonio@eee.upd.edu.ph
** Consultation time: Mon 9:00 AM - 4:00 PM; Tue-Fri 9:00 AM - 10:00 AM. Consultations beyond 4:00 PM are okay but depend on availability; usually unavailable on weekends.
  
 
== Syllabus ==
{| class="wikitable"
! Module !! Topics !! Outcomes !! Resources !! Activities
|-
 
| style="text-align:center;" | 1
|
'''Introduction to Information Theory'''
* [[Engaging introduction to information theory]]
* [[Probability review for warm up]]
 
 
|
* Understand the basis of information theory.
* Appreciate the breadth and implications of information theory.
* Review the basics of probability.
|
 
 
|
* [[2S2122 Activity 1.1]]: Probability exercises
* [[2S2122 Activity 1.2(fixed)]]: First coding exercise
 
|-
| style="text-align:center;" | 2
|
'''Mathematical Fundamentals of Information Theory'''
* [[Information and Entropy]]
* [[Joint Entropy, Conditional Entropy, and Mutual Information]]
 
 
|
* Understand the basis of information theory.
* Appreciate the breadth and implications of information theory.
* Derive an expression for entropy and its bounds.
* Apply these concepts in theoretical exercises.
 
|
* [https://www.youtube.com/watch?v=v68zYyaEmEA&t=1631s&ab_channel=3Blue1Brown Information theory and Wordle]
 
|
* [[2S2122 Activity 2.1]]: Theoretical Exercises
* [[2S2122 Activity 2.2]]: How many bits is the English language?
 
|-
| style="text-align:center;" | 3
|
'''Channels and Channel Capacity (2 weeks)'''
* [[Channeling Your Inner Capacity]]
* [[Coding teaser: Source Coding]] ([https://uvle.upd.edu.ph/mod/resource/view.php?id=221489 pdf backup])
 
 
|
* Understand the role of mutual information in noisy channels.
* Observe how conditional entropy and mutual information measure noise.
* Learn about Huffman coding (see the sketch after the table).
 
|
|
* [[2S2122 Activity 3.1]]: Theoretical Exercises
* [[2S2122 Activity 3.2]]: Images and Noise
* [[2S2122 Activity 3.3]]: Hidden Message
 
|-
| style="text-align:center;" | 4
|
'''Coding Theory'''
* [https://uvle.upd.edu.ph/mod/resource/view.php?id=225137 Coding Theories]
* [https://uvle.upd.edu.ph/mod/resource/view.php?id=225138 Error Correcting Codes]
 
 
|
* Learn the basic definitions of coding theory.
* Learn about the Kraft-McMillan inequality.
* Learn about error-correcting codes (ECC); see the repetition-code sketch after the table.
 
|
|
* [[2S2122 Activity 4.1]]: Theoretical Exercise
* [[2S2122 Activity 4.2]]: Repetition Codes
 
|-
| style="text-align:center;" | 5
|
'''Hyperdimensional Computing'''
* [https://uvle.upd.edu.ph/mod/resource/view.php?id=229252 Data Compression]
* [https://uvle.upd.edu.ph/mod/resource/view.php?id=229253 Hyperdimensional Computing]
 
|
* Learn more about data compression.
* Learn how hyperdimensional computing "compresses" a data set.
 
|
|
* [https://uvle.upd.edu.ph/mod/url/view.php?id=229254 2S2122 Activity 5.1]: HDC Model
 
|-
| style="text-align:center;" | 6
|
'''Turing Machines (2 weeks)'''
* What are Turing machines? (see the sketch after the table)
* Decidability
* Reducibility
 
|
* Understand fundamental mathematical properties of computer hardware and software.
* Determine what can and cannot be computed.
 
|
* Part 2 Lecture 1 ([https://drive.google.com/file/d/17C_3jFE8aBwPPM_krOlwZuK6IPMGKtIc/view?usp=sharing mp4])
* Part 2 Lecture 2 ([https://drive.google.com/file/d/1FXmhFkSVRjIEY-ltYmErcZFmP4AijEGw/view?usp=sharing mp4])
* Part 2 Lecture 3 ([https://drive.google.com/file/d/1HRhSYtwdwmC8yhYA1aqTbTl8VrKOlGcL/view?usp=sharing mp4])
 
|
* [[2S2122 Activity 6.1]]: Theoretical Exercises
* [[2S2122 Activity 6.2]]: The Turing Machine
 
|-
| style="text-align:center;" | 7
|
'''Kolmogorov Theory of Complexity (2 weeks)'''
* Time complexity
* Space complexity
* Intractability
 
|
* Investigate how limits on time, memory, and other computing resources determine which computational problems can be solved.
 
|
* Part 2 Lecture 4 ([https://drive.google.com/file/d/1FMoAHGYEfqiny2IXvIJqfJD8pUnsZjfo/view?usp=sharing mp4])
* Part 2 Lecture 5 ([https://drive.google.com/file/d/1Rr7ZALq0ZGZlM03U93Yrz2dmWpe8o3tL/view?usp=sharing mp4])
* Part 2 Lecture 6 ([https://drive.google.com/file/d/1j_F4S3NpxB7m_1WCsoHAHzwVcoPtMjnA/view?usp=sharing mp4])
 
|
* [[2S2122 Activity 6.1]]: Theoretical Exercises
* [[2S2122 Activity 6.2]]: Analyzing sorting algorithms
 
 
|}
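The Module 2-4 outcomes above mention Huffman coding and the Kraft-McMillan inequality. The following Python sketch is a rough, unofficial illustration (the symbol frequencies are invented): it builds a Huffman code with a min-heap and checks that the codeword lengths satisfy the Kraft-McMillan inequality &Sigma; 2^(-len(w)) &le; 1.

<syntaxhighlight lang="python">
import heapq
import itertools

def huffman_code(freqs):
    """Build a prefix-free (Huffman) code for a {symbol: frequency} table."""
    tiebreak = itertools.count()  # unique counter so heapq never compares the dicts
    heap = [(f, next(tiebreak), {sym: ""}) for sym, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)  # two least-frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}          # 0-branch
        merged.update({s: "1" + w for s, w in c2.items()})    # 1-branch
        heapq.heappush(heap, (f1 + f2, next(tiebreak), merged))
    return heap[0][2]

freqs = {"a": 0.45, "b": 0.25, "c": 0.15, "d": 0.10, "e": 0.05}  # made-up source
code = huffman_code(freqs)
print(code)  # e.g. {'a': '0', 'b': '10', 'c': '110', ...}
print(sum(2 ** -len(w) for w in code.values()))         # Kraft sum: 1.0 here, always <= 1
print(sum(f * len(code[s]) for s, f in freqs.items()))  # 2.0 bits/symbol vs. entropy ~1.98
</syntaxhighlight>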
 
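Activity 4.2 works with repetition codes over a noisy channel (Modules 3-4). As an unofficial sketch with made-up parameters, the following encodes each bit three times, sends the result through a binary symmetric channel with crossover probability p = 0.1, and decodes by majority vote.

<syntaxhighlight lang="python">
import random

def encode(bits, n=3):
    """Repetition code: transmit each bit n times."""
    return [b for b in bits for _ in range(n)]

def bsc(bits, p):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(received, n=3):
    """Majority vote over each block of n received bits."""
    return [int(sum(received[i:i + n]) > n // 2) for i in range(0, len(received), n)]

random.seed(0)  # reproducible run
message = [random.randint(0, 1) for _ in range(10000)]
decoded = decode(bsc(encode(message), p=0.1))
errors = sum(m != d for m, d in zip(message, decoded))
# Raw bit-error rate is p = 0.1; after majority voting it drops to
# about 3p^2(1-p) + p^3 = 0.028.
print(errors / len(message))
</syntaxhighlight>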
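Module 6 asks "What are Turing machines?". Here is a tiny, unofficial simulator sketch; the transition-table format and the example machine (which flips the bits of its input) are invented for illustration.

<syntaxhighlight lang="python">
def run_tm(transitions, tape_input, state="q0", blank="_"):
    """Simulate a single-tape Turing machine.

    transitions maps (state, symbol) -> (next_state, write_symbol, move),
    with move in {-1, 0, +1}. Halts in state 'halt' or on a missing rule.
    """
    tape = dict(enumerate(tape_input))  # sparse tape: position -> symbol
    head = 0
    while state != "halt":
        symbol = tape.get(head, blank)
        if (state, symbol) not in transitions:
            break  # no applicable rule: halt
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape))

# Example machine: flip every bit, halt on the first blank.
flipper = {
    ("q0", "0"): ("q0", "1", +1),
    ("q0", "1"): ("q0", "0", +1),
    ("q0", "_"): ("halt", "_", 0),
}
print(run_tm(flipper, "10110"))  # -> 01001_
</syntaxhighlight>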
== References ==

* Stone, J.V., ''Information Theory: A Tutorial Introduction'', Sebtel Press, 2015.
* Sipser, M., ''Introduction to the Theory of Computation, 3rd ed.'', Cengage Learning, 2013.
* Moore, C., Mertens, S., ''The Nature of Computation'', Oxford University Press, 2011.
* Applebaum, D., ''Probability and Information: An Integrated Approach'', Cambridge University Press, 2008.
* Yeung, R., ''Information Theory and Network Coding'', Springer, 2008.
* Cover, T.M., Thomas, J.A., ''Elements of Information Theory, 2nd ed.'', Wiley-Interscience, 2006.
* Hankerson, D.R., Harris, G.A., Johnson, P.D., ''Introduction to Information Theory and Data Compression'', CRC Press, 2003.
* MacKay, D., ''Information Theory, Inference, and Learning Algorithms'', Cambridge University Press, 2003.
* Shannon, C.E., Weaver, W., ''The Mathematical Theory of Communication'', University of Illinois Press, 1949.
  
 
== Additional Reading Materials ==

* Gray, R.M., ''Entropy and Information Theory, 1st ed. (corrected)'', Springer-Verlag New York, 2013.
* Arora, S., Barak, B., ''Computational Complexity: A Modern Approach, 1st ed.'', Cambridge University Press, 2009.
* Kleinberg, J., Papadimitriou, C., "Computability and Complexity", in ''Computer Science: Reflections on the Field, Reflections from the Field'', National Academies Press, 2004.
* Jones, N.D., ''Computability and Complexity: From a Programming Perspective'', The MIT Press, 1997.