CoE 161

From Microlab Classes
Revision as of 10:53, 24 June 2020 by Louis Alarcon
  • Introduction to Information and Complexity (2018 Curriculum)
    • Advanced course on information theory and computational complexity, starting from Shannon's information theory and Turing's theory of computation, leading to the theory of Kolmogorov complexity.
  • Semester Offered: 2nd semester
  • Course Credit: Lecture: 3 units

Prerequisites

  • EEE 111 (Introduction to Programming and Computation)
  • EEE 137 (Probability, Statistics and Random Processes in Electrical and Electronics Engineering)

Course Goal

  • Introduce fundamental tools and frameworks to understand information and complexity in the design of computer systems.

Specific Goals

  • Introduce fundamental tools for determining the minimum amount of computational resources needed to algorithmically solve a problem.
    • Information Theory
    • Computational Complexity Theory

Content

This course covers information theory and computational complexity in a unified way. It develops the subject from first principles, building up from the basic premise of information to Shannon's information theory, and from the basic premise of computation to Turing's theory of computation. The duality between the two theories leads naturally to the theory of Kolmogorov complexity. The technical topics covered include source coding, channel coding, rate-distortion theory, Turing machines, computability, computational complexity, and algorithmic entropy, as well as specialized topics and projects.

Introduction to Information Theory

Information theory studies the transmission, processing, extraction, and utilization of information.

  • Deterministic and probabilistic information
  • Entropy; Joint, marginal, and conditional ensembles & entropies
  • Equalities and inequalities for entropy; Mutual information
  • Law of large numbers, typical sets
  • Entropy vs. probabilistic information: intuition
  • Entropy vs. probabilistic information: formal proof
  • Representation: coding and decoding; Uniquely decodable, instantaneous codes; Kraft's inequality
  • Optimal codeword length, Huffman codes; Formalizing computation: representation
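Two of the topics above lend themselves to a short numerical sketch: the Shannon entropy of a discrete distribution, and Kraft's inequality as a feasibility test for uniquely decodable instantaneous (prefix-free) codes. The function names below are illustrative only and are not part of any prescribed course code.

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def kraft_sum(lengths):
    """Kraft's inequality: a binary prefix-free code with codeword
    lengths l_i exists iff sum_i 2^(-l_i) <= 1."""
    return sum(2.0 ** -l for l in lengths)

# A fair coin carries 1 bit of entropy; a biased coin carries less.
print(entropy([0.5, 0.5]))      # 1.0
print(entropy([0.9, 0.1]))      # about 0.469 bits

# Lengths {1, 2, 3, 3} sum to exactly 1: a complete prefix code exists
# (e.g. 0, 10, 110, 111); lengths {1, 1, 1} violate the inequality.
print(kraft_sum([1, 2, 3, 3]))  # 1.0
print(kraft_sum([1, 1, 1]))     # 1.5 -> no prefix-free code
```

The entropy value is also the lower bound on the expected codeword length of any uniquely decodable code, which is the sense in which Huffman codes are optimal.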

Computing Theory Fundamentals

Computational Complexity Theory is the study of efficient computation and its fundamental limitations.

  • Finite state automata and limitations
  • Turing machines, stack machines; computability
  • Church-Turing thesis: composition and simulations; A universal machine
  • Undecidability: the halting problem (decidable vs acceptable), the tiling problem
  • Uncomputability: the busy beaver problem
  • Data compression and algorithmic complexity
  • Algorithmic entropy
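As an informal illustration of the Turing machine model listed above, the following is a minimal single-tape simulator. The transition-table encoding and the example bit-flipping machine are hypothetical choices for illustration, not a representation prescribed by the course; the step bound stands in for the fact that halting cannot be decided in general.

```python
def run_tm(transitions, tape, state="q0", blank="_", max_steps=1000):
    """Simulate a one-tape Turing machine.

    transitions maps (state, read_symbol) -> (new_state, write_symbol, move),
    with move in {-1, +1}. The machine halts when no transition applies.
    Returns the final tape contents with surrounding blanks stripped.
    """
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        sym = cells.get(head, blank)
        if (state, sym) not in transitions:
            break  # no applicable rule: the machine halts
        state, write, move = transitions[(state, sym)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example machine: flip every bit, moving right, and halt at the first blank.
flip = {
    ("q0", "0"): ("q0", "1", +1),
    ("q0", "1"): ("q0", "0", +1),
}
print(run_tm(flip, "0110"))  # -> 1001
```

A universal machine is this same idea taken one step further: the transition table itself is placed on the tape as data, which is the construction underlying both the halting problem and algorithmic (Kolmogorov) complexity.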

References

  • Cover, T. M. and Thomas, J. A., Elements of Information Theory, 2nd ed., Wiley-Interscience, 2006.
  • Sipser, M., Introduction to the Theory of Computation, 3rd ed., Cengage Learning, 2013.

Additional Reading Materials

  • Arora, S. and Barak, B., Computational Complexity: A Modern Approach, 1st ed., Cambridge University Press, New York, NY, USA, 2009.
  • Jones, N. D., Computability and Complexity: From a Programming Perspective, The MIT Press, Cambridge, Massachusetts, 1997.
  • Kleinberg, J. and Papadimitriou, C., "Computability and Complexity," in Computer Science: Reflections on the Field, Reflections from the Field, National Academies Press, 2004.
  • Gray, R. M., Entropy and Information Theory, 1st ed. (corrected), Springer-Verlag, New York, 2013.