Winter quarter, 2008
Instructor: Carl Bergstrom
Time and place: MW 1:30-3:20, Kincaid 502
Textbook: Cover and Thomas (2006) Elements of Information Theory, second edition. Wiley.
If you are serious about learning information theory, there is no better way to do so than to pick up a copy of this excellent textbook and start reading from the beginning. The book requires an elementary background in probability theory and calculus, but little else.
Readings:
- Lecture 1: None
- Lecture 2: Cover and Thomas, Chapter 1.
  Godfrey-Smith (2007). Information in Biology.
  National Research Council (2008). The Role of Theory in Advancing 21st Century Biology, Chapter 7.
- Lecture 3: Cover and Thomas, Chapter 2.
- Lecture 4: Cover and Thomas, Chapter 3.
- Lecture 5: Cover and Thomas, Chapter 4.
- Lectures 6-7: Cover and Thomas, Chapter 5.
- Lecture 8: Cover and Thomas, Chapter 6.
- Lecture 9: Bergstrom and Lachmann (2005)
- Lecture 10: Cover and Thomas, Chapter 7.
- Lecture 11: Cover and Thomas, Chapter 8.
- Lecture 12: Cover and Thomas, Chapter 9.
Lecture notes:
- Lecture 1: Probability refresher
- Lecture 3: Joint entropy, relative entropy, and mutual information
- Lecture 4: The asymptotic equipartition property
- Lecture 5: The entropy rate for stochastic processes
- Lecture 6: Coding theory
- Lecture 7: Optimal coding for noiseless channels
- Lecture 8: Kelly's horse race
- Lecture 10: Noisy coding
- Lecture 11: Differential entropy
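As a small numerical preview of the Lecture 3 quantities (entropy, joint entropy, and mutual information), here is a minimal Python sketch. The joint distribution and variable names are purely illustrative, not taken from the course materials:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

# An illustrative joint distribution p(x, y) over two binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y), obtained by summing out the other variable.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

H_X = entropy(px.values())      # marginal entropy H(X)
H_Y = entropy(py.values())      # marginal entropy H(Y)
H_XY = entropy(joint.values())  # joint entropy H(X, Y)

# Mutual information via the identity I(X; Y) = H(X) + H(Y) - H(X, Y).
I_XY = H_X + H_Y - H_XY
```

For this distribution the marginals are uniform, so H(X) = H(Y) = 1 bit, and the positive correlation between X and Y makes I(X; Y) strictly positive.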