ECE 6520 Information Theory (Spring 2021)

Introduction to fundamental information-theoretic quantities, their relations, and basic inequalities

Lecture 1 (01/19/2021) (Syllabus, Lecture Notes, Review Slides)

  1. Syllabus

  2. Introduction

  3. What is information?

  4. How do we measure information?

  • Lecture video: here.

  • Reference: Chapter 1, Chapter 2.1

Lecture 2 (01/21/2021) (Lecture Notes, Review Slides)

  1. Examples of entropy, self-information, and properties of entropy

  2. Joint self-information and joint entropy

  3. Conditional self-information, conditional entropy, and the chain rule

  4. Mutual information (see the numerical sketch below)

  • Lecture video: here.

  • Reference: Chapters 2.1-2.5
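
  • Illustration: a minimal Python sketch of this lecture's quantities on a small joint pmf (the distribution is an illustrative choice, not from the lecture):

    import math

    def H(probs):
        # Shannon entropy in bits; terms with p = 0 are dropped (0 log 0 = 0)
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Illustrative joint pmf p(x, y) on {0,1} x {0,1}
    p_xy = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}
    p_x = [sum(v for (x, _), v in p_xy.items() if x == xv) for xv in (0, 1)]
    p_y = [sum(v for (_, y), v in p_xy.items() if y == yv) for yv in (0, 1)]

    H_X, H_Y, H_XY = H(p_x), H(p_y), H(p_xy.values())
    H_Y_given_X = H_XY - H_X   # chain rule: H(X,Y) = H(X) + H(Y|X)
    I_XY = H_X + H_Y - H_XY    # mutual information
    print(H_X, H_Y, H_XY, H_Y_given_X, I_XY)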

Lecture 3 (01/26/2021) (Lecture Notes, Review Slides)

  1. Relative entropy

  2. More on the chain rule and mutual information

  3. Four important inequalities

    1. Jensen's inequality

  • Lecture video: here.

  • Reference: Chapters 2.3-2.6

Lecture 4 (01/28/2021) (Lecture Notes, Review Slides)

  1. Jensen's inequality

  2. Log sum inequality (both inequalities are restated below)

  • Lecture video: here.

  • Reference: Chapters 2.6-2.7

  • Homework 1:

    • Chapter 2: 2.1, 2.2, 2.3, 2.4, 2.5, 2.6, 2.7, 2.9, 2.12, 2.14, 2.15, 2.16, 2.19, 2.20, 2.25, 2.29
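
  • For reference, the two inequalities in their standard forms (with f convex and a_i, b_i >= 0):

    \mathbb{E}[f(X)] \ \ge\ f\big(\mathbb{E}[X]\big) \qquad \text{(Jensen)}

    \sum_{i} a_i \log \frac{a_i}{b_i} \ \ge\ \Big(\sum_{i} a_i\Big) \log \frac{\sum_i a_i}{\sum_i b_i} \qquad \text{(log sum)}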

Lecture 5 (02/02/2021) (Lecture Notes, Review Slides)

  1. Data Processing Inequality

  2. Fano's inequality (stated below, together with the data processing inequality)

  3. Information Source and Asymptotic Equipartition Property (AEP)

  • Lecture video: here.

  • Reference: Chapters 2.8, 2.10, 3.1
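
  • For reference, the standard statements, with P_e the error probability of any estimator of X from Y:

    X \to Y \to Z \ \Longrightarrow\ I(X;Y) \ \ge\ I(X;Z) \qquad \text{(data processing)}

    H(X \mid Y) \ \le\ H(P_e) + P_e \log\big(|\mathcal{X}| - 1\big), \quad P_e = \Pr\{\hat{X}(Y) \ne X\} \qquad \text{(Fano)}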

Typical Sequences and Typical Sets

Lecture 6 (02/04/2021) (Lecture Notes, Review Slides)

  1. Asymptotic Equipartition Property (AEP) (see the numerical sketch below)

  2. Data Compression (Source Coding)

  • Lecture video: here.

  • Reference: Chapters 3.1, 5.3
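
  • Illustration: a minimal Monte Carlo sketch of the AEP for an i.i.d. Bernoulli(p) source, where -(1/n) log2 p(X^n) concentrates around H(p); the parameters are illustrative:

    import math, random

    def H_bern(p):
        # binary entropy in bits
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    p, n = 0.3, 10000
    random.seed(0)
    for _ in range(5):
        ones = sum(random.random() < p for _ in range(n))
        # log-probability of the observed sequence under the true model
        log_p = ones * math.log2(p) + (n - ones) * math.log2(1 - p)
        print(-log_p / n)   # each value should be close to H(p)
    print(H_bern(p))        # H(0.3) ~ 0.8813 bits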

Entropy Rate

Lecture 7 (02/09/2021) (Lecture Notes, Review Slides)

  1. Sources with memory

    1. Markov Chains

    2. Entropy Rate (see the sketch below)

  • Lecture video: here.

  • Reference: Chapters 4.1, 4.2
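
  • Illustration: a minimal Python sketch of the entropy rate of a stationary two-state Markov chain, H = sum_i pi_i H(P_i); the transition matrix is an illustrative choice:

    import math

    def H_row(row):
        return -sum(p * math.log2(p) for p in row if p > 0)

    # Illustrative transition matrix: P[i][j] = Pr(next = j | current = i)
    P = [[0.9, 0.1],
         [0.4, 0.6]]

    # Stationary distribution of a two-state chain with a = P[0][1], b = P[1][0]
    a, b = P[0][1], P[1][0]
    pi = [b / (a + b), a / (a + b)]

    rate = sum(pi_i * H_row(row) for pi_i, row in zip(pi, P))
    print(pi, rate)   # entropy rate in bits per symbol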

Data Compression

Lecture 8 (02/11/2021) (Lecture Notes, Review Slides)

  1. Sources with memory

    1. Hidden Markov Processes

    2. AEP for Stationary Ergodic Processes

  2. Lossless source coding

    1. Introduction

    2. Fixed-to-variable-length codes

    3. Uniquely decodable codes and prefix codes

  • Lecture video: here.

  • Reference: Chapters 4.2, 5.1

Lecture 9 (02/16/2021) (Lecture Notes (Homework 1))

  1. Homework 1

  • Lecture video: here.

Lecture 10 (02/18/2021) (Lecture Notes, Review Slides)

  1. Uniquely decodable codes and prefix codes

  2. Kraft inequality (see the check below)

  3. Optimal codes

  4. Bounds on the optimal code length

  • Lecture video: here.

  • Reference: Chapters 5.1-5.5
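
  • Illustration: a one-function check of the Kraft inequality for binary codeword lengths (a prefix code with lengths l_i exists iff sum_i 2^(-l_i) <= 1); the length sets are illustrative:

    def kraft_sum(lengths, D=2):
        # sum of D^(-l) over the codeword lengths, for a D-ary code alphabet
        return sum(D ** (-l) for l in lengths)

    print(kraft_sum([1, 2, 3, 3]))   # 1.0  -> a binary prefix code exists
    print(kraft_sum([1, 1, 2]))      # 1.25 -> no prefix code with these lengths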

Lecture 11 (02/23/2021) (Lecture Notes, Review Slides)

  1. Shannon Codes

  2. Huffman Codes (a code sketch follows this lecture's homework list)

  3. Optimality of Huffman Codes

  • Lecture video: here.

  • Reference: Chapters 5.5-5.8

  • Homework 2:

    • Chapter 3: 3.2, 3.9

    • Chapter 4: 4.1, 4.3, 4.6, 4.7, 4.9, 4.11, 4.33

    • Chapter 5: 5.1, 5.4, 5.6, 5.8, 5.12, 5.16, 5.24, 5.25, 5.30, 5.39
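
  • Illustration: a minimal binary Huffman coder (the greedy merge of the two least-probable nodes, here via a heap); the pmf is an illustrative choice:

    import heapq
    from itertools import count

    def huffman(pmf):
        tie = count()  # tie-breaker so the heap never compares code tables
        heap = [(p, next(tie), {sym: ""}) for sym, p in pmf.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)   # two least-probable subtrees
            p2, _, c2 = heapq.heappop(heap)
            # prepend 0 to one subtree's codewords, 1 to the other's
            merged = {s: "0" + w for s, w in c1.items()}
            merged.update({s: "1" + w for s, w in c2.items()})
            heapq.heappush(heap, (p1 + p2, next(tie), merged))
        return heap[0][2]

    pmf = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}
    code = huffman(pmf)
    avg = sum(pmf[s] * len(w) for s, w in code.items())
    print(code, avg)   # expected length 1.9 bits; H(p) ~ 1.846 bits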

Lecture 12 (02/25/2021) (Lecture Notes, Review Slides)

  1. Optimality of Huffman Codes

  2. Shannon-Fano-Elias Codes

  3. Revisiting Shannon Codes

  • Lecture video: here.

  • Reference: Chapters 5.8-5.9

Lecture 13 (03/02/2021) (Lecture Notes, Review Slides)

  1. Competitive Optimality of Shannon Codes

  2. Universal Compression of Binary Sequences

  3. Lempel-Ziv (LZ78) Universal Compression (see the parsing sketch below)

  4. Homework 2

  • Lecture video: here.

  • Reference: Chapters 5.10, 13.2, 13.4, 13.5
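
  • Illustration: a minimal sketch of LZ78 parsing, which splits the input into phrases (each a previously seen phrase plus one new symbol) and emits (index, symbol) pairs; the input string is illustrative:

    def lz78_parse(s):
        dictionary = {"": 0}   # phrase -> index; 0 is the empty phrase
        output, phrase = [], ""
        for ch in s:
            if phrase + ch in dictionary:
                phrase += ch   # keep extending the current match
            else:
                output.append((dictionary[phrase], ch))
                dictionary[phrase + ch] = len(dictionary)
                phrase = ""
        if phrase:             # flush a trailing, already-known phrase
            output.append((dictionary[phrase[:-1]], phrase[-1]))
        return output

    print(lz78_parse("AABABBBABAABABBBABBABB"))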

Lecture 14 (03/04/2021) (Lecture Notes)

  1. Homework 2

  • Lecture video: here.

Channel Capacity

Lecture 15 (03/09/2021) (Lecture Notes, Review Slides)

  1. Channel Modeling

  2. Channel Capacity

  3. Examples (BSC and BEC capacities are sketched below)

  • Lecture video: here.

  • Reference: Chapters 7.1, 7.4, 7.5
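
  • Illustration: the closed-form capacities of two classic examples, the binary symmetric channel, C = 1 - H(p), and the binary erasure channel, C = 1 - eps; the crossover and erasure values are illustrative:

    import math

    def h2(p):
        # binary entropy function in bits
        return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    for p in (0.0, 0.11, 0.5):
        print("BSC", p, 1 - h2(p))   # bits per channel use
    for eps in (0.0, 0.3, 1.0):
        print("BEC", eps, 1 - eps)   # bits per channel use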

Lecture 16 (03/11/2021) (Lecture Notes, Review Slides)

  1. Examples of Channel Capacity

  2. An Optimization Theorem

  3. Jointly Typical Sequences (Joint AEP)

  • Lecture video: here.

  • Reference: Chapters 7.2, 7.3, 7.6

Lecture 17 (03/16/2021) (Lecture Notes, Review Slides)

  1. Proof of Channel Coding Theorem

  2. Zero-error Codes

  • Lecture video: here.

  • Reference: Chapters 7.7, 7.8, 7.9

  • Homework 3:

    • Chapter 7: 7.1, 7.2, 7.3, 7.4, 7.5, 7.7, 7.8, 7.9, 7.11, 7.12, 7.20, 7.35

Lecture 18 (03/18/2021) (Lecture Notes, Review Slides)

  1. Feedback Capacity

  2. Source-Channel Separation Theorem

  • Lecture video: here.

  • Reference: Chapters 7.12, 7.13

Lecture 19 (03/23/2021) (Lecture Notes)

  1. Homework 3

  • Lecture video: here.

Lecture 20 (03/25/2021) (Lecture Notes, Review Slides)

  1. Differential Entropy

  2. Relative Entropy of Continuous Random Variables

  3. Joint and Conditional Differential Entropy

  4. AEP for Continuous Random Variables

  5. Properties

  • Lecture video: here.

  • Reference: Chapters 8.1 - 8.6

Lecture 21 (03/30/2021) (Lecture Notes, Review Slides)

  1. Additive Noise

  2. Capacity of the Additive White Gaussian Noise (AWGN) Channel (formula restated below)

  3. Capacity of Other Channels

  • Reference: Chapters 9.1 - 9.5

  • Homework 4:

    • Chapter 8: 8.1, 8.2, 8.4, 8.5, 8.8, 8.9, 8.10

    • Chapter 9: 9.1, 9.2, 9.3, 9.5, 9.6, 9.7, 9.8
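
  • For reference, the standard AWGN capacity formula, for average power constraint P and noise variance N:

    C \ =\ \frac{1}{2}\log_2\!\Big(1 + \frac{P}{N}\Big) \ \ \text{bits per channel use}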

Lecture 22 (04/06/2021) (Lecture Notes)

  1. Homework 4

  • Lecture video: here.

Rate-Distortion Theory

Lecture 23 (04/08/2021) (Lecture Notes, Review Slides)

  1. Quantization

  2. Rate-Distortion Problem

  3. Examples

  • Lecture video: here.

  • Reference: Chapters 10.1 - 10.3

Lecture 24 (04/13/2021) (Lecture Notes, Review Slides)

  1. Gaussian Source with Squared-Error Distortion Measure (R(D) restated below)

  2. Proof of Rate-Distortion Theorem (Achievability)

  • Lecture video: here.

  • Reference: Chapters 10.3, 10.5
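
  • For reference, the rate-distortion function of a N(0, sigma^2) source under squared-error distortion:

    R(D) \ =\ \begin{cases} \dfrac{1}{2}\log_2\dfrac{\sigma^2}{D}, & 0 \le D \le \sigma^2,\\[4pt] 0, & D > \sigma^2. \end{cases}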

Lecture 25 (04/15/2021) (Lecture Notes, Review Slides)

  1. Proof of Rate-Distortion Theorem (Converse)

  2. The Analytical Method for Calculating the Rate-Distortion Function

  • Lecture video: here.

  • Reference: Chapters 10.4, 10.7

  • Homework 5:

    • Chapter 10: 10.1, 10.2, 10.3, 10.4, 10.5, 10.6, 10.7, 10.11, 10.12, 10.13, 10.18, 10.19

Lecture 26 (04/20/2021) (Lecture Notes, Review Slides)

  1. Arimoto-Blahut Algorithm (see the sketch below)

  2. Separation Theorem

  3. Parallel Gaussian Sources

  • Lecture video: here.

  • Reference: Chapters 10.3.3, 10.4, 10.8
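
  • Illustration: a minimal Arimoto-Blahut sketch for discrete channel capacity; the channel below is an illustrative BSC(0.1), for which C = 1 - H(0.1) ~ 0.531 bits/use:

    import math

    def arimoto_blahut(W, iters=200):
        # W[x][y] = p(y | x); start from the uniform input distribution
        nx, ny = len(W), len(W[0])
        p = [1.0 / nx] * nx
        for _ in range(iters):
            q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
            # D[x] = KL divergence D(W(.|x) || q) in bits
            D = [sum(W[x][y] * math.log2(W[x][y] / q[y])
                     for y in range(ny) if W[x][y] > 0) for x in range(nx)]
            I = sum(p[x] * D[x] for x in range(nx))   # I(X;Y) at the current p
            total = sum(p[x] * 2 ** D[x] for x in range(nx))
            p = [p[x] * 2 ** D[x] / total for x in range(nx)]
        return I, p

    C, p_opt = arimoto_blahut([[0.9, 0.1], [0.1, 0.9]])
    print(C, p_opt)   # ~0.5310 bits/use, achieved by the uniform input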

Lecture 27 (04/22/2021) (Lecture Notes)

  1. Homework 5

Network Information Theory

Lecture 28 (04/27/2021) (Lecture Notes, Review Slides)

  1. Slepian-Wolf Coding Theorem (rate region restated below)

  2. Conditionally typical sequences

  3. Achievability

  4. Converse

  • Lecture video: here.

  • Reference: Chapter 15.4
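
  • For reference, the Slepian-Wolf achievable rate region for separately encoding correlated sources X and Y:

    R_1 \ \ge\ H(X \mid Y), \qquad R_2 \ \ge\ H(Y \mid X), \qquad R_1 + R_2 \ \ge\ H(X, Y)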