Introduction to Information Theory

ENG EC 517

Topics include:

- Discrete memoryless stationary sources and channels
- Information measures on discrete and continuous alphabets and their properties: entropy, conditional entropy, relative entropy, mutual information, differential entropy
- Elementary constrained convex optimization
- Fundamental information inequalities: the data-processing inequality and Fano's inequality
- Block source coding with outage: weak law of large numbers, entropically typical sequences and typical sets, asymptotic equipartition property
- Block channel coding with and without cost constraints: jointly typical sequences, channel capacity, random coding, Shannon's channel coding theorem, introduction to practical linear block codes
- Rate-distortion theory: Shannon's block source coding theorem relative to a fidelity criterion
- Source and channel coding for Gaussian sources and channels and parallel Gaussian sources and channels (water-filling and reverse water-filling)
- Shannon's source-channel separation theorem for point-to-point communication
- Lossless data compression: Kraft's inequality, Shannon's lossless source coding theorem, variable-length source codes including Huffman, Shannon-Fano-Elias, and arithmetic codes
- Applications
- Mini course project
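As a small illustration of the information measures listed above (a sketch for orientation only, not part of the official course materials), the following Python snippet computes entropy, conditional entropy, and mutual information for a hypothetical joint pmf on a pair of binary random variables; the distribution values are made up for the example.

```python
# Illustrative sketch: basic information measures for a small joint pmf.
# The joint distribution below is hypothetical, chosen only for demonstration.
from math import log2

# Hypothetical joint pmf p(x, y) on {0,1} x {0,1}; rows index x, columns index y.
p_xy = [[0.30, 0.20],
        [0.10, 0.40]]

p_x = [sum(row) for row in p_xy]                              # marginal p(x)
p_y = [sum(p_xy[x][y] for x in range(2)) for y in range(2)]   # marginal p(y)

def H(dist):
    """Entropy in bits of a discrete pmf, with the convention 0 log 0 = 0."""
    return -sum(p * log2(p) for p in dist if p > 0)

H_X = H(p_x)
H_XY = H([p_xy[x][y] for x in range(2) for y in range(2)])    # joint entropy H(X,Y)
H_X_given_Y = H_XY - H(p_y)        # chain rule: H(X|Y) = H(X,Y) - H(Y)
I_XY = H_X - H_X_given_Y           # mutual information: I(X;Y) = H(X) - H(X|Y)

print(f"H(X)   = {H_X:.4f} bits")
print(f"H(X|Y) = {H_X_given_Y:.4f} bits")
print(f"I(X;Y) = {I_XY:.4f} bits")
```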

FALL 2023 Schedule

Section  Instructor  Location  Schedule             Notes
A1       Nazer       PSY B35   MW 12:20 pm-2:05 pm

Note that this information may change at any time. Please visit the Student Link for the most up-to-date course information.