
MIT Subject Listing & Schedule
Fall 2025 Search Results

6.7480 Information Theory: From Coding to Learning
______

Graduate (Fall)
Not offered academic year 2026-2027
Prereq: 6.3700, 6.3800, or 18.05
Units: 3-0-9
Credit cannot also be received for 6.7470
Lecture: MW11-12.30 (4-237)
______
Introduces the fundamentals of information theory and its applications to contemporary problems in statistics, machine learning, and computer science. Provides a thorough study of information measures, including Fisher information, f-divergences, their convex duality, and variational characterizations. Covers the information-theoretic treatment of inference, hypothesis testing and large deviations, universal compression, channel coding, lossy compression, and strong data-processing inequalities. These methods are applied to PAC-Bayes bounds, GANs, and regret inequalities in machine learning; to parametric and non-parametric estimation in statistics; and to communication complexity and computation with noisy gates in computer science. A fast-paced journey through a recent textbook of the same title. For a communication-focused version, consider 6.7470.
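As an illustrative sketch (not part of the catalog entry), the "variational characterizations" mentioned above include, for example, the Donsker-Varadhan representation of KL divergence, the f-divergence obtained with f(x) = x log x:

\[
D_f(P \,\|\, Q) = \mathbb{E}_Q\!\left[ f\!\left( \tfrac{dP}{dQ} \right) \right], \qquad f \text{ convex},\ f(1) = 0,
\]
\[
D(P \,\|\, Q) = \sup_{g} \Big\{ \mathbb{E}_P[g(X)] - \log \mathbb{E}_Q\big[ e^{g(X)} \big] \Big\}.
\]

Dual forms of this kind are what connect the information measures in the description to applications such as PAC-Bayes bounds and GAN objectives.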
Y. Polyanskiy