Understanding the Source Coding Theorem: A Talk on Shannon’s Entropy
- 👤 Speaker: Kwing Hei Li, Churchill College
- 📅 Date & Time: Wednesday 21 October 2020, 19:00 - 19:30
- 📍 Venue: Online, via MS Teams
Abstract
How many bits do we need to encode a sequence of English characters? Can we do better by considering the relative frequency of each character? Is there a theoretical limit to how compactly we can encode data with negligible risk of information loss?
In this talk, we first define Shannon's entropy, which quantifies the predictability of a sequence of random variables. We will also explore the Source Coding Theorem, which provides an operational definition of entropy by establishing a fundamental limit on the compressibility of information. Finally, we will prove this theorem using some surprisingly simple results from probability theory.
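As a small illustration of the abstract's opening question, here is a minimal Python sketch (not from the talk itself; the helper name `shannon_entropy` and the sample string are illustrative) that estimates the entropy H(X) = -Σₓ p(x) log₂ p(x) from the empirical character frequencies of a text:

```python
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Estimate H(X) = -sum_x p(x) * log2(p(x)), in bits per character,
    from the empirical character frequencies of `text`."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A uniform code over 27 symbols (a-z plus space) needs log2(27) ~ 4.75
# bits per character; the empirical entropy of English-like text is
# lower, which is why frequency-aware codes can compress further.
sample = "the source coding theorem bounds lossless compression"
print(f"{shannon_entropy(sample):.3f} bits per character")
```

The Source Coding Theorem makes this gap precise: no lossless code can achieve fewer bits per symbol than the entropy, and codes approaching that rate exist.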
Series: This talk is part of the Churchill CompSci Talks series.