BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Communication in the presence of sparsity - Yiannis Kontoyiannis (
 Cambridge)
DTSTART:20190529T130000Z
DTEND:20190529T140000Z
UID:TALK125320@talks.cam.ac.uk
CONTACT:HoD Secretary\, DPMMS
DESCRIPTION:In his seminal 1948 work\, Claude Shannon determined the funda
 mental limits of\nthe best achievable performance in point-to-point commun
 ication. His analysis\ndepended on two assumptions: that data are communic
 ated in asymptotically\nlarge blocks\, and that the information sources an
 d the noise channels involved\nsatisfy certain statistical regularity prop
 erties. We revisit the central\ninformation-theoretic problems of efficien
 t data compression and channel\ntransmission without these assumptions. Fi
 rst\, we describe a general\ndevelopment of non-asymptotic coding theorems
 \, providing useful expressions\nfor the fundamental limits of performance
  on finite data. These results may\nplay an important role in applications
  where performance or delay guarantees\nare critical\, and they offer valu
 able operational guidelines for the design of\npractical compression algor
 ithms and error-correcting codes. Second\, motivated\nby modern applicatio
 ns involving sparse and often “big” data (e.g.\, in\nneuroscience\, we
 b and social network analysis\, medical imaging\, and optical\nmedia recor
 ding)\, we state and prove a series of theorems that determine the\nbest a
 chievable rates of compression and transmission\, when the information or\
 nthe noise is appropriately “sparse”. Interestingly\, in these cases\
 , the\nclassical results in terms of the entropy and channel capacity are 
 shown to be\ninaccurate\, even as first-order approximations.\n\nNo backgr
 ound in information theory will be assumed.\n
LOCATION:Centre for Mathematical Sciences\, meeting room MR14
END:VEVENT
END:VCALENDAR
