Tag Archives: information theory

Information theory background I

This is the first of two posts aimed at providing background on information theory. Today's itinerary is as follows: we will start by introducing the basic information quantities, then proceed to prove tensorization statements for them, i.e. that they … Continue reading
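The excerpt is cut off here, but as a hedged sketch of the kind of quantities and "tensorization" statement this likely refers to (notation is mine, not necessarily the post's): entropy and Kullback-Leibler divergence, and the additivity of the latter over product measures.

% Hedged sketch (my notation): two basic information quantities and one
% standard tensorization (additivity over products) statement.
\[
  H(X) \;=\; -\sum_{x} p(x)\log p(x),
  \qquad
  D(P \,\|\, Q) \;=\; \sum_{x} P(x)\log\frac{P(x)}{Q(x)},
\]
\[
  D\Bigl(\textstyle\bigotimes_{i=1}^{n} P_i \,\Big\|\, \bigotimes_{i=1}^{n} Q_i\Bigr)
  \;=\; \sum_{i=1}^{n} D(P_i \,\|\, Q_i).
\]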

Posted in Course on Concentration Inequalities

Chernoff bounding is good enough

There is a way to demonstrate that Chernoff bounding is, in some sense, optimal, and it relies on Sanov's theorem, which controls the empirical distribution (as a random measure) in terms of the Kullback-Leibler divergence from the … Continue reading
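The excerpt is truncated, but as a hedged sketch of the finite-alphabet statement being alluded to (the post's exact formulation may differ): for $X_1,\dots,X_n$ i.i.d. from $Q$ on a finite alphabet $\mathcal X$, with empirical measure $\hat P_n = \frac1n\sum_{i=1}^n \delta_{X_i}$ and a set $A$ of probability measures,

% Hedged sketch: Sanov upper bound via the method of types (finite alphabet).
\[
  \Pr\bigl(\hat P_n \in A\bigr)
  \;\le\; (n+1)^{|\mathcal X|}
  \exp\Bigl(-\,n \inf_{P \in A} D(P \,\|\, Q)\Bigr).
\]

For sets of the form $A = \{P : \mathbb{E}_P f \ge a\}$, the exponent $\inf_{P \in A} D(P\,\|\,Q)$ coincides with the optimized Chernoff exponent $\sup_{\lambda \ge 0}\{\lambda a - \log \mathbb{E}_Q e^{\lambda f(X)}\}$, which is presumably the sense in which Chernoff bounding is good enough.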

Posted in Course on Concentration Inequalities