Monotonicity, thinning, and discrete versions of the Entropy Power Inequality

Research output: Contribution to journal › Article (Academic Journal) › peer-review

30 Citations (Scopus)

Abstract

We consider the entropy of sums of independent discrete random variables, in analogy with Shannon's Entropy Power Inequality, where equality holds for normals. In our case, infinite divisibility suggests that equality should hold for Poisson variables. We show that some natural analogues of the EPI do not in fact hold, but propose an alternative formulation which does always hold. The key to many proofs of Shannon's EPI is the behavior of entropy on scaling of continuous random variables. We believe that Rényi's operation of thinning discrete random variables plays a similar role to scaling, and give a sharp bound on how the entropy of ultra log-concave random variables behaves on thinning. In the spirit of the monotonicity results established by Artstein, Ball, Barthe, and Naor, we prove a stronger version of concavity of entropy, which implies a strengthened form of our discrete EPI.
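To make the abstract's central operation concrete, here is a minimal numerical sketch of Rényi's α-thinning. It is illustrative only and not taken from the paper: the function names thin and empirical_entropy are invented for this example. It uses the standard definition of thinning (each of the X unit "points" is retained independently with probability α, so the thinned variable given X = x is Binomial(x, α)) and the classical fact that thinning a Poisson(λ) variable yields a Poisson(αλ) variable, one sense in which thinning parallels the scaling of continuous random variables in Shannon's EPI.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)

def thin(samples, alpha, rng):
    """Renyi alpha-thinning: given a sample x of a discrete variable,
    keep each of its x 'points' independently with probability alpha,
    i.e. draw Binomial(x, alpha)."""
    return rng.binomial(samples, alpha)

def empirical_entropy(samples):
    """Plug-in estimate of the discrete Shannon entropy, in nats."""
    counts = Counter(samples.tolist())
    n = len(samples)
    probs = np.array([c / n for c in counts.values()])
    return float(-(probs * np.log(probs)).sum())

# Thinning Poisson(lam) by alpha gives exactly Poisson(alpha * lam),
# so the two empirical entropies below should be close.
lam, alpha, n = 5.0, 0.5, 200_000
x = rng.poisson(lam, size=n)
print(empirical_entropy(thin(x, alpha, rng)))            # thinned Poisson(lam)
print(empirical_entropy(rng.poisson(alpha * lam, size=n)))  # Poisson(alpha * lam)
```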
Original language: English
Pages (from-to): 5387-5395
Number of pages: 9
Journal: IEEE Transactions on Information Theory
Volume: 56
Issue number: 11
Publication status: Published - Nov 2010

