Entropy and the Law of Small Numbers

I Kontoyiannis, P Harremoës, OT Johnson

Research output: Contribution to journal › Article (Academic Journal) › peer-review

61 Citations (Scopus)

Abstract

Two new information-theoretic methods are introduced for establishing Poisson approximation inequalities. First, using only elementary information-theoretic techniques, it is shown that, when $S_n=\sum_{i=1}^n X_i$ is the sum of the (possibly dependent) binary random variables $X_1,X_2,\ldots,X_n$, with $E(X_i)=p_i$ and $E(S_n)=\lambda$, then $$ D(P_{S_n}\|\mathrm{Po}(\lambda)) \leq \sum_{i=1}^n p_i^2 \;+\; \Big[\sum_{i=1}^n H(X_i) \,-\, H(X_1,X_2,\ldots,X_n)\Big],$$ where $D(P_{S_n}\|\mathrm{Po}(\lambda))$ is the relative entropy between the distribution of $S_n$ and the Poisson distribution with mean $\lambda$. The first term in this bound measures the individual smallness of the $X_i$ and the second term measures their dependence. A general method is outlined for obtaining corresponding bounds when approximating the distribution of a sum of general discrete random variables by an infinitely divisible distribution. Second, in the particular case when the $X_i$ are independent, the following sharper bound is established: $$ D(P_{S_n}\|\mathrm{Po}(\lambda)) \leq \frac{1}{\lambda} \sum_{i=1}^n \frac{p_i^3}{1-p_i},$$ and it is also generalized to the case when the $X_i$ are general integer-valued random variables. Its proof is based on the derivation of a subadditivity property for a new discrete version of the Fisher information, and uses a recent logarithmic Sobolev inequality for the Poisson distribution.
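As a numerical illustration (not part of the paper itself), the minimal Python sketch below computes $D(P_{S_n}\|\mathrm{Po}(\lambda))$ exactly for a sum of independent Bernoulli variables and checks it against both bounds; under independence the entropy term $\sum_i H(X_i) - H(X_1,\ldots,X_n)$ vanishes, so the first bound reduces to $\sum_i p_i^2$. The probabilities in `ps` and all function names are illustrative choices, not from the paper.

```python
import math

def bernoulli_sum_pmf(ps):
    """Exact pmf of S_n = X_1 + ... + X_n for independent Bernoulli(p_i),
    computed by repeated convolution."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1.0 - p)   # X_i = 0
            new[k + 1] += q * p       # X_i = 1
        pmf = new
    return pmf

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def d_to_poisson(ps):
    """Relative entropy D(P_{S_n} || Po(lambda)) in nats, with lambda = sum(ps)."""
    lam = sum(ps)
    return sum(q * math.log(q / poisson_pmf(lam, k))
               for k, q in enumerate(bernoulli_sum_pmf(ps)) if q > 0.0)

ps = [0.05, 0.10, 0.02, 0.08]   # hypothetical success probabilities
lam = sum(ps)
d = d_to_poisson(ps)
bound1 = sum(p ** 2 for p in ps)                    # entropy term is 0 by independence
bound2 = sum(p ** 3 / (1.0 - p) for p in ps) / lam  # sharper bound for independent X_i
print(f"D = {d:.6f}; first bound = {bound1:.6f}; sharper bound = {bound2:.6f}")
```

With these (hypothetical) parameters both inequalities hold comfortably, and the second bound is visibly tighter, as the abstract indicates for the independent case.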
Translated title of the contribution: Entropy and the Law of Small Numbers
Original language: English
Pages (from-to): 466 - 472
Number of pages: 7
Journal: IEEE Transactions on Information Theory
Volume: 51 (2)
Publication status: Published - Feb 2005

Bibliographical note

Publisher: IEEE - Institute of Electrical and Electronics Engineers

