We provide a condition under which a version of Shannon's Entropy Power Inequality holds for dependent variables. We first establish a Fisher information inequality extending the one known in the independent case. The key ingredients are a conditional expectation representation for the score function of a sum, and the de Bruijn identity, which relates entropy and Fisher information.
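For context (this is not part of the published abstract), the classical independent-variable statements that the paper generalizes can be written as follows, where $h$ denotes differential entropy, $J$ Fisher information, and $Z$ a standard Gaussian independent of $X$:

```latex
% Shannon's entropy power inequality (X, Y independent):
e^{2h(X+Y)} \;\ge\; e^{2h(X)} + e^{2h(Y)}

% Stam's Fisher information inequality (X, Y independent):
\frac{1}{J(X+Y)} \;\ge\; \frac{1}{J(X)} + \frac{1}{J(Y)}

% de Bruijn identity:
\frac{d}{dt}\, h\!\left(X + \sqrt{t}\,Z\right) \;=\; \frac{1}{2}\, J\!\left(X + \sqrt{t}\,Z\right)
```

The paper's contribution, per the abstract, is a conditional analogue of these relations valid under a dependence condition.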
Translated title of the contribution: A Conditional Entropy Power Inequality for Dependent Variables
Pages (from-to): 1581-1583
Number of pages: 3
Journal: IEEE Transactions on Information Theory
Publication status: Published - Aug 2004