A Conditional Entropy Power Inequality for Dependent Variables

Research output: Contribution to journal › Article (Academic Journal) › peer-review

12 Citations (Scopus)

Abstract

We provide a condition under which a version of Shannon's Entropy Power Inequality holds for dependent random variables. We first establish a Fisher information inequality extending the one known in the independent case. The key ingredients are a conditional expectation representation for the score function of a sum, and the de Bruijn identity, which relates entropy and Fisher information.
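For context, the classical statements the abstract builds on can be sketched as follows (a standard formulation for scalar random variables, not taken from the paper itself): the Entropy Power Inequality for independent summands, and the de Bruijn identity connecting differential entropy h to Fisher information J.

```latex
% Classical Entropy Power Inequality (X, Y independent, scalar):
%   N(X + Y) \ge N(X) + N(Y),
% where the entropy power is
%   N(X) = \frac{1}{2\pi e}\, e^{2 h(X)},
% and h denotes differential entropy.
%
% de Bruijn identity: perturbing X by an independent standard
% Gaussian Z links entropy growth to Fisher information J:
%   \frac{d}{dt}\, h\!\left(X + \sqrt{t}\, Z\right)
%     = \tfrac{1}{2}\, J\!\left(X + \sqrt{t}\, Z\right).
```

The paper's contribution is a condition under which an EPI of this type survives when X and Y are dependent, obtained by extending the Fisher information inequality and then integrating via de Bruijn's identity.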
Original language: English
Pages (from-to): 1581-1583
Number of pages: 3
Journal: IEEE Transactions on Information Theory
Volume: 50 (8)
Publication status: Published - Aug 2004

Bibliographical note

Publisher: IEEE - Institute of Electrical and Electronics Engineers
