Abstract
We provide a condition under which a version of Shannon's Entropy Power Inequality holds for dependent random variables. We first establish a Fisher information inequality extending the one known in the independent case. The key ingredients are a conditional expectation representation for the score function of a sum, and the de Bruijn identity, which relates entropy and Fisher information.
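For reference, the classical independent-case statements that the abstract builds on can be written as follows; these are the standard forms (the paper itself treats conditional versions for dependent variables, which are not reproduced here):

```latex
% Entropy power of a random variable X with differential entropy h(X):
N(X) = \frac{1}{2\pi e}\, e^{2 h(X)} .

% Shannon's entropy power inequality, for independent X and Y:
N(X + Y) \;\ge\; N(X) + N(Y) .

% de Bruijn's identity, with Z a standard normal independent of X
% and J(\cdot) the Fisher information:
\frac{\partial}{\partial t}\, h\!\left(X + \sqrt{t}\, Z\right)
  = \tfrac{1}{2}\, J\!\left(X + \sqrt{t}\, Z\right) .
```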
| Translated title of the contribution | A Conditional Entropy Power Inequality for Dependent Variables |
|---|---|
| Original language | English |
| Pages (from-to) | 1581 - 1583 |
| Number of pages | 3 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 50 (8) |
| DOIs | |
| Publication status | Published - Aug 2004 |
Bibliographical note
Publisher: IEEE - Institute of Electrical and Electronics Engineers