Abstract
We adapt arguments concerning information-theoretic convergence in the Central Limit Theorem to the case of dependent random variables under Rosenblatt mixing conditions.
The key is to work with random variables perturbed by the addition of a normal random variable, which gives good control of the joint density and of the mixing coefficient.
We strengthen results of Takano and of Carlen and Soffer to provide entropy-theoretic rather than weak convergence.
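To make the flavour of the result concrete, the following is a minimal LaTeX sketch of the kind of statement involved. All notation (the normalised sums, the smoothing parameter tau, the mixing coefficients alpha(k), the relative entropy D) is illustrative and is not taken from the paper; the precise hypotheses and normalisations are those given there.

```latex
% A minimal sketch of an entropy-theoretic CLT under Rosenblatt (strong) mixing.
% Notation is illustrative only and does not reproduce the paper's statement.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

Let $(X_i)_{i \ge 1}$ be a stationary, zero-mean sequence with Rosenblatt
(strong) mixing coefficients $\alpha(k) \to 0$, and let
$S_n = n^{-1/2} \sum_{i=1}^{n} X_i$, normalised so that
$\operatorname{Var}(S_n) \to 1$. The perturbed sums
\[
  S_n^{(\tau)} = S_n + \sqrt{\tau}\, Z,
  \qquad Z \sim N(0,1) \text{ independent of } (X_i),
\]
have smooth densities, which is what gives control of the joint density and
of the mixing coefficient. Writing $\phi$ for the standard normal density and
$D(f \,\|\, \phi) = \int f \log (f/\phi)$ for relative entropy, an
entropy-theoretic CLT asserts that the density $f_n$ of the (perturbed)
normalised sum satisfies
\[
  D(f_n \,\|\, \phi) \longrightarrow 0 \quad \text{as } n \to \infty .
\]
By Pinsker's inequality, $\| f_n - \phi \|_1^2 \le 2\, D(f_n \,\|\, \phi)$,
this implies convergence in total variation and hence weak convergence, so an
entropy-theoretic CLT is strictly stronger than the classical weak CLT.

\end{document}
```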
| Translated title of the contribution | Information inequalities and a dependent Central Limit Theorem |
| --- | --- |
| Original language | English |
| Pages (from-to) | 627 - 645 |
| Number of pages | 29 |
| Journal | Markov Processes and Related Fields |
| Volume | 7 (4) |
| Publication status | Published - 2001 |