We adapt arguments concerning information-theoretic convergence in the Central Limit Theorem to the case of dependent random variables under Rosenblatt mixing conditions. The key is to work with random variables perturbed by the addition of a normal random variable, giving us good control of the joint density and the mixing coefficient. We strengthen results of Takano and of Carlen and Soffer to provide entropy-theoretic convergence, rather than weak convergence.
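The notions mentioned in the abstract can be sketched as follows; the notation below is illustrative and not taken from the paper itself.

```latex
% Entropy-theoretic convergence and the normal perturbation device:
% an illustrative sketch, with notation assumed rather than quoted.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Let $S_n$ denote the normalised partial sum with density $f_n$, and let
$\phi$ be the density of the normal law with matching mean and variance.
Entropy-theoretic convergence asserts that the relative entropy
\[
  D(f_n \,\|\, \phi) \;=\; \int f_n(x) \log \frac{f_n(x)}{\phi(x)} \, dx
  \;\longrightarrow\; 0 ,
\]
which, via Pinsker's inequality
$\| f_n - \phi \|_{L^1}^2 \le 2\, D(f_n \,\|\, \phi)$,
is strictly stronger than weak convergence.
The perturbation device replaces $S_n$ by $S_n + \sqrt{t}\, Z$, where
$Z \sim N(0,1)$ is independent of $S_n$; the perturbed variable has the
smoothed density $f_n * \phi_t$, which enjoys smoothness and positivity
bounds uniform in $n$.
\end{document}
```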
Title: Information inequalities and a dependent Central Limit Theorem
Journal: Markov Processes and Related Fields
Pages: 627-645
Published: 2001