Paper: Convergence of Markov Chains in Information Divergence


Abstract

Information theoretic methods are used to prove convergence in information divergence of reversible Markov chains. Also some ergodic theorems for information divergence are proved.
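As a hypothetical illustration of the abstract's topic (not code from the paper), the sketch below builds a small reversible Markov chain, computes the information divergence D(μPⁿ ‖ π) between the time-n distribution and the stationary distribution π, and checks that it is non-increasing and tends to 0. The specific chain and tolerances are assumptions chosen for the example; the monotonicity follows from the data processing inequality since π is invariant.

```python
import numpy as np

# Hypothetical example chain (not from the paper): a 3-state
# birth-death chain, which satisfies detailed balance, hence reversible.
P = np.array([[0.5,  0.5,  0.0],
              [0.25, 0.5,  0.25],
              [0.0,  0.5,  0.5]])

# Stationary distribution pi: left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()

# Reversibility check (detailed balance): pi_i P_ij == pi_j P_ji.
assert np.allclose(pi[:, None] * P, (pi[:, None] * P).T)

def kl(p, q):
    """Information divergence D(p || q) in nats, with 0 log 0 = 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

mu = np.array([1.0, 0.0, 0.0])   # start concentrated in state 0
divs = []
for _ in range(30):
    divs.append(kl(mu, pi))
    mu = mu @ P

# D(mu P^n || pi) is non-increasing (data processing inequality)
# and converges to 0 for this ergodic chain.
assert all(divs[i + 1] <= divs[i] + 1e-12 for i in range(len(divs) - 1))
assert divs[-1] < 1e-6
```

For this chain π = (1/4, 1/2, 1/4), and the divergence decays geometrically, governed by the chain's second-largest eigenvalue.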


Citation

@Article{Harremoes:2009hl,
  Title =	 {Convergence of Markov Chains in Information Divergence},
  Author =	 {Peter Harremoës and Klaus K. Holst},
  Journal =	 {Journal of Theoretical Probability},
  Year =	 2009,
  Month =	 mar,
  Number =	 1,
  Pages =	 {186--202},
  Volume =	 22,
  Abstract =	 {Information theoretic methods are used to prove
                  convergence in information divergence of reversible
                  Markov chains. Also some ergodic theorems for
                  information divergence are proved.},
  Doi =		 {10.1007/s10959-007-0133-7},
}