Academic Article

Relative entropy and exponential deviation bounds for general Markov chains
Document Type
Conference
Source
Proceedings of the International Symposium on Information Theory (ISIT 2005), pp. 1563–1567, 2005
Subject
General Topics for Engineers
Communication, Networking and Broadcast Technologies
Signal Processing and Analysis
Entropy
Random variables
Computational modeling
Distributed computing
Mathematics
Computer science
Laboratories
Steady-state
Information analysis
Language
English
ISSN
2157-8095
2157-8117
Abstract
We develop explicit, general bounds for the probability that the normalized partial sums of a function of a Markov chain on a general alphabet exceed the steady-state mean of that function by a given amount. Our bounds combine simple information-theoretic ideas with techniques from optimization and some fairly elementary tools from analysis. In one direction, we obtain a general bound for the important class of Doeblin chains; this bound is optimal, in the sense that in the special case of independent and identically distributed random variables it essentially reduces to the classical Hoeffding bound. In another direction, motivated by important problems in simulation, we develop a series of bounds in a form which is particularly suited to these problems, and which apply to the more general class of "geometrically ergodic" Markov chains.
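As a point of reference for the i.i.d. special case mentioned in the abstract, the following sketch (not from the paper; function names and parameters are illustrative) computes the classical Hoeffding bound exp(-2nε²) for [0, 1]-valued i.i.d. variables and checks it against a Monte Carlo estimate of the deviation probability:

```python
import math
import random

# Illustrative sketch: for i.i.d. random variables taking values in [0, 1],
# the classical Hoeffding inequality gives
#     P( S_n / n - mean >= eps ) <= exp(-2 * n * eps**2).
# The paper's bound for Doeblin chains essentially reduces to this in the
# i.i.d. case; here we only illustrate the i.i.d. bound itself.

def hoeffding_bound(n, eps):
    """Hoeffding upper bound on P(S_n/n - mean >= eps) for [0, 1]-valued i.i.d. draws."""
    return math.exp(-2.0 * n * eps ** 2)

def empirical_exceedance(n, eps, trials=20000, seed=0):
    """Monte Carlo estimate of P(S_n/n - 1/2 >= eps) for fair Bernoulli variables."""
    rng = random.Random(seed)
    count = 0
    for _ in range(trials):
        s = sum(1 for _ in range(n) if rng.random() < 0.5)
        if s / n - 0.5 >= eps:
            count += 1
    return count / trials

if __name__ == "__main__":
    n, eps = 200, 0.1
    emp = empirical_exceedance(n, eps)
    bound = hoeffding_bound(n, eps)
    # The empirical deviation frequency should sit below the Hoeffding bound.
    print(f"empirical: {emp:.4f}  Hoeffding bound: {bound:.4f}")
```

For n = 200 and ε = 0.1 the bound evaluates to exp(-4) ≈ 0.018, while the true probability of a fair-coin sample mean exceeding 1/2 by 0.1 is considerably smaller, reflecting the looseness typical of such exponential bounds at moderate n.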