Markov chain mixing time

In probability theory, the mixing time of a Markov chain is the time until the Markov chain is "close" to its steady state distribution. More precisely, a fundamental result about Markov chains is that a finite-state, irreducible, aperiodic chain has a unique stationary distribution π and, regardless of the initial state, the time-t distribution of the chain converges to π as t tends to infinity. Mixing time refers to any of several variant formalizations of the idea: how large must t be until the time-t distribution is approximately π? One variant, total variation distance mixing time, is defined as the smallest t such that the total variation distance of probability measures is small:

t_mix(ε) = min { t ≥ 0 : max_{x ∈ S} ‖ P^t(x, ·) − π ‖_TV ≤ ε },

where S is the state space, P^t(x, ·) is the distribution of the chain at time t when started from state x, and ε > 0 is a fixed tolerance (a common convention is ε = 1/4).
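As a concrete illustration (not part of the original text), the following minimal NumPy sketch computes this quantity by brute force for a small chain: it finds π as the left eigenvector of the transition matrix for eigenvalue 1, then searches for the first t at which the worst-case total variation distance over starting states drops below ε. The function names `mixing_time` and `total_variation` are illustrative, and ε = 1/4 is the conventional default.

import numpy as np

def total_variation(p, q):
    """Total variation distance between two discrete distributions."""
    return 0.5 * np.abs(p - q).sum()

def mixing_time(P, eps=0.25, t_max=10_000):
    """Smallest t with max_x ||P^t(x, .) - pi||_TV <= eps.

    P is a row-stochastic transition matrix of an irreducible,
    aperiodic finite chain, so a unique stationary pi exists.
    """
    n = P.shape[0]
    # Stationary distribution: left eigenvector of P for eigenvalue 1.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1))])
    pi /= pi.sum()
    Pt = np.eye(n)  # P^0 = identity
    for t in range(1, t_max + 1):
        Pt = Pt @ P
        # Worst case over all starting states x, as in the definition.
        if max(total_variation(Pt[x], pi) for x in range(n)) <= eps:
            return t
    raise RuntimeError("chain did not mix within t_max steps")

# Example: lazy random walk on a 3-cycle (the 1/2 holding
# probability at each state makes the chain aperiodic).
P = 0.5 * np.eye(3) + 0.25 * (np.roll(np.eye(3), 1, axis=1)
                              + np.roll(np.eye(3), -1, axis=1))
print(mixing_time(P))  # small for this tiny, well-connected chain

For larger chains, computing dense matrix powers is impractical; in practice mixing times are bounded analytically, for example via the spectral gap or coupling arguments, rather than computed exactly as above.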
