User:OdedSchramm/sb2
The basic idea
Consider a finite state irreducible aperiodic Markov chain M with state space S and (unique) stationary distribution π. Suppose that we come up with a probability distribution μ on the set of maps f : S → S with the property that for every fixed s ∈ S, its image f(s) is distributed according to the transition probability of M from state s. An example of such a probability distribution is the one where f(s) is independent from f(s′) whenever s ≠ s′, but it is often worthwhile to consider other distributions. Now let f_j for j = −1, −2, … be independent samples from this distribution on maps f.
Suppose that x is chosen randomly according to π and is independent from the sequence f_j. (We do not worry for now where this x is coming from.) Then f_{−1}(x) is also distributed according to π, because π is M-stationary and by our assumption on the law of f. Define

F_n := f_{−1} ∘ f_{−2} ∘ ⋯ ∘ f_{−n}.

Then it follows by induction that F_n(x) is also distributed according to π for every n ∈ ℕ. Now here is the main point. It may happen that for some n ∈ ℕ the image of the map F_n is a single element of S. In other words, F_n(x) = F_n(y) for each y ∈ S. Therefore, we do not need to have access to x in order to compute F_n(x). The algorithm then involves finding some n ∈ ℕ such that F_n(S) is a singleton, and outputting the element of that singleton. The design of a good distribution μ for which the task of finding such an n and computing F_n is not too costly is not always obvious, but it has been accomplished successfully in several important instances[citation needed].
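As a toy sketch of the scheme above (the two-state chain, function names, and the grow-by-one schedule are illustrative assumptions, not part of the original text), one can sample maps f_{−1}, f_{−2}, … with f(s) drawn independently for each s, and extend the composition into the past until its image is a singleton:

```python
import random

def cftp(states, transition, seed=0):
    """Sketch of coupling from the past: sample random maps
    f_{-1}, f_{-2}, ... (here with f(s) independent across states s)
    and compose F_n = f_{-1} o f_{-2} o ... o f_{-n} until the image
    F_n(S) is a single state, which is then returned."""
    rng = random.Random(seed)
    maps = []  # maps[k] holds the sampled map f_{-(k+1)}
    while True:
        # go one step further into the past
        # (practical implementations grow n geometrically instead)
        maps.append({s: transition(s, rng) for s in states})
        image = set(states)
        for f in reversed(maps):  # apply f_{-n} first, f_{-1} last
            image = {f[s] for s in image}
        if len(image) == 1:
            return image.pop()

# toy chain on {0, 1}: jump to the other state with probability 1/2
def flip_step(s, rng):
    return 1 - s if rng.random() < 0.5 else s

print(cftp([0, 1], flip_step))  # an exact sample from the stationary law
```

Because the maps are reused rather than resampled, the output is an exact (unbiased) draw from π, which is the whole point of the method.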
The monotone case
There is a special class of Markov chains in which there are particularly good choices for μ and a tool for determining if |F_n(S)| = 1. (Here |·| denotes cardinality.) Suppose that S is a partially ordered set with order ≤, which has a unique minimal element s_0 and a unique maximal element s_1; that is, every s ∈ S satisfies s_0 ≤ s ≤ s_1. Also, suppose that μ may be chosen to be supported on the set of monotone maps f : S → S. Then it is easy to see that |F_n(S)| = 1 if and only if F_n(s_0) = F_n(s_1), since F_n is monotone. Thus, checking this becomes rather easy. The algorithm can proceed by choosing n := n_0 for some constant n_0, sampling the maps f_{−1}, …, f_{−n_0}, and outputting F_n(s_0) if F_n(s_0) = F_n(s_1). If F_n(s_0) ≠ F_n(s_1), the algorithm proceeds by doubling n and repeating as necessary until an output is obtained. (But the algorithm does not resample the maps f_{−j} which were already sampled; it uses the previously sampled maps when needed.)
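A minimal sketch of this doubling scheme (the example chain on {0, …, 4} and all names are my own illustrative assumptions): only the two trajectories started from s_0 and s_1 are tracked, n doubles on failure, and already-sampled maps are kept.

```python
import random

def monotone_cftp(s_min, s_max, sample_map, seed=0):
    """Monotone CFTP sketch: since F_n is monotone, F_n(S) is a singleton
    iff F_n(s_min) == F_n(s_max), so only two trajectories are tracked.
    n is doubled until they coalesce; old maps are reused, not resampled."""
    rng = random.Random(seed)
    maps = []  # maps[k] is the sampled monotone update f_{-(k+1)}
    n = 1
    while True:
        while len(maps) < n:              # extend further into the past,
            maps.append(sample_map(rng))  # keeping previously sampled maps
        lo, hi = s_min, s_max
        for f in reversed(maps):          # apply f_{-n} first, f_{-1} last
            lo, hi = f(lo), f(hi)
        if lo == hi:
            return lo                     # F_n(S) is a singleton
        n *= 2

# monotone updates on the chain {0, ..., 4}: one shared coin moves
# every state one step up (capped at 4) or one step down (capped at 0)
def walk_map(rng):
    if rng.random() < 0.5:
        return lambda s: min(s + 1, 4)
    return lambda s: max(s - 1, 0)

print(monotone_cftp(0, 4, walk_map))
```

The shared coin is what makes each sampled map monotone, so tracking the extreme states s_min and s_max suffices to certify coalescence of the whole state space.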
An example
A simple example to illustrate the method is the following. (A more general example of the same flavor can be found in Propp and Wilson (1998).) Suppose that G is a reasonably large but finite graph. We consider a Markov chain M whose state space S is the collection of independent sets in V = V(G). In other words, each s ∈ S is a set of vertices of G where no edge of G has both endpoints in s. The transitions of M are as follows. Starting from s ∈ S, one picks a vertex v ∈ V uniformly at random and picks a number t, which is 1 with probability p and −1 with probability 1 − p, independently of v. Here, p ∈ (0, 1) is some constant. If t = −1, the chain moves from s to s ∖ {v}. If t = 1 and s ∪ {v} is an independent set, the chain moves to s ∪ {v}. Otherwise, the chain stays in s. This is one move. If the graph G is reasonably large and with not too many edges, then the state space S is so large that a direct calculation of the stationary distribution is out of reach.
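The move just described can be written down directly. In the following sketch the path graph 0–1–2 and all names are illustrative assumptions; only the transition rule itself comes from the text above.

```python
import random

def independent_set_step(s, vertices, edges, p, rng):
    """One move of the chain described above: pick a uniform vertex v and
    a coin t (+1 with probability p, -1 otherwise); t = -1 removes v,
    t = +1 adds v provided the result is still an independent set."""
    v = rng.choice(vertices)
    t = 1 if rng.random() < p else -1
    if t == -1:
        return s - {v}
    if all((v, u) not in edges and (u, v) not in edges for u in s):
        return s | {v}
    return s  # adding v would put both endpoints of an edge in s

# run the chain on the path graph 0 - 1 - 2
vertices = [0, 1, 2]
edges = {(0, 1), (1, 2)}
rng = random.Random(0)
s = set()
for _ in range(1000):
    s = independent_set_step(s, vertices, edges, p=0.5, rng=rng)
print(sorted(s))  # some independent set of the path
```

Every state reachable this way is an independent set, since vertices are only ever added when the independence condition is verified first.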


