The basic idea

Consider a finite-state irreducible aperiodic Markov chain M with state space S and (unique) stationary distribution π. Suppose that we come up with a probability distribution μ on the set of maps f:S\to S with the property that for every fixed s\in S, its image f(s) is distributed according to the transition probability of M from state s. An example of such a probability distribution is the one where f(s) is independent of f(s') whenever s\ne s', but it is often worthwhile to consider other distributions. Now let f_j for j\in\mathbb Z be independent samples from this distribution on maps.
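
To make this concrete, here is a minimal Python sketch of the independent choice of μ just described, assuming the chain is encoded as a row-stochastic transition matrix P over the states 0,\dots,|S|-1 (the function name sample_map and this encoding are illustrative assumptions, not part of the construction itself):

import random

def sample_map(P):
    """Sample a random map f: S -> S in which, independently for each
    state s, the image f(s) is drawn from the transition row P[s]."""
    n = len(P)
    return [random.choices(range(n), weights=P[s])[0] for s in range(n)]

# Example: a small irreducible aperiodic 3-state chain.
P = [[0.5,  0.5,  0.0 ],
     [0.25, 0.5,  0.25],
     [0.0,  0.5,  0.5 ]]
f = sample_map(P)  # f[s] is the image of state s under the sampled map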

Suppose that x is chosen randomly according to π and is independent of the sequence (f_j). (We do not worry for now about where this x is coming from.) Then f_{-1}(x) is also distributed according to π, because π is M-stationary and by our assumption on the law of f. Define

F_j := f_{-1}\circ f_{-2}\circ\cdots\circ f_{-j}.

Then it follows by induction that F_j(x) is also distributed according to π for every j\in\mathbb N. Now here is the main point. It may happen that for some n\in\mathbb N the image of the map F_n is a single element of S. In other words, F_n(x) = F_n(y) for every y\in S. Therefore, we do not need to have access to x in order to compute F_n(x). The algorithm then involves finding some n\in\mathbb N such that the image F_n(S) is a singleton, and outputting the element of that singleton. The design of a good distribution μ for which the task of finding such an n and computing F_n is not too costly is not always obvious, but it has been accomplished successfully in several important instances[citation needed].
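
As an illustration, the following Python sketch (reusing sample_map from the sketch above) implements this search for n in the simplest possible way, extending the sequence of maps one step further into the past at a time; since it tracks the image of every state, it is practical only for small S and is meant to exhibit the logic rather than to be efficient:

def cftp(P):
    """Return one exact sample from the stationary distribution pi."""
    n_states = len(P)
    maps = []  # maps[j-1] stores f_{-j}; once sampled, never resampled
    while True:
        maps.append(sample_map(P))  # sample f_{-n} for the new, larger n
        # Compute F_n = f_{-1} o f_{-2} o ... o f_{-n} on all of S.
        images = list(range(n_states))
        for f in reversed(maps):    # apply f_{-n} first, f_{-1} last
            images = [f[s] for s in images]
        if len(set(images)) == 1:   # F_n(S) is a singleton
            return images[0]        # output its unique element

Each call to cftp(P) returns a state whose distribution is exactly π, with none of the burn-in error incurred by simply running the chain forward for a long but finite time.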

The monotone case

There is a special class of Markov chains in which there are particularly good choices for μ and a tool for determining whether |F_n(S)| = 1. (Here |\cdot| denotes cardinality.) Suppose that S is a partially ordered set with order \le, which has a unique minimal element s_0 and a unique maximal element s_1; that is, every s\in S satisfies s_0\le s\le s_1. Also, suppose that μ may be chosen to be supported on the set of monotone maps f:S\to S. Then it is easy to see that |F_n(S)| = 1 if and only if F_n(s_0) = F_n(s_1): since F_n is monotone, s_0\le s\le s_1 implies F_n(s_0)\le F_n(s)\le F_n(s_1) for every s\in S. Thus, checking for coalescence becomes rather easy. The algorithm can proceed by choosing n := n_0 for some constant n_0, sampling the maps f_{-1},\dots,f_{-n}, and outputting F_n(s_0) if F_n(s_0) = F_n(s_1). If F_n(s_0)\ne F_n(s_1), the algorithm proceeds by doubling n and repeating as necessary until an output is obtained. (However, the algorithm does not resample the maps f_j which were already sampled; it uses the previously sampled maps when needed.)
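
A Python sketch of the monotone algorithm follows. Here random_update(s, u) is an assumed user-supplied update rule: for each fixed value of the randomness u it plays the role of one map f, and it is required to be monotone in s. The doubling schedule and the reuse of previously drawn randomness follow the description above; all names are illustrative:

import random

def monotone_cftp(random_update, s0, s1, n0=1):
    """Monotone coupling from the past: only the trajectories started
    at the minimal state s0 and the maximal state s1 are tracked,
    since F_n(s0) = F_n(s1) forces |F_n(S)| = 1 by monotonicity."""
    seeds = []  # seeds[j-1] drives f_{-j}; maps are never resampled
    n = n0
    while True:
        while len(seeds) < n:
            seeds.append(random.random())
        lo, hi = s0, s1
        for u in reversed(seeds[:n]):  # apply f_{-n} first, ..., f_{-1} last
            lo = random_update(lo, u)
            hi = random_update(hi, u)
        if lo == hi:
            return lo  # an exact sample from pi
        n *= 2  # go twice as far into the past and try again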

An example

A simple example to illustrate the method is the following. (A more general example of the same flavor can be found in Propp and Wilson (1998).) Suppose that G is a reasonably large but finite graph. We consider a Markov chain M whose state space S is the collection of independent sets in V = V(G). In other words, each s\in S is a set of vertices of G such that no edge of G has both endpoints in s. The transitions of M are as follows. Starting from s\in S, one picks a vertex v\in V uniformly at random and picks a number t, which is 1 with probability p and -1 with probability 1-p, independently of v. Here, p\in(0,1) is some constant. If t = -1, the chain moves from s to s\setminus\{v\}. If t = 1 and s\cup\{v\}\in S, the chain moves to s\cup\{v\}. Otherwise, the chain stays at s. This is one move. If the graph G is reasonably large and does not have too many edges, then the state space S is so large that a direct calculation of the stationary distribution is out of reach.
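
One move of this chain is easy to code. The following Python sketch represents a state as a frozenset of vertices and the graph as a dictionary mapping each vertex to its set of neighbours; the names one_move, G, and p mirror the text above, while the data encoding is an assumption made for the sake of the example:

import random

def one_move(s, G, p):
    """One transition of the chain on independent sets of G.
    s: current independent set (a frozenset of vertices),
    G: adjacency dict, G[v] = set of neighbours of v,
    p: probability that t = 1 (an insertion attempt)."""
    v = random.choice(list(G))             # vertex chosen uniformly at random
    if random.random() < p:                # t = 1: try to add v
        if all(w not in s for w in G[v]):  # s with v added must stay independent
            return s | {v}
        return s                           # blocked by a neighbour: stay at s
    return s - {v}                         # t = -1: remove v (no-op if v not in s)

# Example: the 4-cycle on vertices 0, 1, 2, 3.
G = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
s = frozenset()
for _ in range(1000):
    s = one_move(s, G, p=0.5)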