Among Markov chain Monte Carlo (MCMC) algorithms, coupling from the past is a method for sampling from the stationary distribution of a Markov chain. Unlike many MCMC algorithms, coupling from the past gives, in principle, a perfect sample from the stationary distribution. It was invented by James Propp and David Wilson in 1996.

The basic idea

Consider a finite-state irreducible aperiodic Markov chain M with state space S and (unique) stationary distribution π (π is a probability vector). Suppose that we come up with a probability distribution μ on the set of maps f : S → S with the property that for every fixed s ∈ S, its image f(s) is distributed according to the transition probability of M from state s. An example of such a probability distribution is the one where f(s) is independent from f(s′) whenever s ≠ s′, but it is often worthwhile to consider other distributions. Now let f_j for j = −1, −2, … be independent samples from μ.
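As a concrete illustration, here is a minimal Python sketch of the "independent coordinates" choice of μ; the transition matrix P and the helper sample_map are hypothetical names introduced here, not part of the original article. For a small chain on three states, each value f(s) is drawn from the s-th row of the transition matrix, independently over s.

```python
import random

# Hypothetical example: a small irreducible, aperiodic chain on states {0, 1, 2},
# given by a row-stochastic transition matrix P.
P = [
    [0.50, 0.50, 0.00],
    [0.25, 0.50, 0.25],
    [0.00, 0.50, 0.50],
]

def sample_map(rng=random):
    """Draw one random map f : S -> S from mu, returned as a list with f[s] the image of s.

    Each coordinate f[s] is sampled from the transition probabilities P[s],
    independently across states -- the "independent coordinates" distribution.
    """
    return [rng.choices(range(len(P)), weights=P[s])[0] for s in range(len(P))]
```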

Suppose that x is chosen randomly according to π and is independent from the sequence (f_j). (We do not worry for now where this x is coming from.) Then f_{-1}(x) is also distributed according to π, because π is stationary for M and by our assumption on the law of f_{-1}. Define

  F_n := f_{-1} ∘ f_{-2} ∘ ⋯ ∘ f_{-n}.

Then it follows by induction that F_n(x) is also distributed according to π for every n ∈ ℕ. However, it may happen that for some n ∈ ℕ the image of the map F_n is a single element of S. In other words, F_n(x) = F_n(y) for every y ∈ S. Therefore, we do not need to have access to x in order to compute F_n(x). The algorithm then involves finding some n ∈ ℕ such that the image of F_n is a singleton, and outputting the element of that singleton. The design of a good distribution μ for which the task of finding such an n and computing F_n is not too costly is not always obvious, but has been accomplished successfully in several important instances.[1]
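Continuing the hypothetical sketch above (it reuses sample_map and the three-state chain), the following naive version of the procedure samples the maps f_{-1}, f_{-2}, … one at a time, composes them into F_n, and stops as soon as the image of F_n is a singleton; the returned state is then a draw from π.

```python
def cftp(sample_map, states, rng=random):
    """Naive coupling from the past: extend the past one map at a time."""
    maps = []                                 # maps[j] holds f_{-(j+1)}; never resampled
    while True:
        maps.append(sample_map(rng))          # sample the next map f_{-n}
        image = list(states)                  # apply F_n = f_{-1} o ... o f_{-n} to all of S
        for f in reversed(maps):              # f_{-n} is applied first, f_{-1} last
            image = [f[s] for s in image]
        if len(set(image)) == 1:              # the image of F_n is a single state
            return image[0]                   # a perfect sample from pi

# Example usage: one exact draw from the stationary distribution of P.
# sample = cftp(sample_map, range(len(P)))
```

Note that the previously sampled maps are kept and reused as the procedure reaches further into the past; resampling them would bias the output.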

The monotone case

There is a special class of Markov chains in which there are particularly good choices for μ and a tool for determining whether |F_n(S)| = 1. (Here |·| denotes cardinality.) Suppose that S is a partially ordered set with order ≤, which has a unique minimal element s_0 and a unique maximal element s_1; that is, every s ∈ S satisfies s_0 ≤ s ≤ s_1. Also, suppose that μ may be chosen to be supported on the set of monotone maps f : S → S. Then it is easy to see that |F_n(S)| = 1 if and only if F_n(s_0) = F_n(s_1), since F_n is monotone. Thus, checking this becomes rather easy. The algorithm can proceed by choosing n = n_0 for some constant n_0, sampling the maps f_{-1}, …, f_{-n_0}, and outputting F_n(s_0) if F_n(s_0) = F_n(s_1). If F_n(s_0) ≠ F_n(s_1), the algorithm proceeds by doubling n and repeating as necessary until an output is obtained. (But the algorithm does not resample the maps f_{-j} which were already sampled; it uses the previously sampled maps when needed.)
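As an illustration, here is a minimal Python sketch of the monotone case with the doubling strategy; the reflecting random walk and all names are hypothetical choices made for this example, not taken from the article. The chain is a random walk on {0, …, K−1}, μ puts probability 1/2 on each of the two monotone maps s ↦ min(s+1, K−1) and s ↦ max(s−1, 0), and only the two trajectories started from the minimal and maximal elements are tracked.

```python
import random

K = 8  # states 0, ..., K-1; 0 is the minimal element, K - 1 the maximal element

def apply_map(coin, s):
    """The random map encoded by one coin flip: all states move up, or all move down."""
    return min(s + 1, K - 1) if coin else max(s - 1, 0)

def monotone_cftp(rng=random):
    coins = []                                # coins[j] encodes f_{-(j+1)}; never resampled
    n = 1                                     # initial depth n_0
    while True:
        while len(coins) < n:                 # sample only the maps not seen before
            coins.append(rng.random() < 0.5)
        lo, hi = 0, K - 1                     # bounding trajectories started at time -n
        for coin in reversed(coins[:n]):      # apply f_{-n} first, f_{-1} last
            lo, hi = apply_map(coin, lo), apply_map(coin, hi)
        if lo == hi:                          # F_n(s_0) = F_n(s_1): image is a singleton
            return lo                         # exact draw from the stationary distribution
        n *= 2                                # otherwise double n and repeat
```

Because n doubles on every failed pass, the total work is only a constant factor larger than the work of the final, successful pass.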

References

  1. "Web Site for Perfectly Random Sampling with Markov Chains".
  • Propp, James Gary; Wilson, David Bruce (1996), "Exact sampling with coupled Markov chains and applications to statistical mechanics", Proceedings of the Seventh International Conference on Random Structures and Algorithms (Atlanta, GA, 1995), pp. 223–252, MR 1611693
  • Propp, James; Wilson, David (1998), "Coupling from the past: a user's guide", Microsurveys in discrete probability (Princeton, NJ, 1997), DIMACS Ser. Discrete Math. Theoret. Comput. Sci., vol. 41, Providence, R.I.: American Mathematical Society, pp. 181–192, doi:10.1090/dimacs/041/09, ISBN 9780821808276, MR 1630414, S2CID 2781385