Stopping circle.py



Context

This page is part of my 2024 Beg Rohu Lectures on "The second Markov chain revolution" at the Summer School "Concepts and Methods of Statistical Physics" (3 - 15 June 2024).

In Lecture 5 of these lectures, we continue our exploration of Markov-chain Monte Carlo algorithms that sample perfectly from a given distribution. In Lecture 2, we already encountered such beautiful algorithms for card shuffling and for Diffusion_CFTP.py; in Lecture 5, we encounter more general perfect-sampling algorithms for hard spheres and for the Ising model.

All this is nice, but isn't there a simple perfect-sampling algorithm that one can actually understand, say, for a single particle on a one-dimensional graph? Here it is, for a particle on an N-cycle (a path graph with N+1 vertices 0, 1, 2, ..., N and periodic boundary conditions, so that vertex N is identified with vertex 0): Start at vertex 0. With probability 1/N, output this vertex. Otherwise, perform a random walk, moving at each step with probability 1/2 to the left and with probability 1/2 to the right, and keep track of the vertices that have not yet been visited. Output the vertex that is visited last. The vertices output by this procedure are uniformly distributed on the cycle, as the short computation below shows.
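A short computation makes this plausible. Assuming the known fact (see the reference to Lovász and Winkler below) that for a random walk on the cycle the last new vertex is uniformly distributed among the N-1 vertices other than the starting one, each vertex is output with probability 1/N:

\[
P(\text{output}=0)=\frac{1}{N},\qquad
P(\text{output}=k)=\Bigl(1-\frac{1}{N}\Bigr)\,\frac{1}{N-1}=\frac{1}{N}
\qquad (k=1,\dots,N-1).
\]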

Python program

What was explained above is easily expressed as a tiny Python program:

import random

N_trials = 100000                      # number of independent samples
N = 6                                  # number of vertices on the cycle
data = [0] * N                         # empirical probability of each vertex
for trial in range(N_trials):
    x = 0                              # the walk starts at vertex 0
    if random.uniform(0.0, 1.0) < 1.0 / N:
        data[x] += 1.0 / N_trials      # with probability 1/N, output the start vertex
    else:
        NotVisited = set(range(N))     # vertices not yet visited by the walk
        NotVisited.discard(x)
        while len(NotVisited) > 0:     # walk until every vertex has been visited
            sigma = random.choice([-1, 1])
            x = (x + sigma) % N
            NotVisited.discard(x)
        data[x] += 1.0 / N_trials      # output the last vertex to be visited
print('stopping samples')
for k in range(N):
    print('site = ', k, ' probability = ', data[k])
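For reuse (for example, to draw individual samples at different values of N), the core of the program can be wrapped into a function that returns a single vertex. The following is a minimal sketch, with the function name sample_uniform_cycle chosen purely for illustration:

import random

def sample_uniform_cycle(N):
    """Return one vertex of the N-cycle, drawn uniformly (sketch)."""
    x = 0
    if random.uniform(0.0, 1.0) < 1.0 / N:
        return x                       # with probability 1/N, output the start vertex
    NotVisited = set(range(N))
    NotVisited.discard(x)
    while NotVisited:                  # walk until every vertex has been visited
        x = (x + random.choice([-1, 1])) % N
        NotVisited.discard(x)
    return x                           # the last vertex to be visited

# Example usage: draw ten samples for N = 6
print([sample_uniform_cycle(6) for _ in range(10)])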


Output

Here is the output of the above Python program for N = 6. With 100,000 trials, the six vertices are output with equal probability, up to statistical fluctuations.

site =  0  probability =  0.1673
site =  1  probability =  0.1671
site =  2  probability =  0.1645
site =  3  probability =  0.1657
site =  4  probability =  0.1656
site =  5  probability =  0.1696
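To check that these frequencies are compatible with the uniform distribution, one can compute a chi-square statistic from the corresponding counts. The following is a minimal sketch using the numbers printed above; the 5% threshold for 5 degrees of freedom quoted in the comment is the standard tabulated value:

# observed probabilities for N = 6, N_trials = 100000 (from the output above)
probs = [0.1673, 0.1671, 0.1645, 0.1657, 0.1656, 0.1696]
N_trials = 100000
counts = [p * N_trials for p in probs]
expected = N_trials / len(probs)       # uniform distribution: N_trials / N per vertex
chi2 = sum((c - expected) ** 2 / expected for c in counts)
print('chi-square statistic =', chi2)
# For 5 degrees of freedom, values below ~11.07 are compatible with
# the uniform distribution at the 5% level.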

Further Information

  • What works like a charm for the random walk on the cycle fails for all other graphs, with the exception of the complete graph (see Lovász and Winkler (1993), and the sketch after this list).
  • The coupling from the past approach discussed in Lecture 2 is much more robust.
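To illustrate the first point, here is a minimal sketch of the same last-new-vertex statistics on a graph that is not a cycle. The four-vertex path graph and the starting vertex are chosen here purely as an example:

import random

# Adjacency list of a path graph 0 - 1 - 2 - 3 (not a cycle); the walk starts at vertex 1.
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
N_trials = 100000
freq = {k: 0.0 for k in neighbors}
for trial in range(N_trials):
    x = 1
    NotVisited = set(neighbors) - {x}
    while NotVisited:                  # walk until every vertex has been visited
        x = random.choice(neighbors[x])
        NotVisited.discard(x)
    freq[x] += 1.0 / N_trials          # record the last new vertex
print(freq)

Vertex 2, which separates the starting vertex from vertex 3, can never be the last new vertex, and a gambler's-ruin argument gives probabilities of about 1/3 for vertex 0 and 2/3 for vertex 3, so the frequencies come out far from uniform.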

References

  • Lovász, L., Winkler, P., On the last new vertex visited by a random walk, J. Graph Theory 17, 593 (1993)
  • Levin, D. A., Peres, Y., Wilmer, E. L., Markov Chains and Mixing Times (American Mathematical Society, 2008)