Metropolis harmonic.py
From Werner KRAUTH
Context
This page is part of my public lectures on Algorithms and Computations in theoretical physics at the University of Oxford (see the announcement for a syllabus). For more information, see the main page of the public lectures. It presents the Metropolis Monte Carlo algorithm, invented by Metropolis et al. (1953), applied here to the harmonic model. For an in-depth discussion, see the reference below. Other programs in the same series are the Hamiltonian Monte Carlo algorithm and the Lévy construction, both for the harmonic model.
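The Metropolis rule accepts a proposed move from x to x_new with probability min(1, exp(-ΔU)), which leaves the Boltzmann distribution exp(-U) stationary. As a minimal stand-alone illustration (not part of the lecture program below), the following sketch applies the rule to a single variable with U(x) = x²/2, so that the samples should reproduce the mean and variance of a standard Gaussian; the step size delta and the seed are arbitrary choices:

```python
import math, random

random.seed(1)
delta = 1.0          # half-width of the uniform proposal
x = 0.0
samples = []
for i in range(100000):
    x_new = x + random.uniform(-delta, delta)
    # Metropolis acceptance for the target exp(-x^2 / 2)
    if random.uniform(0.0, 1.0) < math.exp(-(x_new ** 2 - x ** 2) / 2.0):
        x = x_new
    samples.append(x)
mean = sum(samples) / len(samples)
var = sum(s ** 2 for s in samples) / len(samples) - mean ** 2
print(mean, var)     # should be close to 0 and 1
```

Rejected moves count as samples of the old position; dropping them would bias the estimates.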
Python program
import math, random

def U(x):  # total energy of the periodic harmonic chain
    U = 0.0
    for k in range(N):
        k_minus = (k - 1) % N
        x_minus = x[k_minus]
        if k == 0: x_minus -= L
        U += (x[k] - x_minus) ** 2 / 2.0
    return U

N = 8
L = 16
delta = 1.0
x = [L / N * k for k in range(N)]
Iter = 1000000
for iter in range(Iter):
    k = random.randint(0, N - 1)                  # random slice
    k_plus = (k + 1) % N
    k_minus = (k - 1) % N
    x_plus = x[k_plus]
    if k == N - 1: x_plus += L                    # periodic boundary: x_N = x_0 + L
    x_minus = x[k_minus]
    if k == 0: x_minus -= L                       # periodic boundary: x_{-1} = x_{N-1} - L
    x_new = x[k] + random.uniform(-delta, delta)  # new position at slice k
    U_k = ((x_plus - x[k]) ** 2 + (x[k] - x_minus) ** 2) / 2.0
    U_kp = ((x_plus - x_new) ** 2 + (x_new - x_minus) ** 2) / 2.0
    if random.uniform(0.0, 1.0) < math.exp(- (U_kp - U_k)): x[k] = x_new
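One way to check the sampling is to track the running average of the chain energy. With the winding boundary condition x_N = x_0 + L, the energy is minimal when all N bonds equal L/N, giving L²/(2N), and the N - 1 independent Gaussian bond modes each contribute 1/2 by equipartition, so the exact equilibrium average is L²/(2N) + (N - 1)/2 = 19.5 for N = 8, L = 16. The following self-contained sketch (shortened run, fixed seed, loose tolerance) repeats the update of the program above and compares the measured mean energy with this value:

```python
import math, random

random.seed(0)
N = 8
L = 16
delta = 1.0
x = [L / N * k for k in range(N)]

def U(x):  # total energy of the periodic harmonic chain
    U = 0.0
    for k in range(N):
        x_minus = x[(k - 1) % N]
        if k == 0: x_minus -= L
        U += (x[k] - x_minus) ** 2 / 2.0
    return U

U_sum = 0.0
Iter = 100000
for it in range(Iter):
    k = random.randint(0, N - 1)
    x_plus = x[(k + 1) % N]
    if k == N - 1: x_plus += L
    x_minus = x[(k - 1) % N]
    if k == 0: x_minus -= L
    x_new = x[k] + random.uniform(-delta, delta)
    # energy change of the two bonds touching slice k
    dU = ((x_plus - x_new) ** 2 + (x_new - x_minus) ** 2
          - (x_plus - x[k]) ** 2 - (x[k] - x_minus) ** 2) / 2.0
    if random.uniform(0.0, 1.0) < math.exp(-dU):
        x[k] = x_new
    U_sum += U(x)
U_mean = U_sum / Iter
U_exact = L ** 2 / (2 * N) + (N - 1) / 2
print(U_mean, U_exact)
```

Because the run starts in the minimum-energy configuration, no burn-in is discarded here; a longer run or an equilibration phase would tighten the agreement.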
Further information
References
Krauth, W., Hamiltonian Monte Carlo vs. event-chain Monte Carlo: an appraisal of sampling strategies beyond the diffusive regime, arXiv:2411.11690