Levy harmonic.py

From Werner KRAUTH


Context

This page is part of my public lectures on Algorithms and Computations in theoretical physics at the University of Oxford (see the announcement for a syllabus). For more information, see the main page of the public lectures. It presents the Hamiltonian Monte Carlo algorithm invented by Duane et al. (1987). For an in-depth discussion, see the reference below. Other programs in the same series are the Metropolis algorithm and the Lévy construction.
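
In the program below, each internal slice x_k is drawn from a conditional Gaussian distribution; with the variance per step set to one (the convention used in the code), its mean and variance are

\[
\langle x_k \rangle = \frac{(N-k)\,x_{k-1} + x_{\mathrm{end}}}{N-k+1},
\qquad
\sigma_k^2 = \frac{N-k}{N-k+1},
\qquad k = 1, \dots, N-1 .
\]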

Python program

import math, random

def U(x):                              # harmonic energy of an N-slice path with
    U = 0.0                            # periodic boundary conditions
    for k in range(N):
        k_minus = (k - 1) % N
        x_minus = x[k_minus]
        if k == 0: x_minus -= L        # the bond closing the chain is shifted by L
        U += (x[k] - x_minus) ** 2 / 2.0
    return U

N = 8                                  # number of slices
L = 16                                 # distance between xstart and xend
delta = 1.0                            # not used in this program

Iter = 1000000                         # number of sampled paths

for iter in range(Iter):
    xstart = random.uniform(0.0, L)    # starting point, uniform between 0 and L
    xend = xstart + L                  # end point, a distance L from xstart
    x = [xstart]
    for k in range(1, N):              # loop over internal slices
        x_mean = ((N - k) * x[k - 1] + xend) / (1.0 + N - k)
        sigma = math.sqrt(1.0 / (1.0 + 1.0 / (N - k)))
        x.append(random.gauss(x_mean, sigma))
    x.append(xend)
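
As a check of the construction, the sampled slices can be compared with the exact bridge statistics: conditioned on xstart and xend = xstart + L, slice k is Gaussian with mean xstart + k L / N and variance k (N - k) / N. The following minimal sketch performs this comparison; the names n_samples and k_test, as well as their values, are illustrative choices.

import math, random

N = 8                                  # number of slices, as above
L = 16                                 # distance between xstart and xend
k_test = N // 2                        # slice whose statistics are checked (illustrative)
n_samples = 100000                     # number of independent paths (illustrative)

data = []
for sample in range(n_samples):
    xstart = 0.0                       # xstart fixed at 0.0 for this check
    xend = xstart + L
    x = [xstart]
    for k in range(1, N):              # same construction as above
        x_mean = ((N - k) * x[k - 1] + xend) / (1.0 + N - k)
        sigma = math.sqrt(1.0 / (1.0 + 1.0 / (N - k)))
        x.append(random.gauss(x_mean, sigma))
    x.append(xend)
    data.append(x[k_test])

mean_emp = sum(data) / float(n_samples)
var_emp = sum((y - mean_emp) ** 2 for y in data) / float(n_samples)
mean_exact = k_test * L / float(N)               # exact bridge mean at slice k_test
var_exact = k_test * (N - k_test) / float(N)     # exact bridge variance at slice k_test
print('mean:    ', mean_emp, ' exact:', mean_exact)
print('variance:', var_emp, ' exact:', var_exact)

For N = 8, L = 16, and k_test = 4, the exact values are a mean of 8 and a variance of 2.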

Further information

References

Krauth, W., Hamiltonian Monte Carlo vs. event-chain Monte Carlo: an appraisal of sampling strategies beyond the diffusive regime, arXiv:2411.11690
