Oxford Lectures 2025

From Werner KRAUTH


My 2025 Public Lectures at the University of Oxford (UK), entitled Algorithms and computations in theoretical physics, run from 21 January 2025 through 11 March 2025.

Lecture 1: 21 January 2025

We start our parallel exploration of physics and of computing with the concept of sampling, the process of producing examples (“samples”) of a probability distribution. In week 1, we consider “direct” sampling (the examples are obtained directly) and, among the many connections to physics, will come across the Maxwell distribution. In 1859, it marked the beginning of the field of statistical physics.

Here are the notes for the first lecture (21 January 2025).
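As a minimal illustration of direct sampling (this sketch is not taken from the lecture notes; the function name and parameters are illustrative), the 3D Maxwell distribution can be sampled directly: each Cartesian velocity component is an independent Gaussian, so no Markov chain is needed.

```python
import math
import random

def direct_maxwell(n_samples, beta=1.0, m=1.0):
    """Directly sample particle speeds from the 3D Maxwell distribution.

    Each Cartesian velocity component is an independent Gaussian of
    variance 1/(beta*m); the speed is the length of the velocity vector.
    Every sample is independent: this is direct sampling."""
    sigma = math.sqrt(1.0 / (beta * m))
    speeds = []
    for _ in range(n_samples):
        vx, vy, vz = (random.gauss(0.0, sigma) for _ in range(3))
        speeds.append(math.sqrt(vx ** 2 + vy ** 2 + vz ** 2))
    return speeds

random.seed(0)
speeds = direct_maxwell(100000)
mean_speed = sum(speeds) / len(speeds)
# for beta = m = 1, the mean speed is sqrt(8 / pi) ~ 1.596
```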


Lecture 2: 28 January 2025

We start with Special Topic A, which supplements the first lecture with two all-important subjects: the meaning of convergence in statistics, and the fundamental usefulness of statistical reasoning. We can discuss them here, in the direct-sampling framework, but they are more generally relevant. The strong law of large numbers, for example, will turn into the famous ergodic theorem for Markov chains.

Here are the notes for Special Topic A (28 January 2025).
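The strong law of large numbers can be watched at work in a few lines (a sketch, not from the notes; the seed and sample count are illustrative): the running average of independent uniform samples converges to the expectation value.

```python
import random

def running_average(n_max, seed=12):
    """Running average of independent uniform samples on [0, 1].

    By the strong law of large numbers, the running average converges
    (almost surely) to the expectation value 1/2 as n grows."""
    random.seed(seed)
    total = 0.0
    averages = []
    for n in range(1, n_max + 1):
        total += random.uniform(0.0, 1.0)
        averages.append(total / n)
    return averages

avgs = running_average(100000)
# avgs[-1] is close to 1/2; early entries fluctuate strongly
```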

Then, in the second lecture proper, we consider Markov-chain sampling, from the adults' game on the Monte Carlo heliport to modern ideas on non-reversibility. We discuss a lot of theory, but also six ten-line pseudo-code algorithms, none of them approximate, and all of them as intricate as they are short.

Here are the notes for Lecture 2.
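In the spirit of the heliport game, one such short algorithm might look as follows (a sketch assuming the standard Metropolis rejection rule; the function name, step size, and trial count are illustrative): a pebble performs a Markov chain in the square, and rejected moves leave it in place, where it is counted again.

```python
import random

def markov_pi(n_trials, delta=0.3, seed=0):
    """Adults' game on the heliport: a Metropolis Markov chain in the
    square [-1, 1] x [-1, 1].  Proposed moves that leave the square are
    rejected, and the pebble stays put.  The fraction of pebbles inside
    the unit circle estimates pi / 4."""
    random.seed(seed)
    x, y = 1.0, 1.0          # start at the clubhouse corner
    n_hits = 0
    for _ in range(n_trials):
        dx = random.uniform(-delta, delta)
        dy = random.uniform(-delta, delta)
        if abs(x + dx) < 1.0 and abs(y + dy) < 1.0:
            x, y = x + dx, y + dy
        if x ** 2 + y ** 2 < 1.0:
            n_hits += 1
    return 4.0 * n_hits / n_trials

estimate = markov_pi(1000000)
# converges to pi, but more slowly than direct sampling,
# because successive samples are correlated
```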

Lecture 3: 04 February 2025

In this third lecture, we consider Markov-chain sampling in an abstract setting in one dimension. We discuss some theory, but also seven ten-line pseudo-code algorithms, none of them approximate, and all of them as intricate as they are short. At the end, we discuss the foundations of statistical mechanics, as seen in a one-dimensional example.

Here are the notes for Lecture 3.
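The one-dimensional setting can be made concrete with a tiny example (a sketch, not one of the seven algorithms from the notes; the three-site chain is illustrative): a pebble hops left or right on three sites, moves off the line are rejected, and iterating the transition matrix drives any starting distribution to the uniform stationary distribution.

```python
def transfer_step(pi_vec):
    """One step of a three-site 1D pebble game: hop left or right with
    probability 1/2 each; moves off the line are rejected, so the pebble
    stays put.  This applies the (symmetric) transition matrix to the
    probability vector (p0, p1, p2)."""
    p0, p1, p2 = pi_vec
    return (0.5 * p0 + 0.5 * p1,   # reach site 0: stay there, or hop from 1
            0.5 * p0 + 0.5 * p2,   # reach site 1: hop in from 0 or from 2
            0.5 * p1 + 0.5 * p2)   # reach site 2: hop from 1, or stay there

pi_vec = (1.0, 0.0, 0.0)           # pebble starts with certainty on site 0
for _ in range(50):
    pi_vec = transfer_step(pi_vec)
# pi_vec approaches the uniform distribution (1/3, 1/3, 1/3),
# with deviations shrinking as (1/2)**n
```

The decay rate (1/2)^n comes from the second-largest eigenvalue of the transition matrix, a theme that returns when discussing convergence of Markov chains.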

Lecture 4: MONDAY 17 February 2025 17h
