SciPost Phys. Lect. Notes 100 (2025)
published 30 September 2025
An introduction to numerical large-deviation sampling is provided. First, direct biasing with a known distribution is explained. As a simple example, the Bernoulli process is used throughout the text. Next, Markov chain Monte Carlo (MCMC) simulations are introduced; in particular, the Metropolis-Hastings algorithm is explained. As a first implementation of MCMC, sampling of the plain Bernoulli model is shown. Next, an exponential bias is used for the same model, which allows one to obtain the tails of the distribution of a measurable quantity. This approach is generalized to MCMC simulations where the states are vectors of $U(0,1)$ random entries, which allows one to use the exponential or any other bias to access the large-deviation properties of rather arbitrary random processes. Finally, some recent research applications to more complex models are discussed.
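The combination of Metropolis-Hastings sampling and an exponential bias mentioned above can be illustrated for the Bernoulli process. The following minimal sketch (all function names, parameter names, and default values are illustrative choices, not taken from the text) samples sequences of $n$ Bernoulli($p$) entries with an extra weight $e^{\theta K}$, where $K$ is the number of ones; $\theta > 0$ pushes the chain into the right tail of the distribution of $K$, $\theta < 0$ into the left tail, and $\theta = 0$ recovers plain sampling.

```python
import math
import random

def mcmc_bernoulli_bias(n=50, p=0.5, theta=0.0, steps=20000, seed=1):
    """Metropolis-Hastings sampling of a Bernoulli(p) sequence, biased
    by exp(theta * K) with K = number of ones (illustrative sketch)."""
    rng = random.Random(seed)
    # initial state: n independent Bernoulli(p) entries
    x = [1 if rng.random() < p else 0 for _ in range(n)]
    k = sum(x)
    ks = []  # time series of K, to be histogrammed and reweighted
    for _ in range(steps):
        i = rng.randrange(n)      # propose flipping one random entry
        dk = 1 - 2 * x[i]         # change of K if the flip is accepted
        # acceptance ratio: Bernoulli weight ratio times bias ratio
        ratio = (p / (1 - p)) ** dk * math.exp(theta * dk)
        if rng.random() < min(1.0, ratio):
            x[i] ^= 1             # accept: flip the entry
            k += dk
        ks.append(k)
    return ks
```

From the recorded values of $K$, an estimate of the unbiased distribution in the sampled region is obtained by reweighting each count with $e^{-\theta K}$, which is the standard way such an exponential bias is undone.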