Introduction: Bayesian Inference

At the heart of Bayesian statistics lies Bayes’ theorem, which describes how we update our beliefs about parameters θ given observed data D:

\[P(\theta \mid D) = \frac{P(D \mid \theta) \cdot P(\theta)}{P(D)}\]

Where:

- P(θ | D) is the posterior: our updated belief about θ after observing the data,
- P(D | θ) is the likelihood of the observed data under parameters θ,
- P(θ) is the prior: our belief about θ before seeing the data,
- P(D) is the evidence (marginal likelihood), the normalizing constant.

The Challenge

The denominator P(D) requires integrating over all possible parameter values:

\[P(D) = \int P(D \mid \theta) \cdot P(\theta) \, d\theta\]

For most real-world problems this integral is intractable: it has no closed-form solution and becomes prohibitively expensive to approximate numerically as the dimension of θ grows. This is where sampling algorithms come in.

Why We Need Samplers

Instead of computing the posterior distribution directly, samplers generate representative samples from P(θ | D). With enough samples, we can:

- estimate posterior means, variances, and credible intervals,
- approximate the expectation of any function of θ under the posterior,
- make predictions that average over parameter uncertainty,
- visualize marginal distributions of individual parameters.

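As a minimal sketch of that idea (the posterior samples here are synthetic stand-ins; with a real sampler they would come from one of the algorithms below), the usual Monte Carlo estimates are one-liners:

```python
import numpy as np

# Stand-in for samples of θ produced by a sampler; here we fake them
# with draws from a normal distribution just to show the estimators.
rng = np.random.default_rng(0)
samples = rng.normal(loc=1.5, scale=0.3, size=10_000)

posterior_mean = samples.mean()                          # E[θ | D]
posterior_sd = samples.std()                             # posterior standard deviation
ci_low, ci_high = np.quantile(samples, [0.025, 0.975])   # 95% credible interval

# Expectation of any function g(θ) under the posterior, e.g. g(θ) = θ²
expected_g = np.mean(samples**2)

print(f"mean={posterior_mean:.3f}, sd={posterior_sd:.3f}, "
      f"95% CI=({ci_low:.3f}, {ci_high:.3f}), E[θ²]={expected_g:.3f}")
```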
📌 Available Samplers

This page gathers different sampling algorithms I experiment with, mostly in the context of Bayesian inference and high-dimensional problems. Each sampler has its own dedicated page with interactive visualizations, implementation details, and diagnostics.


🔹 Markov Chain Monte Carlo (MCMC)

A foundational Metropolis–Hastings sampler that forms the basis for understanding modern sampling methods. This implementation serves as a reference for exploring fundamental concepts like:

- proposal distributions and acceptance probabilities,
- burn-in, convergence diagnostics, and trace plots,
- autocorrelation and effective sample size.

While not the most efficient for complex posteriors, MCMC remains invaluable for building intuition and as a diagnostic baseline.
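The core update is only a few lines. Here is a rough sketch of a random-walk Metropolis–Hastings sampler (assuming a user-supplied log_posterior callable and a symmetric Gaussian proposal; this is an illustration, not the exact code behind the linked page):

```python
import numpy as np

def metropolis_hastings(log_posterior, theta0, n_steps=10_000, step_size=0.5, seed=0):
    """Random-walk Metropolis–Hastings with a symmetric Gaussian proposal."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    log_p = log_posterior(theta)
    chain, n_accept = [], 0
    for _ in range(n_steps):
        proposal = theta + step_size * rng.normal(size=theta.shape)
        log_p_prop = log_posterior(proposal)
        # Accept with probability min(1, P(proposal)/P(current)); the proposal
        # is symmetric, so the Hastings correction cancels.
        if np.log(rng.uniform()) < log_p_prop - log_p:
            theta, log_p = proposal, log_p_prop
            n_accept += 1
        chain.append(theta.copy())
    return np.array(chain), n_accept / n_steps

# Example: sample a standard 2-D Gaussian posterior.
chain, accept_rate = metropolis_hastings(lambda t: -0.5 * np.sum(t**2), theta0=[3.0, -3.0])
```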

➡️ Explore the MCMC sampler
Tags: Bayesian inference, Metropolis-Hastings, diagnostics, autocorrelation


🔹 Hamiltonian Monte Carlo (HMC)

A gradient-based sampler that treats sampling as a physics simulation problem.

HMC leverages Hamiltonian dynamics to propose distant states with high acceptance probability, making it particularly effective for:

- high-dimensional posteriors,
- strongly correlated parameters,
- targets where random-walk proposals mix too slowly.

By simulating the motion of a particle with momentum through the posterior landscape, HMC can traverse the distribution much more efficiently than random-walk methods.
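Concretely, one HMC transition samples a momentum, simulates the dynamics with a leapfrog integrator, and then applies a Metropolis accept/reject on the change in total energy. A rough sketch (assuming theta is a NumPy array and that hypothetical log_posterior / grad_log_posterior callables are available):

```python
import numpy as np

def hmc_step(log_posterior, grad_log_posterior, theta, step_size=0.1, n_leapfrog=20, rng=None):
    """One HMC transition: sample momentum, run leapfrog dynamics, accept/reject."""
    rng = rng or np.random.default_rng()
    momentum = rng.normal(size=theta.shape)

    # Hamiltonian = potential energy (-log posterior) + kinetic energy.
    def hamiltonian(q, p):
        return -log_posterior(q) + 0.5 * np.sum(p**2)

    q, p = theta.copy(), momentum.copy()
    h_start = hamiltonian(q, p)

    # Leapfrog: half momentum step, alternating full position/momentum steps,
    # closing half momentum step.
    p += 0.5 * step_size * grad_log_posterior(q)
    for _ in range(n_leapfrog - 1):
        q += step_size * p
        p += step_size * grad_log_posterior(q)
    q += step_size * p
    p += 0.5 * step_size * grad_log_posterior(q)

    # Metropolis accept/reject on the change in total energy.
    if np.log(rng.uniform()) < h_start - hamiltonian(q, p):
        return q        # accepted: move to the proposed point
    return theta        # rejected: stay where we are
```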

➡️ Explore the HMC sampler
Tags: HMC, gradients, high-dimensional inference, Hamiltonian dynamics


🔹 Nested Sampling

A sampler that targets the Bayesian evidence P(D) directly. It maintains a population of "live points" drawn from the prior and repeatedly replaces the lowest-likelihood point with a new prior draw at higher likelihood, shrinking the prior volume step by step and accumulating the evidence along the way. Posterior samples come out as a by-product, which makes it well suited to model comparison and multimodal problems.
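A very rough sketch of that loop (with hypothetical prior_sample and log_likelihood callables, and naive rejection sampling to draw the replacement point, which real implementations replace with something much smarter):

```python
import numpy as np

def nested_sampling(log_likelihood, prior_sample, n_live=100, n_iter=500, seed=0):
    """Toy nested sampling: accumulate the evidence Z = P(D) by peeling off
    the lowest-likelihood live point and shrinking the prior volume."""
    rng = np.random.default_rng(seed)
    live = [prior_sample(rng) for _ in range(n_live)]
    live_logL = np.array([log_likelihood(p) for p in live])
    log_Z = -np.inf          # running log-evidence
    log_X_prev = 0.0         # log of remaining prior volume (starts at 1)

    for i in range(n_iter):
        worst = int(np.argmin(live_logL))
        log_X = -(i + 1) / n_live                 # expected shrinkage per step
        # Weight of the discarded point: the sliver of prior volume it represents.
        log_w = log_X_prev + np.log1p(-np.exp(log_X - log_X_prev))
        log_Z = np.logaddexp(log_Z, live_logL[worst] + log_w)
        log_X_prev = log_X

        # Replace the worst point with a new prior draw above the likelihood
        # threshold (naive rejection sampling; fine only for toy problems).
        while True:
            candidate = prior_sample(rng)
            cand_logL = log_likelihood(candidate)
            if cand_logL > live_logL[worst]:
                break
        live[worst] = candidate
        live_logL[worst] = cand_logL

    # The contribution of the final live points is ignored here for brevity.
    return log_Z
```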
➡️ Explore Nested Sampling
Tags: nested sampling, evidence estimation, model comparison, multimodal posteriors


🔹 MCMC Parallel Tempering

A variant of MCMC that runs several chains in parallel at different "temperatures": hotter chains explore a flattened version of the posterior and can hop between modes, while the coldest chain targets the true posterior. Periodically proposing to swap states between neighbouring temperatures lets good states filter down, helping the cold chain escape local modes in multimodal problems.
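The characteristic ingredient is the swap move between adjacent temperature levels. A sketch of that acceptance rule (assuming chain i targets the tempered density with log density betas[i] * log_posterior(theta), with betas[0] = 1 for the cold chain):

```python
import numpy as np

def propose_swaps(log_posterior, states, betas, rng):
    """Propose exchanging states between adjacent temperature levels.

    states[i] is the current position of chain i and betas[i] its inverse
    temperature; betas[0] = 1 is the "cold" chain targeting the true posterior.
    """
    log_p = np.array([log_posterior(s) for s in states])
    for i in range(len(states) - 1):
        # Detailed-balance acceptance ratio for swapping chains i and i+1.
        log_alpha = (betas[i] - betas[i + 1]) * (log_p[i + 1] - log_p[i])
        if np.log(rng.uniform()) < log_alpha:
            states[i], states[i + 1] = states[i + 1], states[i]
            log_p[i], log_p[i + 1] = log_p[i + 1], log_p[i]
    return states
```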
➡️ Explore MCMC Parallel Tempering
Tags: parallel tempering, replica exchange, multimodal posteriors, MCMC


🔧 Implementation Notes

All samplers are implemented with:


📚 Further Reading