What is Hamiltonian Monte Carlo?
Hamiltonian Monte Carlo (HMC) is a Markov Chain Monte Carlo (MCMC) sampling technique used to sample from complex, high-dimensional probability distributions, such as those encountered in Bayesian inference. HMC relies on concepts from classical mechanics, specifically Hamiltonian dynamics, to explore the target distribution more efficiently than traditional MCMC methods like the Metropolis-Hastings algorithm or the Gibbs sampler.
How does Hamiltonian Monte Carlo work?
Hamiltonian Monte Carlo works by introducing auxiliary momentum variables and turning the target distribution into a joint distribution over position and momentum. The sampler simulates Hamiltonian dynamics for this joint system, typically with the leapfrog integrator, which alternates gradient-based momentum updates with position updates; a Metropolis accept/reject step then corrects for the integrator's discretization error. The key insight is that Hamiltonian dynamics (approximately) conserves the joint density, so proposals can travel long distances through the target distribution while still being accepted with high probability, avoiding the slow random-walk behavior of simpler MCMC methods.
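The mechanics above can be sketched in a few lines of NumPy. This is a minimal, illustrative single-chain HMC sampler for a one-dimensional standard normal target; the potential U(q), its gradient, the step size, and the number of leapfrog steps are all choices made for this demo, not part of any library API.

```python
import numpy as np

def U(q):
    return 0.5 * q**2          # negative log density of N(0, 1), up to a constant

def grad_U(q):
    return q                   # derivative of U with respect to q

def hmc_step(q, rng, step_size=0.1, n_steps=20):
    p = rng.normal()           # draw a fresh auxiliary momentum
    q_new, p_new = q, p

    # Leapfrog integration: half momentum step, alternating full
    # position/momentum steps, then a final half momentum step
    p_new -= 0.5 * step_size * grad_U(q_new)
    for _ in range(n_steps - 1):
        q_new += step_size * p_new
        p_new -= step_size * grad_U(q_new)
    q_new += step_size * p_new
    p_new -= 0.5 * step_size * grad_U(q_new)

    # Metropolis accept/reject on the total energy (the Hamiltonian)
    # corrects for the integrator's discretization error
    current_H = U(q) + 0.5 * p**2
    proposed_H = U(q_new) + 0.5 * p_new**2
    if rng.uniform() < np.exp(current_H - proposed_H):
        return q_new           # accept the proposal
    return q                   # reject: keep the current position

rng = np.random.default_rng(0)
samples = []
q = 0.0
for _ in range(5000):
    q = hmc_step(q, rng)
    samples.append(q)
# The sample mean and standard deviation should be close to 0 and 1
```

Because the leapfrog trajectory follows the gradient of the log density, each step can move far across the target while the accept/reject test keeps the chain exact, which is precisely what distinguishes HMC from a random-walk Metropolis sampler.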
Example of Hamiltonian Monte Carlo in Python:
To use Hamiltonian Monte Carlo in Python, you can use the popular PyMC3 library (since superseded by PyMC, though the API below follows PyMC3):
$ pip install pymc3
Here’s a simple example of using HMC for a Bayesian linear regression problem:
import numpy as np
import pymc3 as pm

# Generate synthetic data
np.random.seed(42)
x = np.random.randn(100)
y = 3 * x + 2 + np.random.normal(scale=1, size=100)

# Define the Bayesian model
with pm.Model() as model:
    alpha = pm.Normal("alpha", mu=0, sigma=10)
    beta = pm.Normal("beta", mu=0, sigma=10)
    sigma = pm.HalfNormal("sigma", sigma=1)
    mu = alpha + beta * x
    y_obs = pm.Normal("y_obs", mu=mu, sigma=sigma, observed=y)

    # Sample; pm.sample defaults to NUTS, an adaptive variant of HMC
    trace = pm.sample(2000, tune=1000, target_accept=0.9)

# Print the summary of the posterior distribution
print(pm.summary(trace).round(2))
In this example, we generate synthetic data for a linear relationship with noise and then use HMC to infer the posterior distribution of the model parameters.