Winnie's Study Notes

V1

2022/08/26

# MCMC sampling introduction

## 1 Motivation

### 1.1 Monte Carlo (MC)

The Monte Carlo approximation of a posterior expectation is

$$\mathbb{E}_{\pi}[f(\theta)] = \int f(\theta)\,\pi(\theta \mid x)\,d\theta \approx \frac{1}{N}\sum_{i=1}^{N} f\big(\theta^{(i)}\big),$$

where $\theta^{(1)},\dots,\theta^{(N)}$ is an iid sample from the posterior distribution $\pi(\theta \mid x)$.

However, obtaining an iid sample is difficult, and it requires the complete form of the posterior distribution to be known.
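As a minimal sketch of plain Monte Carlo averaging (an illustrative example of mine, not from the text), take a standard normal target, for which $\mathbb{E}[\theta^2] = 1$:

```python
import random

def mc_estimate(f, sampler, n=100_000):
    """Plain Monte Carlo: average f over iid draws from the target."""
    return sum(f(sampler()) for _ in range(n)) / n

# Illustrative target: theta ~ N(0, 1), for which E[theta^2] = 1.
random.seed(0)
est = mc_estimate(lambda t: t * t, lambda: random.gauss(0.0, 1.0))
```

With $N = 10^5$ iid draws, `est` lands close to the true value 1; the point of the next section is that such iid draws are usually unavailable.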

### 1.2 MCMC (Markov Chain MC)

MCMC generates a random sequence $\{\theta^{(t)}\}$ satisfying the Markov property, which is ergodic and whose limiting distribution is $\pi(\theta \mid x)$. Based on the Markov chain, we obtain "not independent" samples from $\pi(\theta \mid x)$ that have the same effect as an iid sample.
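A classic way to build such a chain is random-walk Metropolis; the sketch below is my own illustrative implementation (target, step size, and chain length are all made up), using only that the target's log-density is known up to a constant:

```python
import math
import random

def metropolis(log_target, x0, step=1.0, n=50_000, seed=1):
    """Random-walk Metropolis: a Markov chain whose limiting
    distribution is the target density (known up to a constant)."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n):
        y = x + rng.gauss(0.0, step)               # symmetric proposal
        log_alpha = log_target(y) - log_target(x)  # log acceptance ratio
        if log_alpha >= 0 or rng.random() < math.exp(log_alpha):
            x = y                                  # accept; otherwise keep x
        samples.append(x)
    return samples

# Illustrative target: standard normal, log-density -x^2/2 up to a constant.
chain = metropolis(lambda t: -0.5 * t * t, x0=5.0)
mean = sum(chain) / len(chain)
```

The samples are autocorrelated, not iid, yet their time averages still recover posterior expectations; this is exactly what the theory in the following sections justifies.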

## 2 Basic properties

The Markov chain should satisfy: it has a stationary distribution and is ergodic (or irreducible and aperiodic).

• Markov property

$$P(X_{t+1} \in A \mid X_0, X_1, \dots, X_t) = P(X_{t+1} \in A \mid X_t)$$

• Transition Probability (kernel)

$$K(x, A) = P(X_{t+1} \in A \mid X_t = x)$$

Homogeneous: the transition kernels $K$ are the same for all $t$.

• Marginal Distribution

$$\pi_t(A) = P(X_t \in A), \qquad \pi_{t+1}(A) = \int K(x, A)\,\pi_t(dx)$$
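For a finite state space the kernel is a matrix and the marginal update is a vector–matrix product. A toy two-state homogeneous chain (the numbers are made up for illustration) shows $\pi_t$ settling down as $t$ grows:

```python
# Toy homogeneous two-state chain (illustrative numbers).
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(pi, P):
    """One marginal update: pi_{t+1}[j] = sum_i pi_t[i] * P[i][j]."""
    return [sum(pi[i] * P[i][j] for i in range(len(P))) for j in range(len(P))]

pi = [1.0, 0.0]            # initial distribution: start in state 0
for _ in range(100):
    pi = step(pi, P)       # pi approaches the stationary distribution (5/6, 1/6)
```

After enough steps the marginal no longer changes, which is the stationarity notion defined next.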

## 3. Stationary (Invariant) Chains

A $\sigma$-finite measure $\pi$ is invariant for the transition kernel $K(\cdot,\cdot)$ (and for the associated chain) if

$$\pi(A) = \int K(x, A)\,\pi(dx) \quad \text{for every measurable set } A.$$

The invariant distribution is also referred to as stationary if $\pi$ is a probability measure, since $X_0 \sim \pi$ implies that $X_t \sim \pi$ for every $t$; thus, the chain is stationary in distribution.
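In the discrete case the invariance condition reads $\pi_j = \sum_i \pi_i P_{ij}$, i.e. $\pi = \pi P$, which is easy to check directly for a toy two-state kernel (made-up numbers, for illustration only):

```python
# Invariance check pi = pi P for a toy two-state kernel (made-up numbers).
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = [5 / 6, 1 / 6]        # candidate stationary distribution

pi_next = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]
invariant = all(abs(a - b) < 1e-12 for a, b in zip(pi, pi_next))
```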

## 4. Detailed balance condition

### Detailed balance condition

A Markov chain with transition kernel density $K$ satisfies the detailed balance condition (is reversible) if there exists a function $f$ satisfying

$$K(y, x)\,f(y) = K(x, y)\,f(x)$$

for every $(x, y)$.

Proof. Integrating both sides of the detailed balance condition over $y$ gives

$$\int K(y, x)\,f(y)\,dy = \int K(x, y)\,f(x)\,dy = f(x)\int K(x, y)\,dy = f(x),$$

so $f$ is invariant for $K$.

Remark
The detailed balance condition provides a sufficient but not necessary condition for $f$ to be a stationary measure associated with the transition kernel $K$.

### Theorem

Suppose that a Markov chain with transition function $K$ satisfies the detailed balance condition with $f$ a probability density function. Then:
(1) The density is the invariant density of the chain.
(2) The chain is reversible.

Reversible: when $X_t \sim f$, the pair $(X_t, X_{t+1})$ has the same distribution as $(X_{t+1}, X_t)$; the chain looks the same run forwards or backwards.
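The theorem can be sketched concretely on a hypothetical birth-death chain (matrix and density below are illustrative numbers of mine): checking detailed balance entry by entry also confirms part (1), invariance of $f$:

```python
# Hypothetical birth-death chain on {0, 1, 2} (illustrative numbers).
P = [[0.50, 0.50, 0.00],
     [0.25, 0.50, 0.25],
     [0.00, 0.50, 0.50]]
f = [0.25, 0.50, 0.25]     # candidate reversible density

# Detailed balance: f[i] P[i][j] == f[j] P[j][i] for every pair (i, j).
balanced = all(abs(f[i] * P[i][j] - f[j] * P[j][i]) < 1e-12
               for i in range(3) for j in range(3))

# Theorem part (1): detailed balance implies f P == f (invariance).
f_next = [sum(f[i] * P[i][j] for i in range(3)) for j in range(3)]
```

Note the converse direction fails in general: a matrix can satisfy $\pi = \pi P$ without satisfying detailed balance, matching the remark above.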

## 5. Ergodicity

Ergodicity is needed to prove convergence (independence of the initial conditions). The relevant notions:

• positive recurrent

State $i$ is positive recurrent if $\mathbb{E}[\tau_i] < \infty$, where $\tau_i = \min\{t \ge 1 : X_t = i \mid X_0 = i\}$ is the first time the chain returns to state $i$.
• aperiodic

A state $i$ has period

$$d(i) = \gcd\{t \ge 1 : P^t(i, i) > 0\},$$

where $\gcd$ is the greatest common divisor. Aperiodic means the period of every state is 1.
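The period definition can be computed directly for small chains; the helper below (my own illustrative sketch, truncating the gcd to return times up to `max_t`) contrasts a deterministic 2-cycle with a lazy, aperiodic chain:

```python
from functools import reduce
from math import gcd

def period(P, i, max_t=50):
    """Period of state i: gcd of all t <= max_t with P^t(i, i) > 0."""
    n = len(P)
    M = [row[:] for row in P]                      # M = P^1
    return_times = []
    for t in range(1, max_t + 1):
        if t > 1:                                  # advance M to P^t
            M = [[sum(M[r][k] * P[k][j] for k in range(n)) for j in range(n)]
                 for r in range(n)]
        if M[i][i] > 0:
            return_times.append(t)
    return reduce(gcd, return_times) if return_times else 0

# Illustrative chains: a deterministic 2-cycle (period 2, hence periodic)
# and a lazy chain that can stay put (period 1, hence aperiodic).
cycle = [[0.0, 1.0],
         [1.0, 0.0]]
lazy  = [[0.5, 0.5],
         [0.5, 0.5]]
```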

## 6. Irreducible

For any states $i, j$, the probability of this chain going from state $i$ to state $j$ is positive. Namely, for some $t > 0$,

$$P^t(i, j) > 0.$$

### The law of large numbers for Markov chains

Suppose $\{X_t\}$ is a Markov chain with countable state space $S$ and transition probability matrix $P$, and suppose that it is irreducible and has stationary distribution $\pi$. Then for any bounded function $f : S \to \mathbb{R}$ and any initial distribution, we have

$$\frac{1}{N}\sum_{t=1}^{N} f(X_t) \xrightarrow{\text{a.s.}} \sum_{i \in S} f(i)\,\pi_i.$$
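This law of large numbers can be checked empirically: simulate one dependent sample path, take the time average of $f$, and compare it with the $\pi$-weighted average (toy two-state chain with made-up numbers; its stationary distribution is $(5/6, 1/6)$):

```python
import random

# Toy irreducible two-state chain with stationary distribution (5/6, 1/6).
P = [[0.9, 0.1],
     [0.5, 0.5]]
f = [0.0, 1.0]             # indicator of state 1, so the limit is pi_1 = 1/6

rng = random.Random(42)
x, total, N = 0, 0.0, 200_000
for _ in range(N):
    x = 0 if rng.random() < P[x][0] else 1   # one transition step
    total += f[x]
time_avg = total / N       # time average over one dependent sample path
```

The samples along the path are correlated, yet the time average still converges to $\pi_1 = 1/6$; this is exactly why MCMC output can replace iid draws in Monte Carlo estimates.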

Remark.

• A given Markov chain may have more than one invariant distribution.
• Stationary + ergodicity $\Rightarrow$ equilibrium distribution (unique)
• Stationary + irreducible + aperiodic $\Rightarrow$ unique. (An ergodic Markov chain is irreducible.)