
In the literature, a Markov chain (which can be either in continuous or discrete time) is a stochastic process that evolves in a finite or countable state space. A Markov process, on the other hand, is usually defined as a stochastic process with a continuous state space.

However, in all papers and books I have read, when describing the Metropolis–Hastings (or more generally MCMC) algorithm, it is stated that the goal is to construct a Markov chain whose stationary distribution is the target distribution. The proposal of a new state is drawn from a proposal distribution, typically a normal distribution centered at the current state. In this case, since the states lie in a continuous space, I would be inclined to call it a Markov process rather than a chain.
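For concreteness, here is a minimal sketch of that proposal step, assuming a Gaussian random-walk proposal; the variable names and the scale value are illustrative, not taken from any particular reference:

```python
import numpy as np

rng = np.random.default_rng(0)

current_state = 0.0    # a point in a continuous state space
proposal_scale = 1.0   # assumed standard deviation of the random-walk proposal

# draw the proposed state from a normal centered at the current state
proposed_state = rng.normal(loc=current_state, scale=proposal_scale)
```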

So my question is: why is it called a “chain” and not a “process”?


1 Answer


Depending on the context, "chain" and "process" can be used interchangeably. Metropolis–Hastings is a step-wise algorithm, so even though it may sample a continuous state space, it is a chain because the previous configuration affects the next one; it has only single-step memory, so to speak, in its naive form. "Process" usually implies serial correlation more generally, whether in continuous or discrete time.
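For illustration, here is a minimal random-walk Metropolis–Hastings sketch in Python, assuming a standard normal target; the function and parameter names are placeholders, not from any specific library. Each new state is generated from the previous state alone, which is the single-step memory described above:

```python
import numpy as np

def metropolis_hastings(log_target, n_samples=10_000, x0=0.0,
                        proposal_scale=1.0, seed=0):
    """Random-walk Metropolis-Hastings on a continuous state space."""
    rng = np.random.default_rng(seed)
    chain = np.empty(n_samples)
    x = x0
    for i in range(n_samples):
        # propose from a normal centered at the current state (symmetric proposal)
        y = rng.normal(loc=x, scale=proposal_scale)
        # acceptance step; the proposal density cancels because it is symmetric
        if np.log(rng.uniform()) < log_target(y) - log_target(x):
            x = y            # accept: move to the proposed state
        chain[i] = x         # otherwise stay at the current state
    return chain

# example: target is a standard normal, log density up to an additive constant
samples = metropolis_hastings(lambda x: -0.5 * x**2)
print(samples.mean(), samples.std())
```

The sequence of stored states is the "chain": each entry depends only on its predecessor, even though every state is a point in a continuous space.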
