## Markov process

[mär′kôf]

a chain of random events in which the next state depends only on the present state, not on the earlier history, as in a genetic code

## Markov process

Noun

(*plural* Markov processes)

- (probability theory) A stochastic process in which the conditional probability distribution of future states depends only on the present state and is conditionally independent of the path of past states.

Related terms:

- Markov property
- Markov chain
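
The defining property above can be illustrated with a minimal simulation. The sketch below (states and transition probabilities are illustrative assumptions, not from the source) samples each next state using only the current state, never the earlier path:

```python
import random

# Illustrative two-state weather chain: each row gives the conditional
# distribution of the next state given the current state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state using only the current state (Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Generate a path of n transitions starting from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5))
```

Because `step` looks only at `path[-1]`, the process is Markov by construction: the full history is recorded but never consulted.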