a chain of random events in which only the present state influences the next state, as in a genetic code
(plural Markov processes)
- (probability theory) A stochastic process in which the conditional probability distribution of future states depends only on the present state, independent of the path of past states.
- Markov property
- Markov chain
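The defining property above can be illustrated with a minimal sketch: a two-state Markov chain where the next state is sampled using only the current state. The state names and transition probabilities here are hypothetical examples, not part of the definition.

```python
import random

# Hypothetical two-state weather chain: each entry maps a state to
# (next_state, probability) pairs. The next state depends only on the
# current state, which is exactly the Markov property.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Sample the next state using only the current state."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def walk(state, n, seed=0):
    """Generate a path of n transitions starting from `state`."""
    rng = random.Random(seed)
    path = [state]
    for _ in range(n):
        state = step(state, rng)
        path.append(state)
    return path
```

Note that `walk` never consults the path history when sampling: past states are recorded but play no role in choosing the next one.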