Markov chain definition

noun

(probability theory) A discrete-time stochastic process with the Markov property.
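
As a rough illustration of the Markov property (notation assumed, not part of the entry): for a chain with states x_0, ..., x_{n+1}, the next state depends only on the present state, not on the earlier history:

P(X_{n+1} = x_{n+1} | X_n = x_n, ..., X_0 = x_0) = P(X_{n+1} = x_{n+1} | X_n = x_n)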

Other Word Forms

Noun

Singular:
Markov chain
Plural:
Markov chains