Markov chain

noun

(probability theory) A discrete-time stochastic process with the Markov property: the probability of each next state depends only on the current state, not on the sequence of states that preceded it.
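
A minimal sketch of the idea in Python, using an invented two-state weather chain (the states and transition probabilities are hypothetical, chosen only for illustration):

```python
import random

# Hypothetical transition probabilities; each row depends only on the
# current state, which is exactly the Markov property.
transitions = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state):
    """Sample the next state from the current state's distribution."""
    states, probs = zip(*transitions[state])
    return random.choices(states, weights=probs)[0]

# Simulate a short walk through the chain.
state = "sunny"
walk = [state]
for _ in range(10):
    state = step(state)
    walk.append(state)
print(" -> ".join(walk))
```

Note that `step` receives only the current state; the walk's earlier history is never consulted, which is what makes the process a Markov chain.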