Markov Chain Definition

noun
(probability theory) A discrete-time stochastic process with the Markov property.
Wiktionary
Synonyms:
  • Markoff chain
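
To make the definition concrete, here is a minimal Python sketch (illustrative, not from Wiktionary) of a two-state weather chain: at each discrete time step, the next state is drawn from a distribution that depends only on the current state, which is the Markov property. The state names and probabilities are assumptions chosen for the example.

    import random

    # Transition probabilities: each state maps to a distribution
    # over possible next states (illustrative values).
    TRANSITIONS = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def step(state):
        """Draw the next state using only the current state's distribution."""
        probs = TRANSITIONS[state]
        return random.choices(list(probs), weights=list(probs.values()))[0]

    def simulate(start, n):
        """Run the chain for n discrete time steps."""
        states = [start]
        for _ in range(n):
            states.append(step(states[-1]))
        return states

    print(simulate("sunny", 10))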

Other Word Forms of Markov Chain

Noun

Singular:
Markov chain
Plural:
Markov chains
