Markov jump process Definition

noun

(mathematics) A time-dependent random variable that starts in an initial state and remains there for a random length of time, after which it jumps to another random state, stays there for another random time, and so on.

Wiktionary
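
Concretely, such a process can be simulated by alternating exponentially distributed holding times with random jumps. Below is a minimal Python sketch; the states, exit rates, and jump probabilities are illustrative assumptions, not part of the entry.

```python
import random

# Hypothetical three-state example (values chosen for illustration).
# Each state i has an exponential holding-time rate RATES[i]; on
# leaving, the next state is drawn from the jump kernel JUMP_PROBS[i].
RATES = {0: 1.0, 1: 2.0, 2: 0.5}
JUMP_PROBS = {
    0: [(1, 0.7), (2, 0.3)],
    1: [(0, 0.5), (2, 0.5)],
    2: [(0, 1.0)],
}

def simulate(initial_state, t_max):
    """Return the (time, state) trajectory up to time t_max."""
    t, state = 0.0, initial_state
    path = [(t, state)]
    while True:
        # Stay in the current state for an exponentially distributed time.
        t += random.expovariate(RATES[state])
        if t >= t_max:
            break
        # Then transition to a random next state per the jump kernel.
        targets, weights = zip(*JUMP_PROBS[state])
        state = random.choices(targets, weights=weights)[0]
        path.append((t, state))
    return path

print(simulate(initial_state=0, t_max=10.0))
```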

Other Word Forms of Markov-jump-process

Noun

Singular:
Markov jump process
Plural:
Markov jump processes
