Markov process


Definition: Meaning of Markov process in the English to English dictionary.

Pronunciation: /ˈprəʊsɛs/

  • noun
Word Forms:
  Singular: Markov process
  Plural: Markov processes
  1. a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
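
    In standard probability notation this "memoryless" condition can be written out explicitly; the following is an illustrative sketch (not part of the dictionary entry), where X_0, X_1, X_2, … denotes the states of a discrete-time process:

      % Markov property: the distribution of the next state depends only on
      % the present state, not on the path taken to reach it.
      \[
        \Pr(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0)
        = \Pr(X_{n+1} = x \mid X_n = x_n)
      \]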