Markov chain
English
Noun
Markov chain (plural Markov chains)
- (probability theory) A discrete-time stochastic process with the Markov property (stated formally in the sketch below).
- 2004 July 27, F. Keith Barker et al., “Phylogeny and diversification of the largest avian radiation”, in PNAS, page 11040, column 2:
- The probability density of the Bayesian posterior was estimated by Metropolis-coupled Markov chain Monte Carlo, with multiple incrementally heated chains.
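A minimal LaTeX sketch of the Markov property referenced in the definition; the process name $X$ and the states $x_0, \ldots, x_n, x$ are illustrative symbols, not part of the entry:

$$P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)$$

That is, the distribution of the next state depends only on the current state, not on the earlier history of the process.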
Hypernyms
- stochastic process
Hyponyms
- discrete-time Markov chain