- 马尔可夫链 — Markov chain (English)
- A sequence of random variables (Xn) satisfying the Markov property: the distribution of Xn+1 (the future) depends only on Xn (the present), not on Xk for k <= n-1 (the past).
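The Markov property can be illustrated with a minimal simulation sketch: the next-state function below looks only at the current state, never at earlier history. The two weather states and their transition probabilities are invented for illustration.

```python
import random

# Hypothetical two-state chain: TRANSITION[s] maps each successor
# state to its probability, given current state s.
TRANSITION = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    # Markov property: the next state is drawn using only the
    # current state -- no earlier states are consulted.
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITION[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    # Generate a path X0, X1, ..., Xn from the chain.
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(n):
        state = step(state, rng)
        path.append(state)
    return path

print(simulate("sunny", 5))
```

Because the transition rule ignores the past, the whole chain is determined by the initial state and the one-step transition probabilities.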