- 마르코프 연쇄 — Markov chain (English)
- Sequence of random variables (X_n) satisfying the Markov property: X_{n+1} (the future) depends only on X_n (the present), not on X_k for k <= n-1 (the past).
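The Markov property above can be illustrated with a minimal simulation sketch. The two-state "weather" chain and its transition probabilities below are hypothetical, chosen only for illustration; note that each next state is drawn using only the current state's row, never the earlier history.

```python
import random

def simulate_markov_chain(transition, start, steps, seed=0):
    """Simulate a Markov chain: the next state is sampled using only the
    current state's row of the transition matrix (the Markov property)."""
    rng = random.Random(seed)
    states = list(transition.keys())
    x = start
    path = [x]
    for _ in range(steps):
        weights = [transition[x][s] for s in states]
        x = rng.choices(states, weights=weights)[0]
        path.append(x)
    return path

# Hypothetical two-state chain with illustrative transition probabilities.
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}
path = simulate_markov_chain(P, "sunny", 10)
```

Because the chain is memoryless, the same function works for any number of states: only the transition matrix changes.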