= Markov Chains =
Markov chains are a way of representing states and the probability of other
states occurring when a given state is present. A Markov chain is a type of
state diagram.
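The idea can be sketched in code: a minimal simulation where each state maps to the probabilities of the states that can follow it. The weather states and transition probabilities below are hypothetical, chosen purely for illustration.

```python
import random

# Hypothetical transition probabilities: each state maps to the
# probability of each possible next state (rows sum to 1).
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current):
    """Pick the next state weighted by the current state's probabilities."""
    states = list(transitions[current])
    weights = [transitions[current][s] for s in states]
    return random.choices(states, weights=weights)[0]

# Walk the chain a few steps starting from "sunny".
state = "sunny"
chain = [state]
for _ in range(5):
    state = next_state(state)
    chain.append(state)
print(chain)
```

Each step depends only on the current state, not on how the chain got there, which is the defining property of a Markov chain.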
== Explanation ==
A Markov chain is often represented as a