vimwiki/math/markov_chain.wiki

2021-11-05 19:45:01 +00:00
= Markov Chains =
Markov chains are a way of representing states and the probability of other
states occurring, given the current state. A Markov chain is a type of
state diagram.
== Explanation ==
A Markov chain is often represented as a directed graph: each node is a
state, and each edge is labelled with the probability of moving from its
source state to its target state. The outgoing probabilities of every state
sum to 1.
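The idea above can be sketched in code. This is a minimal illustration, not a library implementation: the states and probabilities (a hypothetical two-state weather model) are made up for the example, and the chain is stored as a dictionary mapping each state to its outgoing transition probabilities.

```python
import random

# Hypothetical two-state weather chain (illustrative probabilities).
# Each state maps to {next_state: probability}; each row sums to 1.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state given the current one."""
    states = list(transitions[state])
    weights = [transitions[state][s] for s in states]
    return random.choices(states, weights=weights)[0]

def walk(state, n):
    """Simulate n steps of the chain starting from `state`."""
    path = [state]
    for _ in range(n):
        state = step(state)
        path.append(state)
    return path
```

Because the next state depends only on the current state (the Markov property), `step` needs no memory of how the chain arrived at `state`.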