A *Markov chain* is a probabilistic model describing a system that changes from state to state,
in which the probability of the system being in a certain state at a certain time step depends only on the
state at the preceding time step. The probability that $j$ is the next state of the chain, given that the
current state is $i$, is called the *transition probability from $i$ to $j$*.
The matrix below gives the transition probabilities. The entry in row $i$, column $j$ is the transition
probability from $i$ to $j$. Note that the entries in each row necessarily add up to 1.
The graphical representation below shows the states, and the possible state transitions.
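To make the process concrete, here is a minimal Python sketch of a Markov chain simulation. The three-state transition matrix `P` is illustrative only (it is not one of the examples on this page); each row of `P` sums to 1, and `P[i][j]` is the transition probability from state $i$ to state $j$.

```python
import random

# Hypothetical 3-state transition matrix; row i lists the
# probabilities of moving from state i to states 0, 1, 2.
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
]

def step(state, P, rng):
    """Sample the next state: the distribution depends only on `state`."""
    return rng.choices(range(len(P[state])), weights=P[state])[0]

def run_chain(start, n_steps, P, seed=0):
    """Simulate n_steps transitions starting from `start`."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n_steps):
        states.append(step(states[-1], P, rng))
    return states

print(run_chain(0, 10, P))
```

Because each next state is sampled using only the current state's row of the matrix, the simulation exhibits exactly the memoryless property described above.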
Click “Run” to start the Markov chain. Click “Examples” in the top menu to choose
a different example, or change the probabilities yourself to experiment!