\documentclass[a4paper,12pt]{article}
\newcommand{\ds}{\displaystyle}
\newcommand{\pl}{\partial}
\parindent=0pt
\begin{document}


{\bf Question}

Determine whether or not the Markov chain with the following
probability transition matrix, $P,$ is ergodic, i.e.\ possesses a
limiting distribution independent of the initial distribution.
$$P = \left( \begin{array}{cccc} \frac{1}{2} & \frac{1}{2} & 0 & 0
\\ \frac{1}{2} & \frac{1}{2} & 0 & 0 \\ 0 & 0 & \frac{1}{2} &
\frac{1}{2} \\ 0 & 0 & \frac{1}{2} & \frac{1}{2} \end{array}
\right)$$ What conclusions do you draw?


\vspace{.25in}

{\bf Answer}

The Markov chain is not irreducible.

The state space partitions into two closed irreducible sets of
states, $\{1,2\}$ and $\{3,4\},$ so the limiting distribution will
depend on the initial distribution.  The sub-chains on $\{1,2\}$
and $\{3,4\}$ are each irreducible, and they are clearly aperiodic
(each state has $p_{ii}=\frac{1}{2}>0$), so every state is
ergodic.  This example therefore shows that a finite aperiodic
Markov chain which is not irreducible need not be ergodic, even if
each of its states is ergodic.  Moreover, the stationary
distribution is not unique: the following is stationary for every
$p$ with $0 \le p \le \frac{1}{2}:$



$$\left( \begin{array}{c} p \\ p \\ \frac{1}{2} - p \\ \frac{1}{2}
- p \end{array} \right)$$
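The claims above can be checked numerically; the following sketch (not part of the original answer) multiplies row vectors by $P$ using exact rational arithmetic, showing that the limit depends on the starting state and that the one-parameter family above is indeed stationary.

```python
# Numerical check: the limiting distribution of this reducible chain
# depends on the initial state, and (p, p, 1/2 - p, 1/2 - p) is
# stationary for any p in [0, 1/2].
from fractions import Fraction

half = Fraction(1, 2)
P = [[half, half, 0, 0],
     [half, half, 0, 0],
     [0, 0, half, half],
     [0, 0, half, half]]

def vec_mat(v, M):
    """Row vector v times matrix M."""
    return [sum(v[i] * M[i][j] for i in range(4)) for j in range(4)]

# Start in state 1: after one step the chain is already at the
# stationary distribution of the closed class {1, 2}.
limit_from_1 = vec_mat([1, 0, 0, 0], P)   # [1/2, 1/2, 0, 0]

# Start in state 3: all mass stays on the closed class {3, 4}.
limit_from_3 = vec_mat([0, 0, 1, 0], P)   # [0, 0, 1/2, 1/2]

# The two limits differ, so there is no limit independent of the
# initial distribution.
assert limit_from_1 != limit_from_3

# Stationarity of (p, p, 1/2 - p, 1/2 - p) for an illustrative p.
p = Fraction(1, 8)
pi = [p, p, half - p, half - p]
assert vec_mat(pi, P) == pi
```

Because each $2\times 2$ block is doubly stochastic, one step from any pure initial state already reaches the sub-chain's stationary distribution, which makes the dependence on the starting state especially transparent.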



\end{document}
