Q:

A Markov chain has the transition matrix shown below:

[tex]P=\begin{pmatrix}0.2&0.8\\0.6&0.4\end{pmatrix}[/tex]

(1) If, on the first observation, the system is in state 1, what is the probability that it is in state 1 on the next observation?
(2) If, on the first observation, the system is in state 1, what state is the system most likely to occupy on the next observation? (If there is more than one such state, give the first one.)
(3) If, on the first observation, the system is in state 1, what is the probability that the system is in state 1 on the third observation?
(4) If, on the first observation, the system is in state 1, what state is the system most likely to occupy on the third observation? (If there is more than one such state, give the first one.)

Accepted Solution

A:
The transition matrix gives the one-step probabilities:

[tex]\begin{align}P&=\begin{pmatrix}0.2&0.8\\0.6&0.4\end{pmatrix}\\P_{1,1}&=0.2\\P_{1,2}&=0.8\\P_{2,1}&=0.6\\P_{2,2}&=0.4\end{align}[/tex]

(1) Since [tex]P_{1,1}=0.2[/tex], the probability that the system is in state 1 on the next observation is 0.2.

(2) Since [tex]P_{1,2}=0.8>P_{1,1}=0.2[/tex], the system is most likely to occupy state 2 on the next observation, because that transition has the higher probability.

(3) The third observation is two steps after the first, so we raise the transition matrix to the power n − 1 = 2:

[tex]\begin{align}P^2&=\begin{pmatrix}0.2&0.8\\0.6&0.4\end{pmatrix}^2=\begin{pmatrix}0.2\cdot0.2+0.8\cdot0.6&0.2\cdot0.8+0.8\cdot0.4\\0.6\cdot0.2+0.4\cdot0.6&0.6\cdot0.8+0.4\cdot0.4\end{pmatrix}=\begin{pmatrix}0.52&0.48\\0.36&0.64\end{pmatrix}\end{align}[/tex]

So [tex](P^2)_{1,1}=0.52[/tex], which is the probability of being in state 1 on the third observation.

(4) By the same reasoning as in (2), since [tex](P^2)_{1,1}=0.52>(P^2)_{1,2}=0.48[/tex], the system is most likely to occupy state 1 on the third observation.
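As a quick numerical check (not part of the original solution), here is a minimal Python sketch using NumPy that reproduces these values; the variable names are illustrative, and the `+ 1` converts NumPy's 0-based indices back to state labels 1 and 2.

```python
import numpy as np

# One-step transition matrix from the problem statement
P = np.array([[0.2, 0.8],
              [0.6, 0.4]])

# (1) P[0, 0] is the probability of going from state 1 to state 1 in one step
print("P(state 1 on 2nd observation):", P[0, 0])                    # 0.2

# (2) The most likely next state is the argmax of row 1 (0-indexed row 0);
#     np.argmax returns the first index in case of a tie, matching the question
print("Most likely state on 2nd observation:", np.argmax(P[0]) + 1)  # 2

# (3) The third observation is two steps ahead, so square the matrix
P2 = np.linalg.matrix_power(P, 2)
print("P^2 =\n", P2)                                                 # [[0.52 0.48] [0.36 0.64]]
print("P(state 1 on 3rd observation):", P2[0, 0])                    # 0.52

# (4) Most likely state two steps ahead, starting from state 1
print("Most likely state on 3rd observation:", np.argmax(P2[0]) + 1)  # 1
```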