jagguy

asked on

Markov chain sequence

In a Markov chain you have a transition matrix and an initial state.

It is easy to work out the probability of the nth event, but what about a sequence of events?

In the 1st column, .75 is the probability that a day is wet today given it was wet yesterday.
In the 2nd column, .7 is the probability that a day is not wet today given it was not wet yesterday.

t = [ .75  .3 ]
    [ .25  .7 ]

What is the probability of at least 2 of the next 3 days being wet, given that it was wet yesterday?

Using a Markov chain, how do I do this?
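
For reference, here is a minimal sketch of the nth-event calculation mentioned above (an assumption on my part: Python with NumPy, states ordered [wet, not wet], and the matrix read column-by-column as in the question):

import numpy as np

# Transition matrix: column = yesterday's state, row = today's state,
# so today's distribution = T @ yesterday's distribution.
T = np.array([[0.75, 0.3],
              [0.25, 0.7]])

s0 = np.array([1.0, 0.0])   # initial state: it was wet yesterday

# Distribution n days later: apply T n times, i.e. T^n times s0.
n = 3
sn = np.linalg.matrix_power(T, n) @ s0
print(sn)   # [P(wet on day n), P(not wet on day n)]

This gives the marginal probability for a single day n; the question above is about whole sequences of days, which needs path probabilities instead.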
TommySzalapski

Start from the state where it was wet yesterday and work out all the possible sequences of states for the next 3 days (you should have 8 of them). Then find the paths that have 2 or more wet days and add up their probabilities. For example:
P(W|W)*P(W|W)            (first two days wet, third day either way)
+
P(D|W)*P(W|D)*P(W|W)     (dry, then wet, then wet)
+ ...
This is a three-level tree with eight possible outcomes.
You care about four of them.
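
A rough sketch of that enumeration, in case it helps (assuming Python; the probabilities are taken from the transition matrix in the question, and the loop mirrors the 8-leaf tree described above):

from itertools import product

# Transition probabilities P[(today, yesterday)] from the matrix in the question.
P = {('W', 'W'): 0.75, ('D', 'W'): 0.25,   # yesterday wet
     ('W', 'D'): 0.3,  ('D', 'D'): 0.7}    # yesterday dry (D = not wet)

total = 0.0
for path in product('WD', repeat=3):        # the 8 possible three-day sequences
    if path.count('W') >= 2:                # keep only the 4 paths with 2+ wet days
        prob = 1.0
        prev = 'W'                          # it was wet yesterday
        for day in path:
            prob *= P[(day, prev)]
            prev = day
        total += prob

print(total)   # probability of at least 2 wet days in the next 3

The loop just multiplies the transition probabilities along each branch of the tree and sums the branches that qualify, which is the same hand calculation sketched above.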
ASKER CERTIFIED SOLUTION
ozo
[The accepted solution text is only available to Experts Exchange members.]
jagguy

ASKER

OK, so what is the answer, and what does the s0 matrix look like as it goes along?
You know that we can't just solve the whole problem for you. This is an academic question.

What do you have for it so far?