In a Markov chain you have a transition matrix and an initial state. It is easy to work out the probability of the state on the nth step, but what about a sequence of events?
The first column is for "wet yesterday": .75 is the probability a day is wet given it was wet yesterday (so .25 is the probability it is dry). The second column is for "dry yesterday": .7 is the probability a day is dry given it was dry yesterday (so .3 is the probability it is wet). That gives the transition matrix

T = [ .75  .3 ]
    [ .25  .7 ]
What is the probability that at least 2 of the next 3 days are wet, given it was wet yesterday? How do I work this out using a Markov chain?
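One way to set the calculation up (a sketch, not a full answer): because "at least 2 wet out of 3" depends on the whole path, not just the final state, you can enumerate all 8 possible 3-day sequences, multiply the transition probabilities along each path starting from the "wet yesterday" state, and sum the paths with 2 or more wet days. The dictionary `T` below just encodes the matrix from the question; the state labels `'W'`/`'D'` are my own naming.

```python
from itertools import product

# Transition probabilities from the question:
# P(wet today | wet yesterday) = .75, P(wet today | dry yesterday) = .3
T = {'W': {'W': 0.75, 'D': 0.25},
     'D': {'W': 0.30, 'D': 0.70}}

total = 0.0
for seq in product('WD', repeat=3):   # all 8 possible 3-day sequences
    if seq.count('W') < 2:
        continue                      # keep only sequences with >= 2 wet days
    prob, prev = 1.0, 'W'             # condition on yesterday being wet
    for day in seq:
        prob *= T[prev][day]          # multiply transition probs along the path
        prev = day
    total += prob

print(total)  # 0.675
```

Only 4 of the 8 sequences qualify (WWW, WWD, WDW, DWW), and their probabilities sum to .675. Brute-force enumeration is fine for 3 days; for longer horizons you would track the joint distribution of (current state, wet-day count) step by step instead.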