EE 512: Consider a Markov Chain

Consider a Markov chain {X_n, n >= 0} with P_NN = 1 (instead of the transition shown in the attached picture), where P_NN is the one-step transition probability, that is, P_NN = P(X_1 = N | X_0 = N); in other words, state N is absorbing. Let P(i) denote the probability that this chain eventually enters state N given that it starts in state i. Show that {P(X_n), n >= 0} is a martingale.
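The key step of the proof is first-step analysis: conditioning on the first transition gives P(i) = sum_j P_ij P(j), which is exactly the statement E[P(X_{n+1}) | X_n = i] = P(i), i.e. the martingale property. The sketch below checks this identity numerically. Since the attached picture is unavailable, it assumes a hypothetical gambler's-ruin-style chain on states {0, 1, ..., N} in which state 0 is also absorbing, so that P(i) is nontrivial; the specific numbers (N = 4, up-probability 0.4) are illustrative assumptions, not part of the problem.

```python
import numpy as np

# Assumed example chain (not from the problem): gambler's ruin on {0,...,N}
# with N = 4, up-probability p = 0.4, and both endpoints absorbing.
N = 4
p = 0.4
T = np.zeros((N + 1, N + 1))   # one-step transition matrix T[i, j] = P_ij
T[0, 0] = 1.0                  # assumed second absorbing state at 0
T[N, N] = 1.0                  # P_NN = 1, as the problem specifies
for i in range(1, N):
    T[i, i + 1] = p            # step up with probability p
    T[i, i - 1] = 1 - p        # step down with probability 1 - p

# First-step analysis: P(i) = sum_j P_ij P(j), with P(0) = 0 and P(N) = 1.
# Solve the resulting linear system for the transient states 1..N-1.
Q = T[1:N, 1:N]                # transient-to-transient block
r = T[1:N, N]                  # one-step probability of jumping to N
h_transient = np.linalg.solve(np.eye(N - 1) - Q, r)
h = np.concatenate(([0.0], h_transient, [1.0]))   # h[i] = P(i)

# Martingale check: E[P(X_{n+1}) | X_n = i] = (T @ h)[i] must equal P(i).
assert np.allclose(T @ h, h)
print(np.round(h, 5))
```

Because h is harmonic for T (T @ h = h), the tower property gives E[P(X_{n+1}) | X_0, ..., X_n] = P(X_n), which is the martingale claim; the proof in general just writes out this conditional expectation using the Markov property.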
