Q: Probability Stochastic Markov (No Answer, 1 Comment)
Question  
Subject: Probability Stochastic Markov
Category: Science > Math
Asked by: mathhelpreq-ga
List Price: $15.00
Posted: 11 Mar 2005 13:34 PST
Expires: 17 Mar 2005 12:07 PST
Question ID: 492824
Let m_ij = E[R_j | X(0) = i].
Let P be the transition probability matrix of a finite-state regular Markov chain.
i) Use a first-step argument to establish that

m_ij = 1 + sum (over k != j) P_ik m_kj

ii) Multiply both sides of the preceding equation by pi_i and sum over i to obtain:

sum (over all i) pi_i m_ij = sum (over all i) pi_i
+ sum (over k != j) sum (over all i) pi_i P_ik m_kj

Simplify this to show that
pi_j m_jj = 1, i.e. pi_j = 1/m_jj

Please explain how to do this.  I know that m_ij is the expected
number of steps the process takes to go from state i to state j, and
since it requires at least one step to do this we get the leading 1;
the remaining terms are the expected number of additional steps needed
to reach j from wherever the first step lands.  But how do we show
this mathematically?
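
A sketch of the intended argument, assuming that the "piei" in the
question denotes the stationary distribution pi of P (so pi P = pi and
the entries of pi sum to 1) and that R_j is the first time n >= 1 at
which the chain is in state j:

\begin{align*}
% Part (i): condition on the first step X(1).  If X(1) = j, then R_j = 1;
% if X(1) = k != j, then R_j is 1 plus the time to reach j starting from k.
m_{ij} &= P_{ij}\cdot 1 + \sum_{k \ne j} P_{ik}\,\bigl(1 + m_{kj}\bigr)
        = 1 + \sum_{k \ne j} P_{ik}\, m_{kj} .
\\[4pt]
% Part (ii): multiply by \pi_i, sum over i, and use
% \sum_i \pi_i = 1 together with stationarity \sum_i \pi_i P_{ik} = \pi_k.
\sum_i \pi_i\, m_{ij}
       &= \sum_i \pi_i + \sum_{k \ne j} \Bigl(\sum_i \pi_i P_{ik}\Bigr) m_{kj}
        = 1 + \sum_{k \ne j} \pi_k\, m_{kj} .
\\[4pt]
% The left-hand side equals \sum_k \pi_k m_{kj}; subtracting
% \sum_{k \ne j} \pi_k m_{kj} from both sides leaves only the k = j term.
\pi_j\, m_{jj} &= 1
\qquad\text{i.e.}\qquad
\pi_j = \frac{1}{m_{jj}} .
\end{align*}

The cancellation in the last step is exactly why the sum in part (i)
excludes k = j: after subtracting, the only surviving term on the left
is pi_j m_jj.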
Answer  
There is no answer at this time.

Comments  
Subject: Re: Probability Stochastic Markov
From: mathtalk-ga on 13 Mar 2005 12:06 PST
 
I think the best way to "show this mathematically" is to express the
various facts using matrix arithmetic.  But first it may help to
clarify our thinking about the underlying ideas.

A finite state Markov chain is said to be regular if its probability
transition matrix P has a natural number power P^n whose entries are
all positive.  This implies that the states are "connected", i.e. that
it is always possible to get from any state to any other.  In
particular there are no "absorbing" states (a state which transitions
to itself with probability 1) unless there is actually only one state
(a trivial case usually excluded from further consideration).
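
For concreteness, here is a minimal sketch in Python/NumPy of this
regularity check, using a made-up 3-state matrix that is not part of
the original question:

import numpy as np

# A small 3-state transition matrix, invented purely for illustration.
# Some one-step transitions are impossible (e.g. state 0 to state 2),
# but P^2 already has all entries positive, so the chain is regular.
P = np.array([[0.5, 0.5, 0.0],
              [0.3, 0.4, 0.3],
              [0.0, 0.6, 0.4]])

def is_regular(P, max_power=50):
    """Return True if some power P^n (1 <= n <= max_power) is entrywise positive."""
    Q = np.eye(P.shape[0])
    for _ in range(max_power):
        Q = Q @ P                  # Q now holds the next power of P
        if np.all(Q > 0):
            return True
    return False

print(is_regular(P))               # prints True for this example

Testing powers up to a fixed cutoff is enough in practice; for an
n-state chain, Wielandt's bound says (n-1)^2 + 1 powers suffice.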

In these circumstances there is probability 1 of reaching state j in
the future, given that the present state is i.  As I read the problem
posed here, m_i,j represents the expected number of steps (given that
the present state is i) before reaching state j in the future.  This
can be a bit confusing to think about for the case i = j.  It still
makes sense to define m_i,i even when it is possible to go directly
from state i to itself, and its value is always greater than or
equal to 1.  The point to insist on is reaching state i "in the
future", as a transition from i to i would indeed count as one step
into the future.

regards, mathtalk-ga
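
To tie the comment back to the question numerically, here is a small
sketch (using the same kind of made-up 3-state chain as in the earlier
snippet; none of the numbers come from the original question) that
computes the stationary distribution pi, solves the first-step
equations for the mean hitting times m_kj, and checks that
pi_j = 1/m_jj for each state j:

import numpy as np

# Illustrative 3-state regular chain (an assumption for this sketch only).
P = np.array([[0.5, 0.5, 0.0],
              [0.3, 0.4, 0.3],
              [0.0, 0.6, 0.4]])
n = P.shape[0]

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalized to sum to 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
pi = pi / pi.sum()

# For each target state j, the first-step equations
#     m_ij = 1 + sum_{k != j} P_ik m_kj      (one equation per starting state i)
# form the linear system (I - P_j) m = 1, where P_j is P with column j zeroed out.
for j in range(n):
    Pj = P.copy()
    Pj[:, j] = 0.0
    m = np.linalg.solve(np.eye(n) - Pj, np.ones(n))
    print(f"j={j}:  pi_j = {pi[j]:.6f}   1/m_jj = {1.0 / m[j]:.6f}")

Zeroing out column j of P before solving is just the matrix form of
restricting the sum in the first-step equation to k != j.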
