Subject: Spectral Radius 3
Category: Science > Math
Asked by: dooogle-ga
List Price: $50.00
Posted: 12 Sep 2005 03:38 PDT
Expires: 12 Oct 2005 03:38 PDT
Question ID: 567051
Dear MathTalk,

Following on from http://answers.google.com/answers/threadview?id=553457 and tightening the question parameters still further:

Let M be a square matrix of size n. Let M be row stochastic, irreducible, and aperiodic. Let V be another square matrix of the same size n, where all the rows of V are given by the same row vector v of size n.

Could you please walk me through the theory showing how one would go about choosing the vector v so that the spectral radius of the matrix given by (M - V) is less than one? I've found your examples thus far very useful, so any more would be much appreciated.

I have concocted a method for dealing with the absolute value of (M - V), and shall ask about this in due course. Here, though, I am not concerned with the absolute value.

Thanking you again,

Dooogle
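For concreteness, a minimal numpy sketch of the objects in the question; the matrix entries are invented purely for illustration, not taken from the thread:

    import numpy as np

    # M: row stochastic (each row sums to 1); every entry is positive,
    # so M is also irreducible and aperiodic.  Entries are made up.
    M = np.array([[0.5, 0.3, 0.2],
                  [0.1, 0.6, 0.3],
                  [0.4, 0.4, 0.2]])

    # V: every row is the same row vector v (here an arbitrary choice).
    v = np.array([0.3, 0.45, 0.25])
    V = np.outer(np.ones(3), v)

    # The quantity the question asks about: the spectral radius of M - V.
    print(np.abs(np.linalg.eigvals(M - V)).max())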
There is no answer at this time.
Subject: Re: Spectral Radius 3
From: mathtalk-ga on 14 Sep 2005 16:19 PDT
If a probability transition matrix M is irreducible, then by the Perron-Frobenius theorem it has a simple eigenvalue 1. Without the aperiodicity condition (which is implied, for instance, by having a row or column of all nonzero values), there could be other eigenvalues (negative or complex) of equal magnitude (absolute value), but of course none of greater modulus. With aperiodicity all the other eigenvalues are of strictly smaller modulus.

A rank-one update to M of the form you specified can "remove" this eigenvalue, in the sense of reducing it from 1 to 0 (or any place in between). Of course the spectral radius would then equal the absolute value of the "next largest" eigenvalue, whatever that might be.

The appropriate "adjustment" to M is some scalar multiple of:

V = (1,1,...,1)'*(v_1,...,v_n) = u'*v

where v = (v_1,...,v_n) is a _left_ eigenvector of M corresponding to the eigenvalue 1. (By the definition of a probability transition matrix, the column vector u' of all ones is, by inspection, a right eigenvector of M corresponding to the eigenvalue 1.)

Computationally the main difficulty is finding v. One way is to solve the homogeneous system for a non-trivial (nonzero) solution to:

v(M - I) = 0

However, solving this exactly is probably more work than is necessary if you are satisfied with an approximation to v, as I suspect you will be. As long as the approximation is at least partly in the direction of v (has positive inner product with v), it will tend to reduce that eigenvalue (possibly at the risk of increasing another one). So the use of an approximate eigenvector v is of interest, I think.

The simplest way to produce and refine such an approximate eigenvector is the power method. Given a "random" initial row vector v(0), one defines a sequence:

v(i+1) = (1/||v(i)||) v(i)M

where the scaling of the vector can be done in a variety of convenient ways. In a case like this, where the vector is not expected to grow much, if at all, the scaling can be skipped for several steps at a time.

Here it is important that the largest eigenvalue is simple and unique in modulus, i.e. the consequence of irreducibility plus aperiodicity. In effect, whatever component of the initial guess exists in the direction of the eigenvector v is preserved under multiplication by M, while the other components shrink at a more or less geometric rate.

I went into some detail about convergence of this sort of power iteration here:

[Google Answers: Page Rank Definition - Proof of convergence and uniqueness]
http://answers.google.com/answers/threadview?id=379557

regards, mathtalk-ga
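A small numpy sketch of the two steps described above, power iteration to approximate the left eigenvector v, then the rank-one update M - u'v; the matrix M is an invented example:

    import numpy as np

    M = np.array([[0.5, 0.3, 0.2],
                  [0.1, 0.6, 0.3],
                  [0.4, 0.4, 0.2]])
    n = M.shape[0]

    # Power method: v(i+1) = v(i) M / ||v(i) M||.  Because the dominant
    # eigenvalue 1 is simple and unique in modulus (irreducible plus
    # aperiodic), the iterates converge to the left eigenvector for 1.
    v = np.random.default_rng(0).random(n)   # random positive start
    for _ in range(100):
        v = v @ M
        v /= np.linalg.norm(v)

    v /= v.sum()                 # normalize so that v*u' = 1, u = (1,...,1)
    V = np.outer(np.ones(n), v)  # the rank-one update u'*v

    print("spectral radius of M:    ", np.abs(np.linalg.eigvals(M)).max())
    print("spectral radius of M - V:", np.abs(np.linalg.eigvals(M - V)).max())

The second printed value should be the modulus of the "next largest" eigenvalue of M, strictly less than 1.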
Subject: Re: Spectral Radius 3
From: dooogle-ga on 27 Sep 2005 07:06 PDT
Dear MathTalk,

Thank you for the comment. It clarified things for me.

Have you some theory to show that this is indeed the "appropriate adjustment" which "removes" the eigenvalue? I do hope so, as I'd hate to think you've gone to all the trouble which you have without suitable reward.

I liked very much your description in the diagonalisable case. But you suggested that the adjustment transcends diagonalisability. Could you please explain your reasoning?

Many thanks,

Dooogle
Subject: Re: Spectral Radius 3
From: mathtalk-ga on 27 Sep 2005 18:53 PDT
Here's a discussion of the spectral radius of M-V versus M that makes no mention of diagonalization (nor is that a "hidden" assumption). Let M be a (row) stochastic nonnegative matrix with simple eigenvalue 1, and all other eigenvalues of modulus strictly smaller than 1. Let (column) u' be a right eigenvector corresponding to eigenvalue 1: u = (1,1,...,1) Let v be the left eigenvector corresponding to eigenvalue 1, normalized so that: v*u' = 1 Such a normalization is possible since v may be chosen nonnegative. * * * * * * * * * * * * * * * * * * * * * * * * What can be said about M-V, where: V = u'*v is the rank one "update" matrix which we've been discussing? First let's compute a couple of easy things: v*(M-V) = v*M - v*u'*v = v - v = 0 (M-V)*u' = M*u' - u'*v*u' = u' - u' = 0 Clearly (M-V) is a singular matrix, and in particular u' and v are respectively right and left eigenvectors corresponding to eigenvalue 0. The eigenvalues of M which are not equal to 1 are eigenvalues of M-V, as we will show by calculating that any left eigenvector of M for such an eigenvalue is still a left eigenvector of M-V for the same eigenvalue. [The argument adapts easily to right eigenvectors, but the fact is that one side suffices to characterize the spectrum of M.] Let w be a left eigenvector of M corresponding to eigenvalue µ not equal to 1 (possibly complex). That is, nonzero vector w (possibly with complex entries) satisfies: w*M = µw Now w*u' = w*M*u' = µ(w*u'). Since µ is not equal to 1, this means: w*u' = 0 Thus w*(M-V) = w*M - w*u'*v = µw - 0v = µw. In other words w is a left eigenvector of M-V corresponding to µ, as was to be shown. Thus every eigenvalue µ of M other than 1 is again an eigenvalue of M-V (since every eigenvalue has a left eigenvector). Next we will show that M-V has no nonzero eigenvalues that are not also eigenvalues of M. This is a slight variation o argument we just did. Let w be a left eigenvector of M corresponding to eigenvalue µ not equal to zero: w*(M-V) = µw Since µ(w*u') = w*(M-V)*u' = w*(0u') = 0(w*u') and µ is nonzero, it must be that: w*u' = 0 Therefore: µw = w*(M-V) = w*M - w*u'*v = w*M - 0v = w*M This establishes that w is a left eigenvector of M corresponding to µ, as was to be shown. In particular all the nonzero eigenvalues of M-V are also eigenvalues of M (since each has a left eigenvector). One final thing needs to be shown, namely that 1 is not an eigenvalue of M-V. As we argued just above, if it were, since 1 is nonzero, any left eigenvector w of M-V corresponding to 1 would satisfy: w*u' = 0 and moreover w*M = w. But v*u' = 1 and v*M = v, which contradicts the simplicity of the eigenvalue 1 of M since w could not be a scalar multiple of v. We can summarize these findings as follows. The spectrum of M-V is the set (of complex values) obtained from the spectrum of M by: Spec(M-V) = ( Spec(M)\{1} ) U {0} ie. by "replacing" eigenvalue 1 with eigenvalue 0. Consequently the spectral radius of M-V is the modulus of the "second largest" eigenvalue of M. As has been noted before, it is enough to reduce the spectral radius that we subtract from M some small positive multiple cV, c < 1, instead of "entirely removing" V. regards, mathtalk-ga |