The "spectral radius" of a (square) matrix is the "maximum modulus"
(largest absolute value) of its eigenvalues. The odd name comes from
calling the set of eigenvalues the "spectrum" of the matrix.
Because an n×n matrix has at most n eigenvalues, there is no ambiguity
in speaking of the maximum of the absolute values of these
"characteristic roots", but there may be distinct eigenvalues with the
same modulus, or an eigenvalue that has multiplicity greater than 1
(either algebraic multiplicity, which counts the number of equal roots
of the characteristic polynomial, or geometric multiplicity, which
counts the number of linearly independent eigenvectors that correspond
to the eigenvalue).
Every characteristic root has at least one (nonzero) eigenvector, so
there is no distinction between "algebraic" and "geometric"
eigenvalues, only between the two kinds of multiplicity.
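The classic example of the two multiplicities disagreeing is a
"defective" matrix such as a 2×2 Jordan block. Here is a short NumPy
sketch (the matrix is just an illustration) confirming that its
eigenvalue 2 has algebraic multiplicity 2 but geometric multiplicity 1:

```python
import numpy as np

# A defective matrix: its characteristic polynomial is (x - 2)^2, so the
# eigenvalue 2 has algebraic multiplicity 2, but there is only one
# linearly independent eigenvector (geometric multiplicity 1).
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)  # both roots equal 2: algebraic multiplicity 2

# Geometric multiplicity = dimension of the null space of (A - 2I)
#                        = n - rank(A - 2I)
geometric_mult = A.shape[0] - np.linalg.matrix_rank(A - 2.0 * np.eye(2))
print(geometric_mult)  # 1
```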
If we have a scalar multiple of a square matrix, say M = c*A, then the
relationship between their spectra (sets of eigenvalues) is just what
we'd expect and hope for. Assuming c is nonzero:
d is an eigenvalue of A <==> c*d is an eigenvalue of M
For this reason we are assured that the spectral radius of M is |c|
times the spectral radius of A, and this is true without assuming
nonnegativity of A or any other properties of a probability transition
matrix.
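As a quick numerical check of this scaling relation, here is a NumPy
sketch; the 2×2 matrix is a made-up row-stochastic example, so its
spectral radius happens to be 1:

```python
import numpy as np

# M = c*A should have eigenvalues c*d for each eigenvalue d of A, so
# the spectral radius of M is |c| times the spectral radius of A.
A = np.array([[0.5, 0.5],
              [0.2, 0.8]])   # rows sum to 1: a probability transition matrix
c = -3.0
M = c * A

rho_A = max(abs(np.linalg.eigvals(A)))   # spectral radius of A
rho_M = max(abs(np.linalg.eigvals(M)))   # spectral radius of M

print(rho_A)   # approximately 1.0 (stochastic matrix)
print(rho_M)   # approximately 3.0, i.e. |c| * rho_A
```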
If we know some information about the 1-norms of the rows of A (as we
do, for example, in the case of a probability transition matrix), then
we may be able to give a bound on the spectral radius of A, and thus of
M = c*A as well.
For example, if the largest 1-norm (sum of absolute values) of the
rows of A is C, then we have this estimate on how right-multiplication
of rows by A affects the 1-norm. For any compatible row x:
xA = x(1) * A(1,-) + x(2) * A(2,-) + ... + x(n) * A(n,-)
Therefore by the triangle inequality for the 1-norm ||.|| we have:
||xA|| <= |x(1)|*||A(1,-)|| + ... + |x(n)|*||A(n,-)||
       <= C * ( |x(1)| + |x(2)| + ... + |x(n)| )
        = C * ||x||
Therefore the absolute value of any eigenvalue d of A is at most C, as
we see by this "calculation" with a (nonzero) row eigenvector x
satisfying:
xA = dx
||dx|| = ||xA|| <= C * ||x||
|d| * ||x|| <= C * ||x||
and dividing by ||x||, which is nonzero:
|d| <= C
Therefore the spectral radius of A, which is the maximum of |d|, will
be bounded by C, which is the maximum 1-norm of a row of A. Similarly
one can get an estimate of the spectral radius involving the maximum
1-norm of a column of A (by considering eigenvectors from the "other
side" of A), and the estimates could be based on the operator norm of
A induced by other vector norms (besides the 1-norm as we used here).
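These norm bounds are easy to check numerically. A small NumPy sketch
(using a random matrix purely for illustration) verifying that the
spectral radius never exceeds the maximum row 1-norm or the maximum
column 1-norm:

```python
import numpy as np

# spectral_radius(A) <= C, where C is the maximum 1-norm of a row of A;
# the analogous bound using columns holds as well.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))

spectral_radius = max(abs(np.linalg.eigvals(A)))

max_row_norm = np.abs(A).sum(axis=1).max()   # C in the argument above
max_col_norm = np.abs(A).sum(axis=0).max()   # column-based bound

print(spectral_radius <= max_row_norm)   # True
print(spectral_radius <= max_col_norm)   # True
```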
A more "delicate" method for estimating the locations of eigenvalues
and hence the spectral radius is given by a result known as
Gershgorin's Disk (or Circles) Theorem:
[Gershgorin's Circle Theorem - PlanetMath]
The idea here is to use a bounding circle centered at a diagonal entry
A(i,i) with radius equal to the sum of the absolute values of the
remaining (off-diagonal) entries in row i. The union of these n disks
(one for each diagonal entry) contains all the eigenvalues of A.
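A brief NumPy sketch of the theorem (the matrix is random, purely for
illustration): compute each disk's center and radius, then confirm
every eigenvalue lands in at least one disk.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

centers = np.diag(A)                              # disk centers A(i,i)
radii = np.abs(A).sum(axis=1) - np.abs(centers)   # off-diagonal row sums

# Gershgorin: every eigenvalue lies in the union of the n disks
# (a tiny tolerance guards against floating-point roundoff).
for d in np.linalg.eigvals(A):
    in_some_disk = any(abs(d - z) <= r + 1e-9
                       for z, r in zip(centers, radii))
    print(in_some_disk)   # True for each eigenvalue
```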