Let M = inv(D)*(C'*inv(B)*C), where:
D = F0 + F
C = N*F
B = N*F*N'
F: a positive diagonal n x n matrix
N: a node-arc matrix of size m x n, with entries N(i,j) in {-1, 0, 1}.
I know that the spectral radius r(M) lies in (0,1), so the power
series I + M + M^2 + ... converges.
Now we perturb the matrix F: Fp = F + P, where P is a positive diagonal
n x n matrix with all its entries < 1. We obtain the perturbed matrices
Dp = F0 + Fp, Cp = N*Fp, Bp = N*Fp*N', and
Mp = inv(Dp)*(Cp'*inv(Bp)*Cp).
The question is: is the spectral radius r(Mp) less than r(M)?
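For concreteness, here is a minimal numpy sketch of such a comparison. It is only an illustration, under assumptions not stated above: N is taken to be the reduced node-arc incidence matrix of K4 (one node row dropped, so that B = N*F*N' is nonsingular), and F0 is assumed to be another positive diagonal matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Reduced node-arc incidence matrix of K4 (node 4 dropped): m = 3, n = 6.
# Columns are the arcs (1,2),(1,3),(1,4),(2,3),(2,4),(3,4).
N = np.array([
    [ 1,  1,  1,  0,  0,  0],   # node 1
    [-1,  0,  0,  1,  1,  0],   # node 2
    [ 0, -1,  0, -1,  0,  1],   # node 3
], dtype=float)
m, n = N.shape

F  = np.diag(rng.uniform(0.5, 2.0, n))   # positive diagonal
F0 = np.diag(rng.uniform(0.5, 2.0, n))   # positive diagonal (assumption)
P  = np.diag(rng.uniform(0.1, 0.9, n))   # positive diagonal, entries < 1

def spectral_radius_M(F):
    """Build M = inv(D)*(C'*inv(B)*C) and return its spectral radius."""
    D = F0 + F
    C = N @ F
    B = N @ F @ N.T
    M = np.linalg.inv(D) @ C.T @ np.linalg.inv(B) @ C
    return np.max(np.abs(np.linalg.eigvals(M)))

rM  = spectral_radius_M(F)       # unperturbed
rMp = spectral_radius_M(F + P)   # perturbed: Fp = F + P

print(f"r(M)  = {rM:.6f}")
print(f"r(Mp) = {rMp:.6f}")
print("r(Mp) < r(M):", rMp < rM)
```

Note that with F0 positive diagonal, M is similar to the symmetric matrix (F0+F)^{-1/2} F^{1/2} PI F^{1/2} (F0+F)^{-1/2}, where PI = F^{1/2} N' inv(N F N') N F^{1/2} is an orthogonal projection; this gives r(M) <= max_i F(i,i)/(F0(i,i)+F(i,i)) < 1, consistent with r(M) in (0,1) above.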
All the computational tests I have run confirm it, but I don't have a
proof. On the other hand, I know that the Frobenius norm ||Mp||_F is
less than ||M||_F, but I have not been able to relate that to
r(Mp) < r(M).
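One obstacle worth noting (my illustration, not part of the question above): although r(A) <= ||A||_F always holds, a strict ordering of Frobenius norms does not by itself force the same ordering of spectral radii, as this toy pair of matrices shows.

```python
import numpy as np

# Toy pair: ||B||_F < ||A||_F, yet r(B) > r(A).
A = np.array([[0.0, 2.0],
              [0.0, 0.0]])   # nilpotent: r(A) = 0, ||A||_F = 2
B = np.array([[1.0, 0.0],
              [0.0, 0.0]])   # r(B) = 1, ||B||_F = 1

rA = np.max(np.abs(np.linalg.eigvals(A)))
rB = np.max(np.abs(np.linalg.eigvals(B)))

print(np.linalg.norm(B, 'fro') < np.linalg.norm(A, 'fro'))  # True
print(rB > rA)                                              # True
```

So any proof of r(Mp) < r(M) would have to use more of the structure of M and Mp than the Frobenius-norm inequality alone.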