96 3 Markov Chains
Lemma 3.5 Each column of A_1 represents the steady-state distribution vector for the Markov chain.
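As a quick numerical illustration of this lemma, the sketch below uses a hypothetical 3-state column-stochastic matrix P (not from the text) and approximates the limiting matrix A_1 by a large power P^n; every column of the result matches the steady-state vector s.

```python
import numpy as np

# Hypothetical 3-state column-stochastic transition matrix
# (each column sums to 1, following the book's convention).
P = np.array([
    [0.6, 0.2, 0.3],
    [0.3, 0.5, 0.3],
    [0.1, 0.3, 0.4],
])

# A_1 is the limiting matrix lim P^n; approximate it with a large power.
A1 = np.linalg.matrix_power(P, 100)

# The steady-state vector s is the eigenvector of P for eigenvalue 1,
# normalized so its entries sum to 1.
w, V = np.linalg.eig(P)
s = np.real(V[:, np.argmin(np.abs(w - 1))])
s /= s.sum()

# Every column of A_1 equals s.
print(np.allclose(A1, np.column_stack([s, s, s])))
```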
We define a differential matrix as a matrix in which the sum of each column is zero. The following theorem states that the matrices A_i corresponding to eigenvalues λ_i ≠ 1 are all differential matrices.
Theorem 3.6 The expansion matrices A_i corresponding to eigenvalues λ_i ≠ 1 are differential, i.e., σ(A_i) = 0.
Proof The sum of column j in (3.80) is given by

σ_j(P^n) = σ_j(A_1) + λ_2^n σ_j(A_2) + λ_3^n σ_j(A_3) + ···   (3.90)
Lemma 3.1 on page 85 assures us that P^n is column stochastic and Theorem 3.5 on page 95 assures us that A_1 is also column stochastic. Therefore, we can write the above equation as
1 = 1 + λ_2^n σ_j(A_2) + λ_3^n σ_j(A_3) + ···   (3.91)
Since this equation is valid for all values of λ_i and n, we must have
σ_j(A_2) = σ_j(A_3) = ··· = σ_j(A_m) = 0   (3.92)
The above equations are valid for all values of j, 1 ≤ j ≤ m. Thus all the matrices A_i that correspond to λ_i ≠ 1 are differential matrices, and we can write
σ(A_2) = σ(A_3) = ··· = σ(A_m) = 0   (3.93)
This proves the theorem.
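Theorem 3.6 can be spot-checked numerically. The sketch below uses a hypothetical column-stochastic P with distinct eigenvalues, builds each expansion matrix A_i as the outer product of the right eigenvector x_i with the corresponding row of X^{-1} (the left eigenvectors), and prints the column sums σ_j(A_i): they come out as 1 for λ_i = 1 and 0 for every λ_i ≠ 1.

```python
import numpy as np

# Hypothetical 3-state column-stochastic matrix (columns sum to 1).
P = np.array([
    [0.6, 0.2, 0.3],
    [0.3, 0.5, 0.3],
    [0.1, 0.3, 0.4],
])

# Spectral expansion P = sum_i lam_i A_i with A_i = x_i y_i^T, where
# x_i are the right eigenvectors (columns of X) and y_i^T the rows of
# X^{-1}, so that y_i^T x_j equals 1 when i = j and 0 otherwise.
lam, X = np.linalg.eig(P)
Y = np.linalg.inv(X)
A = [np.outer(X[:, i], Y[i, :]) for i in range(3)]

# Column sums sigma_j(A_i): 1 for lam_i = 1, zero otherwise.
for lam_i, A_i in zip(lam, A):
    print(round(lam_i.real, 2), np.round(A_i.sum(axis=0).real, 10))
```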
The following theorem is related to Theorem 3.2 on page 77. The theorem essentially explains the effect of premultiplying any matrix by a differential matrix.
Theorem 3.7 Given any matrix A and a differential matrix V, the matrix B = VA will also be a differential matrix.
Proof When A is premultiplied by V, the matrix B results:

B = VA   (3.94)

Element b_ij is given by the usual matrix product formula

b_ij = ∑_{k=1}^{m} v_ik a_kj   (3.95)
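A numerical spot-check of this product and of Theorem 3.7, using a randomly generated A and a differential V (obtained from a random matrix by subtracting each column's mean, so every column of V sums to zero):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary matrix A, and a differential matrix V built by subtracting
# each column's mean from a random matrix so that sigma(V) = 0.
A = rng.random((4, 4))
V = rng.random((4, 4))
V -= V.mean(axis=0)

B = V @ A  # b_ij = sum_k v_ik a_kj, as in (3.95)

# Every column of B = VA sums to zero, so B is differential.
print(np.allclose(B.sum(axis=0), 0))
```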