Eigenvalues

Eigenvalues of a Linear Map

Let _ &phi.#: F^{~n} &hdash.&rightarrow. F^{~n} _ be a linear map. If _ &phi.#{~u} = λ#{~u} _ for some λ &in. F and some non-zero #{~u} &in. F^{~n}, _ then #{~u} is said to be an ~#{eigenvector} of &phi. with corresponding ~#{eigenvalue} λ.

Eigenvalues of a Matrix

If &phi. has corresponding matrix A relative to basis _ #{~u}_1 ... #{~u}_{~n} _ &in. F^{~n}, _ then

#{~u} = &sum._{~i} &alpha._{~i}#{~u}_{~i} _ is an eigenvector of &phi. _ _ &iff. _ _ A#{&alpha.} = λ#{&alpha.} _ for some λ &in. F , _ _ where #{&alpha.} = (&alpha._1 ... &alpha._{~n} )^T.

Then λ is said to be an ~#{eigenvalue} of A with corresponding ~#{eigenvector} #{&alpha.}.
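
As a concrete illustration (not part of the original notes), here is a minimal Python / numpy sketch, working over the reals with an arbitrary example matrix: np.linalg.eig returns the eigenvalues of A together with corresponding eigenvectors, and we verify the defining property _ A#{&alpha.} = λ#{&alpha.} _ for each pair.

import numpy as np

# An arbitrary 2x2 example matrix over the reals.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues, and a matrix whose
# columns are corresponding eigenvectors.
lams, vecs = np.linalg.eig(A)

for lam, alpha in zip(lams, vecs.T):
    # The defining property:  A alpha = lambda alpha.
    assert np.allclose(A @ alpha, lam * alpha)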

Characteristic Equation

If ~n # ~n matrix A has eigenvalue λ with corresponding eigenvector #{&xi.}, then _ A#{&xi.} = λ#{&xi.} _ so _ (A &minus. λI ) #{&xi.} = #0.
This has a non-trivial solution _ &iff. _ | A &minus. λI | = 0.

Now _ | A &minus. λI | _ is a polynomial in λ of degree ~n over F, called the ~#{characteristic polynomial} of A, _ and _ | A &minus. λI | = 0 _ is the ~#{characteristic equation} of A. _ This has at most ~n roots, so an ~n # ~n matrix A has at most ~n eigenvalues.

Note that A - λI is just A with λ subtracted from each of its principal diagonal elements.

This gives an alternative definition: the eigenvalues of a matrix A are precisely the roots of the characteristic equation of A.
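
This equivalence is easy to check numerically. In the following sketch (illustrative only), np.poly gives the coefficients of the characteristic polynomial of A, and its roots agree with the eigenvalues returned by np.linalg.eigvals.

import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Coefficients of the characteristic polynomial, highest power first;
# np.poly(A) gives det(lambda I - A), which has the same roots as
# |A - lambda I|.  For this A:  lambda^2 - 7 lambda + 10.
coeffs = np.poly(A)

# The roots of the characteristic polynomial are the eigenvalues of A.
assert np.allclose(np.sort(np.roots(coeffs)),
                   np.sort(np.linalg.eigvals(A)))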

Zero Eigenvalues

An eigenvalue can be zero, i.e. A#{&xi.} = #0. _ This has a non-trivial solution _ &iff. _ | A | = 0 _ &iff. _ A is not regular.

&therefore. ~{a regular matrix has no zero eigenvalues}.
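
A quick numerical check of this (illustrative only): the matrix below is obviously singular, as its second row is twice its first, and correspondingly one of its eigenvalues is zero.

import numpy as np

# Singular: row 2 is twice row 1, so |A| = 0 and A is not regular.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(np.linalg.det(A))      # 0.0 (up to rounding)
print(np.linalg.eigvals(A))  # a zero eigenvalue appears (the other is 5)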

Minors

Consider the 2 # 2 matrix _ _ A _ = _ matrix{a_{11},a_{12}/a_{21},a_{22}} , _ _ then _ _ A - λI _ = _ matrix{a_{11} - λ,a_{12}/a_{21},a_{22} - λ} .

This has determinant _ (a_{11} - λ)(a_{22} - λ) - a_{12}a_{21}

= _ λ^2 - (a_{11} + a_{22})λ + a_{11}a_{22} - a_{12}a_{21} _ = _ λ^2 - tr(A)λ + det(A) .
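
As a sanity check of this 2 # 2 case (not part of the original notes), the coefficient vector that np.poly produces for a 2 # 2 matrix should be _ ( 1, &minus.tr(A), det(A) ) :

import numpy as np

A = np.array([[3.0, 1.0],
              [4.0, 2.0]])

# For a 2x2 matrix the characteristic polynomial is
# lambda^2 - tr(A) lambda + det(A).
coeffs = np.poly(A)
assert np.allclose(coeffs, [1.0, -np.trace(A), np.linalg.det(A)])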

It can be shown that for an ~n # ~n matrix A:

| A - λI | _ = _ sum{m_~i ({-}λ)^{~n - ~i},~i = 0, ~n} _ = _ m_0 ({-}λ)^{~n} + m_1 ({-}λ)^{~n - 1} + ... + m_~n

where m_0 = 1 and m_{~i} is the sum of all the principal ~i^{th} order minors of A.

This result will not be proved here.

[ Note that a principal ~{first} order minor is just an element of the diagonal, so we have that m_1 = &sum._{~i} ~a_{~i ~i} , _ called the ~{trace} of A.
Also the principal ~n^{th} order minor of A is just the determinant of A. ]
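
The claimed formula can be tested numerically. In this sketch the helper principal_minor_sums is illustrative, not from the notes; the m_{~i} are computed directly as sums of principal minors. Note that np.poly(A) returns the coefficients of det( λI &minus. A ) = (&minus.1)^{~n} | A &minus. λI | , so matching against the formula above, the coefficient of λ^{~n&minus.~i} is (&minus.1)^{~i} m_{~i}.

import numpy as np
from itertools import combinations

def principal_minor_sums(A):
    # m[i] = sum of all principal i-th order minors of A, m[0] = 1.
    # A principal minor uses the same index set for rows and columns.
    n = A.shape[0]
    m = [1.0]
    for i in range(1, n + 1):
        m.append(sum(np.linalg.det(A[np.ix_(idx, idx)])
                     for idx in combinations(range(n), i)))
    return m

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
n = A.shape[0]
m = principal_minor_sums(A)   # m[1] = tr(A) = 7, m[n] = det(A) = 8

# Coefficient of lambda^(n-i) in det(lambda I - A) is (-1)^i m_i.
assert np.allclose(np.poly(A), [(-1) ** i * m[i] for i in range(n + 1)])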

Eigenvectors (Linear Space of)

If λ is an eigenvalue of A with corresponding eigenvector #{&xi.} , _ then A (&mu.#{&xi.}) = &mu. A#{&xi.} = &mu. λ#{&xi.} = λ (&mu.#{&xi.}).
So &mu.#{&xi.} is also an eigenvector of A with corresponding eigenvalue λ, _ for any &mu. &ne. 0.

If #{&xi.}_1 and #{&xi.}_2 are two linearly independent eigenvectors corresponding to the same eigenvalue λ _ then
A (&mu.#{&xi.}_1 + &nu.#{&xi.}_2) = &mu.A#{&xi.}_1 + &nu.A#{&xi.}_2 = &mu.λ#{&xi.}_1 + &nu.λ#{&xi.}_2 = λ (&mu.#{&xi.}_1 + &nu.#{&xi.}_2).
So &mu.#{&xi.}_1 + &nu.#{&xi.}_2 is also an eigenvector of A with corresponding eigenvalue λ.

We can conclude that the eigenvectors of A corresponding to a single eigenvalue, λ, together with #0, form a linear subspace of F^{~n}, _ and that its dimension is _ ~n &minus. rank( A &minus. λI ).
[ (A &minus. λI)#{&xi.} = #0 _ &iff. _ #{&xi.} &in. ker &chi. , _ where &chi. is the linear map associated with A &minus. λI relative to some basis. _ dim( ker &chi. ) = ~n &minus. dim( im &chi. ) = ~n &minus. rank(A &minus. λI). ]

I.e. any eigenvalue λ of A has _ ~n &minus. rank( A &minus. λI ) _ linearly independent eigenvectors, and no more.
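
For example (a numerical sketch, not from the notes), the matrix below has eigenvalue 2 repeated, and the formula gives a 2-dimensional eigenspace for it:

import numpy as np

# Eigenvalue 2 is repeated; eigenvalue 5 is simple.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
lam = 2.0
n = A.shape[0]

# Dimension of the eigenspace = n - rank(A - lambda I).
dim = n - np.linalg.matrix_rank(A - lam * np.eye(n))
assert dim == 2   # two linearly independent eigenvectors for lambda = 2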

Polynomials of Eigenvalues

#{Lemma}: If _ p(~t) is a polynomial over F _ and _ λ is an eigenvalue of matrix A, _ then _ p(λ) is an eigenvalue of p(A), _ with the same corresponding eigenvector.

Proof: _ If _ A#{&xi.} = λ #{&xi.} , _ then

  • A^2#{&xi.} = A (λ#{&xi.}) = λ A#{&xi.} = λ^2 #{&xi.} , _ so inductively (A^{~n} ) #{&xi.} = (λ^{~n} ) #{&xi.} ,
  • (&mu.A)#{&xi.} = &mu.A#{&xi.} = &mu.λ#{&xi.} = (&mu.λ)#{&xi.} ,
  • (B + A) #{&xi.} = B #{&xi.} + A #{&xi.} = &kappa. #{&xi.} + λ #{&xi.} = (&kappa. + λ) #{&xi.} _ , _ [ where B #{&xi.} = &kappa. #{&xi.} ]

So _ p(A)#{&xi.} = p(λ)#{&xi.} _ for any polynomial p( ).
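
A numerical check of the lemma (illustrative only), with _ p(~t) = ~t^2 + 3~t + 1 _ and an arbitrary matrix A:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # eigenvalues 1 and 3

lams, vecs = np.linalg.eig(A)

# p(A) = A^2 + 3A + I
pA = A @ A + 3 * A + np.eye(2)

for lam, xi in zip(lams, vecs.T):
    # p(A) xi = p(lambda) xi, with the same eigenvector xi.
    assert np.allclose(pA @ xi, (lam ** 2 + 3 * lam + 1) * xi)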

Van der Monde's Determinant

For use in the following proposition, consider the ~r # ~r matrix, ~{Van der Monde's Matrix}, defined by

Λ _ = _ matrix{ 1,λ_1, ... ,λ_1^{~r&minus.1}/ :,:,,:/ 1,λ_{~r}, ... ,λ_{~r}^{~r&minus.1} } _ = _ script{rndb{λ_{~i}^{~k}},,,~i = 1 ... ~r,~k = 0 ... ~r&minus.1}

This has determinant

det{Λ} _ = _ prod{ ( λ_{~i} &minus. λ_{~j} ),~i > ~j, _ }

which is known as ~{Van der Monde's Determinant}. _ This product is non-zero precisely when the λ_{~i} are distinct, so for distinct λ_{~i} the matrix Λ is regular.
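
Both the construction and the determinant formula are easy to check numerically (np.vander with increasing=True builds exactly the matrix above; the values chosen are arbitrary but distinct):

import numpy as np
from itertools import combinations

lams = np.array([1.0, 2.0, 4.0, 7.0])   # distinct values

# Rows ( 1, lam_i, lam_i^2, ... ), as in the definition above.
V = np.vander(lams, increasing=True)

# prod over i > j of ( lam_i - lam_j ):
expected = np.prod([lams[i] - lams[j]
                    for j, i in combinations(range(len(lams)), 2)])

assert np.isclose(np.linalg.det(V), expected)   # both are 540 here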

Linear Independence of Eigenvectors

If ~n # ~n matrix A has ~r distinct eigenvalues λ_1 ... λ_{~r} then the corresponding eigenvectors #{&xi.}_1 ... #{&xi.}_{~r} are linearly independent.

Proof:
Suppose

sum{ &alpha._{~i} #{&xi.}_{~i},~i = 1, ~r} _ = _ #0 , _ _ some &alpha._{~i} &in. F

Let p(~t) be any polynomial over F, so by the lemma:

sum{ p( λ_{~i} ) &alpha._{~i} #{&xi.}_{~i},~i = 1, ~r} _ = _ sum{&alpha._{~i} ( p( λ_{~i} ) #{&xi.}_{~i} ),~i = 1, ~r}

_ _ _ _ _ = _ sum{&alpha._{~i} ( p(A) #{&xi.}_{~i} ),~i = 1, ~r} _ = _ p(A) sum{&alpha._{~i} #{&xi.}_{~i},~i = 1, ~r} _ = _ #0

In particular

sum{array{/},~i = 1, ~r} ( sum{ ~b_{~k} λ_{~i} ^{~k&minus.1} ,~k = 1, ~r} ) &alpha._{~i} #{&xi.}_{~i} _ = _ #0 , _ _ for any set of ~b_{~k} &in. F

But we can find ~b_{~k} such that _ &sum._{~k} ~b_{~k} λ_{~i} ^{~k&minus.1} = &delta._{~i ~j} _ for each fixed ~j , _ 1 &le. ~j &le. ~r.

Consider Van der Monde's matrix defined above. As λ_1 ... λ_{~r} are distinct, Λ is regular, _ so for each ~j _ &exist. #{~b} = ( ~b_1 ... ~b_{~r} )^T ( dependent on ~j ) such that _ Λ#{~b} = #{~e}_{~j} , _ i.e. _ &sum._{~k} ~b_{~k} λ_{~i} ^{~k&minus.1} = &delta._{~i ~j} , _ so:

sum{&delta._{~i ~j} &alpha._{~i} #{&xi.}_{~i} ,~i = 1, ~r} _ = _ &alpha._{~j} #{&xi.}_{~j} _ = _ #0 _ _ => _ _ &alpha._{~j} _ = _ 0 , _ _ since #{&xi.}_{~j} &ne. #0

This can be done for each ~j. So we have proved linear independence.
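
The proposition can be illustrated numerically (an informal check, not a proof): for a matrix with distinct eigenvalues, the matrix whose columns are the eigenvectors has full rank, i.e. the eigenvectors are linearly independent.

import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

lams, vecs = np.linalg.eig(A)

# The eigenvalues (1, 2 and 4 here) are pairwise distinct ...
assert len(set(np.round(lams, 8))) == len(lams)

# ... so the eigenvector matrix has full rank: its columns,
# the eigenvectors, are linearly independent.
assert np.linalg.matrix_rank(vecs) == A.shape[0]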

#{Corollaries}:

  1. Since each eigenvalue λ_{~i} can have ~n &minus. rank( A &minus. λ_{~i} I ) linearly independent eigenvectors, we have

    sum{~n &minus. rank( A &minus. λ_{~i} I ),~i = 1, ~r} _ _ _ _ &le. _ _ _ _ ~n _ _ _ _ ( = _ _ _ _ dim F^{~n} )

  2. If A has ~n distinct eigenvalues, its eigenvectors span F^{~n}.

Eigenvalues and Regularity

If ~n # ~n matrix A has ~n eigenvalues λ_1 ... λ_{~n} (not necessarily distinct) with corresponding eigenvectors #{&xi.}_1 ... #{&xi.}_{~n}, _ then

  1. λ_1 ... λ_{~n} distinct _ &imply. _ #{&xi.}_1 ... #{&xi.}_{~n} are linearly independent.
    _
  2. If #{&xi.}_1 ... #{&xi.}_{~n} are linearly independent (but λ_1 ... λ_{~n} not necessarily distinct), _ then
    _

    A _ _ _ _ &tilde. _ _ _ _ L _ = _ matrix{ λ_1,0, ... ,0/0,λ_2, ... ,0/:,:,,:/0,0, ... ,λ_{~n}}


    _
  3. Suppose &mu._1 ... &mu._{~n} &in. F, and for ~n # ~n matrix A we have
    _

    A _ _ _ _ &tilde. _ _ _ _ M _ = _ matrix{ &mu._1,0, ... ,0/0,&mu._2, ... ,0/:,:,,:/0,0, ... ,&mu._{~n}}

    then _ &mu._1 ... &mu._{~n} are eigenvalues of A, and the corresponding eigenvectors are linearly independent.
    _

Proof:

  1. Has already been demonstrated.
    _
  2. Construct the matrix P = ( #{&xi.}_1 ... #{&xi.}_{~n} ) , _ i.e. the matrix whose columns are the eigenvectors of A. This is regular since #{&xi.}_1 ... #{&xi.}_{~n} are linearly independent, _ so P^{&minus.1}P = I_{~n} . _ Considering the ~i^{th} column of both sides gives _ P^{&minus.1} #{&xi.}_{~i} = #{#e}_{~i} .
    Now the ~i^{th} column of P^{&minus.1}AP is _ P^{&minus.1} A #{&xi.}_{~i} = P^{&minus.1} λ_{~i} #{&xi.}_{~i} = λ_{~i} P^{&minus.1} #{&xi.}_{~i} = λ_{~i} #{#e}_{~i} . _ So P^{&minus.1}AP = L _ as defined above. _ ( A numerical sketch of this construction follows the proof. )
    _
  3. &exist. ~n # ~n matrix Q _ such that Q^{&minus.1}AQ = M . _ Let #{&chi.}_{~i} be the ~i^{th} column of Q, _ so _ Q^{&minus.1} #{&chi.}_{~i} = #{#e}_{~i} _ ( as Q^{&minus.1}Q = I_{~n} )
    Consider the ~i^{th} column of Q^{&minus.1}AQ , _ this is _ Q^{&minus.1} A #{&chi.}_{~i} = &mu._{~i} #{#e}_{~i} _ ( the ~i^{th} column of M ) = &mu._{~i} Q^{&minus.1} #{&chi.}_{~i} = Q^{&minus.1} &mu._{~i} #{&chi.}_{~i} .
    So _ A#{&chi.}_{~i} = Q ( Q^{&minus.1} A #{&chi.}_{~i} ) = Q ( Q^{&minus.1} &mu._{~i} #{&chi.}_{~i} ) = &mu._{~i} #{&chi.}_{~i} , _ so &mu._{~i} eigenvalue of A with corresponding eigenvector #{&chi.}_{~i} . _ The #{&chi.}_{~i} are linearly independent since Q is regular.
    _
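
As promised in part 2 of the proof, a numerical sketch of the construction (illustrative only): the columns of P returned by np.linalg.eig are eigenvectors of A, and conjugating by P diagonalises A.

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lams, P = np.linalg.eig(A)   # columns of P are eigenvectors of A

# P^{-1} A P = L, the diagonal matrix of eigenvalues.
L = np.linalg.inv(P) @ A @ P
assert np.allclose(L, np.diag(lams))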

#{Corollaries}:

  1. ~n # ~n matrix A has at most ~n distinct eigenvalues.
    _
  2. If A has ~n linearly independent eigenvectors with corresponding eigenvalues λ_1 ... λ_{~n} then A &tilde. L (as defined in 2 above). If, in addition, all the λ_{~i} are non-zero, then rank A = rank L = ~n (as the ~n rows of L are then linearly independent), so A is regular. _ ( If some λ_{~i} = 0 then L has a zero row, so rank L < ~n and A is not regular, consistent with the result on zero eigenvalues above. )
    _
  3. A as in 2, with all λ_{~i} non-zero (so that A is regular); then

    P^{&minus.1}AP _ = _ matrix{λ_1, ... ,0/:,,:/0, ... ,λ_{~n}} _ _ => _ _ P^{&minus.1}A^{&minus.1}P _ = _ (P^{&minus.1}AP)^{&minus.1} _ = _ matrix{λ_1^{&minus.1}, ... ,0/:,,:/0, ... ,λ_{~n}^{&minus.1}}

    so λ_1^{&minus.1} ... λ_{~n}^{&minus.1} are the eigenvalues of A^{&minus.1} (by 3), with the same eigenvectors as A (i.e. the columns of P).
    [ A regular and A#{&xi.} = λ#{&xi.} _ &imply. _ #{&xi.} = A^{&minus.1}A#{&xi.} = A^{&minus.1}λ#{&xi.} = λA^{&minus.1}#{&xi.} _ &imply. _ λ^{&minus.1}#{&xi.} = A^{&minus.1}#{&xi.} ]
    _
  4. We have

    P^{&minus.1}AP _ = _ matrix{λ_1, ... ,0/:,,:/0, ... ,λ_{~n}} _ _ => _ _ det{P^{&minus.1}AP} _ = _ prod{λ_{~i},~i = 1,~n}

    Now _ | P^{&minus.1}AP | _ = _ | P^{&minus.1} | | A | | P | _ = _ | A | | P^{&minus.1} | | P | _ = _ | A | | P^{&minus.1}P | _ = _ | A | | I_{~n} | _ = _ | A |.
    So the determinant of a matrix is the product of its eigenvalues.
    _
  5. Note that we have shown that if A has ~n linearly independent eigenvectors and no zero eigenvalue, then A is regular. _ We have ~{not} shown the converse. _ ( Corollaries 3 and 4 are checked numerically after this list. )
    _
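
The promised numerical check of corollaries 3 and 4 (illustrative only), using a regular matrix with no zero eigenvalues:

import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # eigenvalues 5 and 2, both non-zero

lams = np.linalg.eigvals(A)

# Corollary 4: the determinant is the product of the eigenvalues.
assert np.isclose(np.linalg.det(A), np.prod(lams))

# Corollary 3: the eigenvalues of A^{-1} are the reciprocals.
inv_lams = np.linalg.eigvals(np.linalg.inv(A))
assert np.allclose(np.sort(inv_lams), np.sort(1.0 / lams))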

#{Summary}: ~n # ~n matrix A

  1. ~n distinct eigenvalues &imply. eigenvectors linearly independent.
  2. eigenvectors linearly independent and all eigenvalues non-zero &imply. A regular.
  3. eigenvectors linearly independent &imply. P^{&minus.1}AP = L , where columns of P are the eigenvectors of A, L diagonal matrix of eigenvalues.
  4. &exist. P such that P^{&minus.1}AP = L &imply. columns of P are the ~n linearly independent eigenvectors of A, L diagonal matrix of eigenvalues.