Linear Maps

Linear Map

All vector spaces of a given dimension, say ~n, over a field F are isomorphic to the space F^{~n}, and therefore isomorphic to each other. We therefore do not need to consider each such space separately, but only a generic ~n-dimensional vector space over F, which we will denote by V_{~n}.

&phi.#: V_{~m} &hdash.&rightarrow. V_{~n} _ is a #{~{linear map}} if

  1. &phi.(λ#{~u}) _ = _ λ(&phi.#{~u})
  2. &phi.(#{~u} + #{~v}) _ = _ &phi.#{~u} + &phi.#{~v}

&forall. λ &in. F, _ #{~u}, #{~v} &in. V_{~m}.
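As a concrete illustration, multiplication by a fixed matrix is a linear map; the following Python sketch (using NumPy, with an arbitrarily chosen matrix and vectors of our own) checks both defining properties numerically.

    import numpy as np

    # An arbitrary 3x2 real matrix defines a linear map phi: R^2 -> R^3.
    A = np.array([[1.0, 2.0],
                  [0.0, 1.0],
                  [3.0, -1.0]])

    def phi(u):
        return A @ u

    u = np.array([1.0, 4.0])
    v = np.array([-2.0, 5.0])
    lam = 3.5

    assert np.allclose(phi(lam * u), lam * phi(u))   # property 1
    assert np.allclose(phi(u + v), phi(u) + phi(v))  # property 2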

The Vector Space of Linear Maps

Let @L(~m,~n) denote the set of all linear maps &phi.#: V_{~m} &hdash.&rightarrow. V_{~n}. _ If &phi., &psi. &in. @L(~m,~n) define

  • Addition: _ (&phi. + &psi.)#{~u} _ = _ &phi.#{~u} + &psi.#{~u} , _ &forall. #{~u} &in. V_{~m}.
  • Scalar multiplication: _ (λ&phi.)#{~u} _ = _ λ(&phi.#{~u}) _ [ = _ &phi.(λ#{~u})] , _ &forall. λ &in. F, _ #{~u} &in. V_{~m}.

With these operations @L(~m,~n) is a vector space over F.
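The two operations can be written down directly; in this illustrative Python sketch linear maps are represented as ordinary functions (the helper names addmap and scalemap are ours, purely for demonstration).

    import numpy as np

    def addmap(phi, psi):
        # (phi + psi)u = phi(u) + psi(u), defined pointwise
        return lambda u: phi(u) + psi(u)

    def scalemap(lam, phi):
        # (lam phi)u = lam (phi(u))
        return lambda u: lam * phi(u)

    phi = lambda u: np.array([u[0] + u[1], u[0]])       # a linear map R^2 -> R^2
    psi = lambda u: np.array([2 * u[0], u[0] - u[1]])   # another one
    u = np.array([1.0, 2.0])

    print(addmap(phi, psi)(u))    # [5. 0.]
    print(scalemap(3.0, phi)(u))  # [9. 3.]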

Kernel and Image

If &phi. &in. @L(~m,~n) define

  • The #{~{kernel}} of &phi., _ ker &phi. = \{ #{~u} &in. V_{~m} &vdash. &phi.#{~u} = #0 &in. V_{~n} \}
  • The #{~{image}} of &phi., _ im &phi. = \{ #{~v} &in. V_{~n} &vdash. &exist. #{~u} &in. V_{~m} such that _ &phi.#{~u} = #{~v} \}

ker &phi. is a subspace of V_{~m} , _ im &phi. is a subspace of V_{~n} .

We have:

  • ker &phi. = \{#0\} _ &iff. _ &phi. is injective (one-to-one)
  • im &phi. = V_{~n} _ &iff. _ &phi. is surjective (onto)
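In coordinates both subspaces can be computed numerically; a minimal sketch using NumPy/SciPy (the example matrix is ours, chosen to be singular so that the kernel is non-trivial).

    import numpy as np
    from scipy.linalg import null_space, orth

    # phi: R^3 -> R^3 given by a deliberately singular matrix.
    A = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0],
                  [1.0, 0.0, 1.0]])

    K = null_space(A)      # columns form a basis of ker phi
    R = orth(A)            # columns form a basis of im phi

    print(K.shape[1])      # dim ker phi = 1, so phi is not injective
    print(R.shape[1])      # dim im phi = 2 < 3, so phi is not surjective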

#{Lemma}: _ dim(ker &phi.) + dim(im &phi.) = dim(V_{~m}) = ~m
Proof:
Let dim(ker &phi.) = ~k. Choose a basis #{~u}_1 ... #{~u}_{~k} for ker &phi., and extend it to a basis #{~u}_1 ... #{~u}_{~k}, #{~u}_{~k+1} ... #{~u}_{~m} for V_{~m}.
If #{~v} &in. im &phi. then #{~v} = &phi.#{~u} for some #{~u} &in. V_{~m}; writing #{~u} = sum{λ_{~i}#{~u}_{~i},1,~m} we have

#{~v} _ = _ &phi.#{~u} _ _ = _ _ sum{λ_{~i}&phi.#{~u}_{~i},1,~m}

_ _ = _ sum{λ_{~i}&phi.#{~u}_{~i},1,~k} + sum{λ_{~i}&phi.#{~u}_{~i},~k+1,~m} _ = _ sum{λ_{~i}&phi.#{~u}_{~i},~k+1,~m}

since &phi.#{~u}_{~i} = #0 (~i = 1 ... ~k). So \{&phi.#{~u}_{~i}\}_{~i = ~k+1 ... ~m} span im &phi.. They are also linearly independent, for suppose otherwise, i.e.

sum{&alpha._{~i}&phi.#{~u}_{~i},~k+1,~m} _ = _ #0 , _ _ _ _ _ _ _ _ &alpha._{~i} not all zero

then:

sum{&alpha._{~i}#{~u}_{~i},~k+1,~m} _ _ _ _ &in. _ _ _ _ ker &phi.

so it can be written as a linear combination of #{~u}_1 ... #{~u}_{~k}, contradicting the linear independence of the basis #{~u}_1 ... #{~u}_{~m}. So \{&phi.#{~u}_{~i}\}_{~i = ~k+1 ... ~m} is a basis for im &phi. _ &imply. _ dim(im &phi.) = ~m - ~k, _ and so _ dim(ker &phi.) + dim(im &phi.) = ~k + (~m - ~k) = ~m .&square.
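The lemma (the rank-nullity theorem) is easy to verify numerically for any concrete matrix; a minimal sketch, again with an arbitrary example of ours.

    import numpy as np
    from scipy.linalg import null_space

    A = np.array([[1.0, 2.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0, 0.0]])   # phi: R^4 -> R^2

    m = A.shape[1]                          # dimension of the domain
    dim_ker = null_space(A).shape[1]        # k
    dim_im = np.linalg.matrix_rank(A)       # m - k
    assert dim_ker + dim_im == m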

Associated Matrices

Let &phi.#: V_{~m} &hdash.&rightarrow. V_{~n} _ be a linear map, \{#{~u}_{~j} \}_{~j = 1 ... ~m} a basis for V_{~m} , _ and \{#{~v}_{~i}\}_{~i = 1 ... ~n} a basis for V_{~n} .
As &phi.#{~u}_{~j} &in. V_{~n} , _ &exist. coordinates ~a_{~i ~j} such that _ &phi.#{~u}_{~j} = &sum._{~i} ~a_{~i ~j} #{~v}_{~i} _ (note order of indices).

This defines an ~n×~m matrix:

A _ = _ matrix{ ~a_{1 1}, ... ,~a_{1 ~m}/:, ... ,:/~a_{~n 1}, ... ,~a_{~n ~m}}

For any element #{~u} = &sum._{~j} &eta._{~j} #{~u}_{~j} of V_{~m} , _ write _ &phi.#{~u} = &sum._{~i} &mu._{~i}#{~v}_{~i} _ say.
But _ &phi.#{~u} _ = _ &phi.&sum._{~j} &eta._{~j} #{~u}_{~j} _ = _ &sum._{~j} &eta._{~j} &phi.#{~u}_{~j} _ = _ &sum._{~j} &eta._{~j} &sum._{~i} ~a_{~i ~j} #{~v}_{~i} _ = _ &sum._{~i} (&sum._{~j} ~a_{~i ~j} &eta._{~j} )#{~v}_{~i} .
So &mu._{~i} _ = _ &sum._{~j} ~a_{~i ~j} &eta._{~j} _ i.e. the coordinates of &phi.#{~u} are obtained by multiplying the column vector of the coordinates of #{~u} by the associated matrix:

matrix{&mu._1/:/&mu._{~n}} _ = _ matrix{~a_{1 1}, ... ,~a_{1 ~m}/., ... ,./~a_{~n 1}, ... ,~a_{~n ~m}} matrix{&eta._1/:/&eta._{~m}}
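So applying &phi. in coordinates is an ordinary matrix-vector product; a quick check in Python (matrix and coordinates arbitrary).

    import numpy as np

    A = np.array([[1.0, 0.0, 2.0],
                  [3.0, 1.0, 0.0]])     # n x m associated matrix (n = 2, m = 3)
    eta = np.array([1.0, -1.0, 2.0])    # coordinates of u w.r.t. {u_j}

    mu = A @ eta                        # coordinates of phi(u) w.r.t. {v_i}
    print(mu)                           # [5. 2.]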

Given any ~n×~m matrix we can define a linear map by this operation on the coordinates (with respect to given bases), and furthermore the matrix associated with this map (with respect to the same bases) will be the original matrix.
So &exist. a bijection: @L(~m,~n) &leftarrow.&rightarrow. @M(~n,~m). _ (~{Relative to a given pair of bases.})

Addition and Scalar Multiplication

Let &phi., &psi. &in. @L(~m,~n) have associated matrices A, B &in. @M(~n,~m), relative to given bases. Then if #{~u} &in. V_{~m}
(&phi. + &psi.)#{~u} _ = _ &phi.#{~u} + &psi.#{~u} _ = _ &sum._{~i}( &sum._{~j} ~a_{~i ~j} &eta._{~j} )#{~v}_{~i} + &sum._{~i}( &sum._{~j} ~b_{~i ~j} &eta._{~j} )#{~v}_{~i} _ = _ &sum._{~i}( &sum._{~j} (~a_{~i ~j} + ~b_{~i ~j} )&eta._{~j} )#{~v}_{~i}
So &phi. + &psi. has associated matrix A + B relative to the given bases.

Similarly λ&phi. has associated matrix λA. _ So the correspondence @L(~m,~n) &leftarrow.&rightarrow. @M(~n,~m) _ is an isomorphism of vector spaces.

We saw that dim @M(~n,~m) = ~m×~n , _ so dim @L(~m,~n) = ~m×~n .
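In coordinates this says (A + B)&eta. = A&eta. + B&eta. and (λA)&eta. = λ(A&eta.); a quick numerical check with random matrices (an illustration only).

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 2))     # associated matrix of phi
    B = rng.standard_normal((3, 2))     # associated matrix of psi
    eta = rng.standard_normal(2)        # coordinates of some u
    lam = 2.5

    assert np.allclose((A + B) @ eta, A @ eta + B @ eta)
    assert np.allclose((lam * A) @ eta, lam * (A @ eta))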

Composition

Now consider

V_{~m} zDgrmRight{&phi., A} V_{~n} zDgrmRight{&psi., B} V_{~p}
\{#{~u}_{~i}\} _ \{#{~v}_{~j} \} _ \{#{~w}_{~k}\}


with the associated matrices A and B as shown relative to bases \{#{~u}_{~i}\}, _ \{#{~v}_{~j} \}, _ \{#{~w}_{~k}\}, _ respectively. [Note order of indices, ~i ~j]

The #{~{composition}} of &phi. and &psi. , _ written &psi. &comp. &phi. is a linear map (&psi. &comp. &phi.)#: V_{~m} &hdash.&rightarrow. V_{~p} _ such that (&psi. &comp. &phi.)#{~u} = &psi.(&phi.#{~u}) .

Now _ _ (&psi. &comp. &phi.)#{~u}_{~i} _ = _ &psi.( &phi.#{~u}_{~i}) _ = _ &psi.( &sum._{~j} ~a_{~j~i}#{~v}_{~j} ) _ = _ &sum._{~j} ~a_{~j~i} (&psi.#{~v}_{~j} )

_ _ _ = _ &sum._{~j} ~a_{~j~i} (&sum._{~k} ~b_{~k ~j} #{~w}_{~k}) _ = _ &sum._{~k} (&sum._{~j} ~b_{~k ~j} ~a_{~j~i} )#{~w}_{~k}

So (&psi. &comp. &phi.) has associated matrix C, where C is ~p×~m, and ~c_{~k~i} = &sum._{~j} ~b_{~k ~j} ~a_{~j~i} .

But this is just the matrix product BA. So the matrix associated with the composite of linear maps is just the product of the matrices associated with the individual linear maps. [Note order of multiplication].

V_{~m} zDgrmRight{&psi. &comp. &phi., BA} V_{~p}
\{#{~u}_{~i}\} _ \{#{~w}_{~k}\}
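A numerical check of the composition rule, with arbitrary random matrices standing in for A and B.

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((3, 2))   # phi: V_2 -> V_3
    B = rng.standard_normal((4, 3))   # psi: V_3 -> V_4
    u = rng.standard_normal(2)        # coordinates of some u in V_2

    # applying phi then psi agrees with the single matrix BA
    assert np.allclose(B @ (A @ u), (B @ A) @ u)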

Change of Basis

Consider the identity map _ ~i_{~n}#: V_{~n} &hdash.&rightarrow. V_{~n} . _ This has associated matrix I_{~n} (the identity matrix) with respect to any basis, provided that the same basis is used in both the domain and the co-domain of ~i_{~n}.

Suppose we have two different bases \{#{~u}_{~i}\} and \{#{~u'}_{~i}\} for V_{~n}, and that ~i_{~n} has associated matrix C with respect to \{#{~u}_{~i}\} and \{#{~u'}_{~i}\} and associated matrix D with respect to \{#{~u'}_{~i}\} and \{#{~u}_{~i}\} . _ The following diagram illustrates the situation:

V_{~n}
\{#{~u}\}
zDgrmRight{,C} V_{~n}
\{#{~u'}\}
zDgrmVtLine{I_{~n},} _ zDgrmVtLine{,I_{~n}}
V_{~n}
\{#{~u}\}
zDgrmLeft{,D} V_{~n}
\{#{~u'}\}
  All lines represent the identity map ~i_{~n} with associated matrices as shown with respect to the bases indicated.

It is clear from the diagram that I_{~n} = DC and I_{~n} = CD, so C is non-singular with C^{-1} = D.
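Concretely, if the vectors of each basis are stored as the columns of a matrix (in some fixed background coordinate system), then C = U'^{-1}U and D = U^{-1}U'; this construction is our own illustration of the diagram, not part of the text above.

    import numpy as np

    rng = np.random.default_rng(2)
    # Columns of U and Up are two (random, hence almost surely valid)
    # bases of R^3, expressed in standard coordinates.
    U = rng.standard_normal((3, 3))
    Up = rng.standard_normal((3, 3))

    C = np.linalg.solve(Up, U)   # matrix of i_n w.r.t. {u} (domain), {u'} (codomain)
    D = np.linalg.solve(U, Up)   # matrix of i_n w.r.t. {u'}, {u}

    assert np.allclose(C @ D, np.eye(3))   # CD = I_n
    assert np.allclose(D @ C, np.eye(3))   # DC = I_n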

Consider the linear map _ &phi.#: V_{~n} &hdash.&rightarrow. V_{~m} , _ with associated matrices A relative to bases \{#{~u}_{~i}\}, \{#{~v}_{~i}\} respectively, _ and B relative to bases \{#{~u'}_{~i}\}, \{#{~v'}_{~i}\} respectively.

V_{~n}
\{#{~u'}\}
zDgrmRight{i_{~n},N} V_{~n}
\{#{~u}\}
zDgrmDown{&phi.,B}   zDgrmDown{&phi.,A}
V_{~m}
\{#{~v'}\}
zDgrmLeft{i_{~m},M} V_{~m}
\{#{~v}\}
  From the above it can be seen that &exist. regular matrices M (~m×~m) and N (~n×~n) corresponding to the respective identity maps, and from the composition rule it is clear that B = MAN (see diagram).
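A sketch verifying B = MAN, with &phi. given by a matrix A0 in standard coordinates and the bases stored as columns, as in the previous sketch (all choices random and purely illustrative).

    import numpy as np

    rng = np.random.default_rng(3)
    n, m = 3, 2
    A0 = rng.standard_normal((m, n))    # phi: R^n -> R^m in standard coordinates

    U, Up = rng.standard_normal((n, n)), rng.standard_normal((n, n))  # bases {u}, {u'}
    V, Vp = rng.standard_normal((m, m)), rng.standard_normal((m, m))  # bases {v}, {v'}

    A = np.linalg.solve(V, A0 @ U)      # matrix of phi w.r.t. {u}, {v}
    B = np.linalg.solve(Vp, A0 @ Up)    # matrix of phi w.r.t. {u'}, {v'}
    N = np.linalg.solve(U, Up)          # matrix of i_n w.r.t. {u'}, {u}
    M = np.linalg.solve(Vp, V)          # matrix of i_m w.r.t. {v}, {v'}

    assert np.allclose(B, M @ A @ N)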

Equivalence of Matrices

Two ~m×~n matrices A and B are said to be #{~{equivalent}} if &exist. a regular ~m×~m matrix M and a regular ~n×~n matrix N such that B = MAN; we write B &tilde. A. _ This is an equivalence relation on @M(~m,~n) (a numerical sketch follows the list):

  1. A &tilde. A _ (A = I_{~m} A I_{~n})
  2. B &tilde. A &imply. A &tilde. B _ (B=MAN &imply. M^{&minus.1}BN^{&minus.1}=M^{&minus.1}MANN^{&minus.1}=A)
  3. C &tilde. B and B &tilde. A &imply. C &tilde. A _ (C=SBT and B=MAN &imply. C=(SM)A(NT), _ SM and NT regular (SM)(M^{&minus.1}S^{&minus.1}) = I_{~m} etc.)
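A numerical sketch of properties 2 and 3, with random matrices standing in for the regular factors (random square matrices are almost surely regular).

    import numpy as np

    rng = np.random.default_rng(4)
    m, n = 2, 3
    A = rng.standard_normal((m, n))
    M = rng.standard_normal((m, m))
    N = rng.standard_normal((n, n))
    B = M @ A @ N                       # B ~ A by construction

    # symmetry: A = M^{-1} B N^{-1}
    assert np.allclose(A, np.linalg.solve(M, B) @ np.linalg.inv(N))

    # transitivity: C = S B T and B = M A N give C = (SM) A (NT)
    S = rng.standard_normal((m, m))
    T = rng.standard_normal((n, n))
    C = S @ B @ T
    assert np.allclose(C, (S @ M) @ A @ (N @ T))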

We have shown that two matrices are equivalent if they are associated with the same linear map relative to different sets of bases. Is the converse true? I.e. given two equivalent matrices is there a linear map with which they are both associated relative to different bases?

This question will be answered when we study the concept of rank in the next section.