# Linear transformations

## Definition
Definition: let $V$ and $W$ be vector spaces. A mapping $L: V \to W$ is called a linear transformation or linear map if

$$
L(\lambda \mathbf{v}_1 + \mu \mathbf{v}_2) = \lambda L(\mathbf{v}_1) + \mu L(\mathbf{v}_2),
$$

for all $\mathbf{v}_{1,2} \in V$ and $\lambda, \mu \in \mathbb{K}$.
A linear transformation may also be called a vector space homomorphism. If the linear transformation is a bijection then it may be called a linear isomorphism.
In the case that the vector spaces $V$ and $W$ are the same, $V = W$, a linear transformation $L: V \to V$ will be referred to as a linear operator on $V$ or a linear endomorphism.
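As an illustrative example of the definition, taking $\mathbb{K} = \mathbb{R}$, consider the mapping $L: \mathbb{R}^2 \to \mathbb{R}^2$ defined by $L(x_1, x_2)^T = (x_1 + x_2, \, 2x_1)^T$. Writing $\mathbf{v} = (v_1, v_2)^T$ and $\mathbf{u} = (u_1, u_2)^T$,

$$
L(\lambda \mathbf{v} + \mu \mathbf{u}) = \begin{pmatrix} (\lambda v_1 + \mu u_1) + (\lambda v_2 + \mu u_2) \\ 2(\lambda v_1 + \mu u_1) \end{pmatrix} = \lambda \begin{pmatrix} v_1 + v_2 \\ 2 v_1 \end{pmatrix} + \mu \begin{pmatrix} u_1 + u_2 \\ 2 u_1 \end{pmatrix} = \lambda L(\mathbf{v}) + \mu L(\mathbf{u}),
$$

so $L$ is a linear operator on $\mathbb{R}^2$. By contrast, the mapping $M(x_1, x_2)^T = (x_1^2, x_2)^T$ is not linear, since $M(2\mathbf{e}_1) = 4\mathbf{e}_1 \neq 2M(\mathbf{e}_1)$.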
## The image and kernel
Let $L: V \to W$ be a linear transformation from a vector space $V$ to a vector space $W$. In this section the effect that $L$ has on subspaces of $V$ is considered. Of particular importance is the set of vectors in $V$ that get mapped to the zero vector of $W$.
Definition: let $L: V \to W$ be a linear transformation. The kernel of $L$, denoted by $\ker(L)$, is defined by

$$
\ker(L) = \{\mathbf{v} \in V \;|\; L(\mathbf{v}) = \mathbf{0}\}.
$$

The kernel is therefore the set consisting of all vectors in $V$ that get mapped to the zero vector of $W$.
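As an example, consider the linear transformation $L: \mathbb{R}^3 \to \mathbb{R}^2$ defined by $L(\mathbf{x}) = (x_1 - x_2, \, x_2 - x_3)^T$. A vector $\mathbf{x}$ belongs to $\ker(L)$ precisely when $x_1 = x_2$ and $x_2 = x_3$, so that

$$
\ker(L) = \operatorname{span}\{(1, 1, 1)^T\},
$$

a one-dimensional subspace of $\mathbb{R}^3$.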
Definition: let $L: V \to W$ be a linear transformation and let $S$ be a subspace of $V$. The image of $S$, denoted by $L(S)$, is defined by

$$
L(S) = \{\mathbf{w} \in W \;|\; \mathbf{w} = L(\mathbf{v}) \text{ for some } \mathbf{v} \in S\}.
$$

The image of the entire vector space, $L(V)$, is called the range of $L$.
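Continuing the example above with $L(\mathbf{x}) = (x_1 - x_2, \, x_2 - x_3)^T$: for the subspace $S = \operatorname{span}\{\mathbf{e}_1\} \subset \mathbb{R}^3$ the image is $L(S) = \operatorname{span}\{(1, 0)^T\}$, while the range is the whole codomain, $L(\mathbb{R}^3) = \mathbb{R}^2$, since $L(\mathbf{e}_1) = (1, 0)^T$ and $L(-\mathbf{e}_3) = (0, 1)^T$ already span $\mathbb{R}^2$.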
With these definitions in place, the following theorem may be stated.
Theorem: if $L: V \to W$ is a linear transformation and $S$ is a subspace of $V$, then

1. $\ker(L)$ is a subspace of $V$,
2. $L(S)$ is a subspace of $W$.
??? note "Proof:"
Let $L: V \to W$ be a linear transformation and let $S$ be a subspace of $V$.
To prove 1, note first that $L(\mathbf{0}) = \mathbf{0}$, so $\ker(L)$ is nonempty. Now let $\mathbf{v}_{1,2} \in \ker(L)$ and let $\lambda, \mu \in \mathbb{K}$. Then
$$
L(\lambda \mathbf{v}_1 + \mu \mathbf{v}_2) = \lambda L(\mathbf{v}_1) + \mu L(\mathbf{v}_2) = \lambda \mathbf{0} + \mu \mathbf{0} = \mathbf{0},
$$
therefore $\lambda \mathbf{v}_1 + \mu \mathbf{v}_2 \in \ker(L)$ and hence $\ker(L)$ is a subspace of $V$.
To prove 2, let $\mathbf{w}_{1,2} \in L(S)$; then there exist $\mathbf{v}_{1,2} \in S$ such that $\mathbf{w}_{1,2} = L(\mathbf{v}_{1,2})$. For any $\lambda, \mu \in \mathbb{K}$ we have
$$
\lambda \mathbf{w}_1 + \mu \mathbf{w}_2 = \lambda L(\mathbf{v}_1) + \mu L(\mathbf{v}_2) = L(\lambda \mathbf{v}_1 + \mu \mathbf{v}_2),
$$
since $\lambda \mathbf{v}_1 + \mu \mathbf{v}_2 \in S$, it follows that $\lambda \mathbf{w}_1 + \mu \mathbf{w}_2 \in L(S)$ and hence $L(S)$ is a subspace of $W$.
## Matrix representations
Theorem: let $L: \mathbb{R}^n \to \mathbb{R}^m$ be a linear transformation. Then there is an $m \times n$ matrix $A$ such that

$$
L(\mathbf{x}) = A \mathbf{x},
$$

for all $\mathbf{x} \in \mathbb{R}^n$, with the $i$th column vector of $A$ given by

$$
\mathbf{a}_i = L(\mathbf{e}_i),
$$

for the standard basis $\{\mathbf{e}_1, \dots, \mathbf{e}_n\} \subset \mathbb{R}^n$ and $i \in \{1, \dots, n\}$.
??? note "Proof:"
For $i \in \{1, \dots, n\}$, define
$$
\mathbf{a}_i = L(\mathbf{e}_i),
$$
and let
$$
A = (\mathbf{a}_1, \dots, \mathbf{a}_n).
$$
If $\mathbf{x} = x_1 \mathbf{e}_1 + \dots + x_n \mathbf{e}_n$ is an arbitrary element of $\mathbb{R}^n$, then
$$
\begin{align*}
L(\mathbf{x}) &= x_1 L(\mathbf{e}_1) + \dots + x_n L(\mathbf{e}_n), \\
&= x_1 \mathbf{a}_1 + \dots + x_n \mathbf{a}_n, \\
&= A \mathbf{x}.
\end{align*}
$$
It has therefore been established that each linear transformation from $\mathbb{R}^n$ to $\mathbb{R}^m$ can be represented in terms of an $m \times n$ matrix.
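As an illustration, take the linear operator $L: \mathbb{R}^2 \to \mathbb{R}^2$, $L(x_1, x_2)^T = (x_1 + x_2, \, 2x_1)^T$, considered earlier. With respect to the standard basis, $L(\mathbf{e}_1) = (1, 2)^T$ and $L(\mathbf{e}_2) = (1, 0)^T$, so the representing matrix is

$$
A = \begin{pmatrix} 1 & 1 \\ 2 & 0 \end{pmatrix},
$$

and indeed $A\mathbf{x} = (x_1 + x_2, \, 2x_1)^T = L(\mathbf{x})$ for all $\mathbf{x} \in \mathbb{R}^2$.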
Theorem: let $E = \{\mathbf{e}_1, \dots, \mathbf{e}_n\}$ and $F = \{\mathbf{f}_1, \dots, \mathbf{f}_n\}$ be two ordered bases for a vector space $V$ with $\dim V = n \in \mathbb{N}$, and let $L: V \to V$ be a linear operator on $V$. Let $S$ be the $n \times n$ transition matrix representing the change of basis from $F$ to $E$, that is, the matrix whose $j$th column consists of the coordinates of $\mathbf{f}_j$ with respect to $E$,

$$
\mathbf{f}_j = \sum_{i=1}^{n} s_{ij} \mathbf{e}_i,
$$

for $j \in \{1, \dots, n\}$. If $A$ is the matrix representing $L$ with respect to $E$, and $B$ is the matrix representing $L$ with respect to $F$, then

$$
B = S^{-1} A S.
$$
??? note "Proof:"
Will be added later.
Definition: let $A$ and $B$ be $n \times n$ matrices. $B$ is said to be similar to $A$ if there exists a nonsingular matrix $S$ such that $B = S^{-1} A S$.
It follows from the above theorem that if $A$ and $B$ are $n \times n$ matrices representing the same linear operator $L$ with respect to two ordered bases, then $A$ and $B$ are similar.
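As a worked illustration, take again $L: \mathbb{R}^2 \to \mathbb{R}^2$, $L(x_1, x_2)^T = (x_1 + x_2, \, 2x_1)^T$, with $E$ the standard basis and $F = \{\mathbf{f}_1, \mathbf{f}_2\}$, $\mathbf{f}_1 = (1, 1)^T$, $\mathbf{f}_2 = (1, -1)^T$. The transition matrix from $F$ to $E$ and the matrix of $L$ with respect to $E$ are

$$
S = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}, \qquad A = \begin{pmatrix} 1 & 1 \\ 2 & 0 \end{pmatrix},
$$

so that

$$
B = S^{-1} A S = \frac{1}{2}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}\begin{pmatrix} 1 & 1 \\ 2 & 0 \end{pmatrix}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} = \begin{pmatrix} 2 & 1 \\ 0 & -1 \end{pmatrix}.
$$

As a check, $L(\mathbf{f}_1) = (2, 2)^T = 2\mathbf{f}_1$ and $L(\mathbf{f}_2) = (0, 2)^T = \mathbf{f}_1 - \mathbf{f}_2$, which are exactly the columns of $B$ expressed in the basis $F$, so $A$ and $B$ are similar matrices representing the same operator.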