Matrix algebra
Theorem: let $A$, $B$ and $C$ be matrices and $\alpha$ and $\beta$ be scalars. Each of the following statements is valid:

1. $A + B = B + A$,
2. $(A + B) + C = A + (B + C)$,
3. $(AB)C = A(BC)$,
4. $A(B + C) = AB + AC$,
5. $(A + B)C = AC + BC$,
6. $(\alpha \beta) A = \alpha(\beta A)$,
7. $\alpha (AB) = (\alpha A)B = A (\alpha B)$,
8. $(\alpha + \beta)A = \alpha A + \beta A$,
9. $\alpha (A + B) = \alpha A + \alpha B$.
??? note "Proof:"
    Will be added later.
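These algebraic laws can be spot-checked numerically. A minimal NumPy sketch; the matrices and scalars below are arbitrary examples, not taken from the text:

```python
import numpy as np

# Arbitrary example matrices and scalars (any conforming shapes work).
A = np.array([[1., 2.], [3., 4.]])
B = np.array([[0., 1.], [5., -2.]])
C = np.array([[2., 0.], [1., 3.]])
alpha, beta = 2.0, -3.0

assert np.allclose(A + B, B + A)                  # commutativity of addition
assert np.allclose((A + B) + C, A + (B + C))      # associativity of addition
assert np.allclose((A @ B) @ C, A @ (B @ C))      # associativity of multiplication
assert np.allclose(A @ (B + C), A @ B + A @ C)    # left distributivity
assert np.allclose((A + B) @ C, A @ C + B @ C)    # right distributivity
assert np.allclose((alpha * beta) * A, alpha * (beta * A))
assert np.allclose(alpha * (A @ B), (alpha * A) @ B)
assert np.allclose(alpha * (A @ B), A @ (alpha * B))
assert np.allclose((alpha + beta) * A, alpha * A + beta * A)
assert np.allclose(alpha * (A + B), alpha * A + alpha * B)
```

Note that $AB = BA$ is deliberately absent from the list: matrix multiplication is not commutative in general.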
In the case where an $n \times n$ matrix $A$ is multiplied by itself $k$ times, it is convenient to use exponential notation: $AA \cdots A = A^k$.
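For instance, the $k$-th power can be formed by repeated multiplication or, in NumPy, with `np.linalg.matrix_power`; a small sketch with an arbitrary example matrix:

```python
import numpy as np

A = np.array([[1., 1.],
              [0., 1.]])   # arbitrary example matrix
k = 4

# Repeated multiplication A A ... A (k factors).
P = np.eye(2)
for _ in range(k):
    P = P @ A

# Same result via the exponential notation A^k.
assert np.allclose(P, np.linalg.matrix_power(A, k))
```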
Definition: the $n \times n$ identity matrix is the matrix $I = (\delta_{ij})$, where

$$
\delta_{ij} = \begin{cases} 1 &\text{ if } i = j, \\ 0 &\text{ if } i \neq j. \end{cases}
$$
Multiplying an $n \times n$ matrix $A$ by the identity matrix leaves it unchanged: $AI = IA = A$.
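A quick numerical check of $AI = IA = A$, using `np.eye` for the identity; the matrix $A$ here is an arbitrary example:

```python
import numpy as np

n = 3
I = np.eye(n)                       # identity matrix (delta_ij)
A = np.arange(9.0).reshape(n, n)    # arbitrary example matrix

assert np.allclose(A @ I, A)
assert np.allclose(I @ A, A)
```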
Definition: an $n \times n$ matrix $A$ is said to be nonsingular or invertible if there exists a matrix $A^{-1}$ such that $AA^{-1} = A^{-1}A = I$. The matrix $A^{-1}$ is said to be a multiplicative inverse of $A$.
If $B$ and $C$ are both multiplicative inverses of $A$, then

$$
B = BI = B(AC) = (BA)C = IC = C,
$$

thus a matrix can have at most one multiplicative inverse.
Definition: an $n \times n$ matrix is said to be singular if it does not have a multiplicative inverse. Equivalently, an $n \times n$ matrix $A$ is singular if $A \mathbf{x} = \mathbf{0}$ for some nonzero $\mathbf{x} \in \mathbb{R}^n$. For a nonsingular matrix $A$, $\mathbf{x} = \mathbf{0}$ is the only solution to $A \mathbf{x} = \mathbf{0}$.
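As a concrete illustration of the definition: a matrix with linearly dependent rows is singular, a nonzero null vector witnesses this, and attempting to invert it fails. The example values are assumed for illustration:

```python
import numpy as np

# Singular example: the second row is twice the first.
A = np.array([[1., 2.],
              [2., 4.]])
x = np.array([2., -1.])    # nonzero vector with A x = 0

assert np.any(x != 0)
assert np.allclose(A @ x, np.zeros(2))

# A has no multiplicative inverse: inversion raises LinAlgError.
try:
    np.linalg.inv(A)
    raise AssertionError("expected A to be singular")
except np.linalg.LinAlgError:
    pass
```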
Theorem: if $A$ and $B$ are nonsingular $n \times n$ matrices, then $AB$ is also nonsingular and

$$
(AB)^{-1} = B^{-1} A^{-1}.
$$
??? note "Proof:"
    Let $A$ and $B$ be nonsingular $n \times n$ matrices. The candidate inverse $B^{-1} A^{-1}$ satisfies

    $$
    (B^{-1} A^{-1})AB = B^{-1} (A^{-1} A) B = B^{-1} B = I, \\
    AB(B^{-1} A^{-1}) = A (B B^{-1}) A^{-1} = A A^{-1} = I,
    $$

    so $AB$ is nonsingular and $(AB)^{-1} = B^{-1} A^{-1}$.
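The order reversal matters in practice; a small NumPy check with arbitrary nonsingular example matrices:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 1.]])    # det = 1, nonsingular
B = np.array([[1., 3.],
              [0., 1.]])    # det = 1, nonsingular

lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)
assert np.allclose(lhs, rhs)    # (AB)^{-1} = B^{-1} A^{-1}

# The un-reversed product is generally NOT the inverse of AB.
wrong = np.linalg.inv(A) @ np.linalg.inv(B)
assert not np.allclose(wrong, lhs)
```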
Theorem: let $A$ be a nonsingular $n \times n$ matrix; then its inverse $A^{-1}$ is also nonsingular.
??? note "Proof:"
    Let $A$ be a nonsingular $n \times n$ matrix, $A^{-1}$ its inverse and $\mathbf{x} \in \mathbb{R}^n$ a vector. Suppose $A^{-1} \mathbf{x} = \mathbf{0}$; then

    $$
    \mathbf{x} = I \mathbf{x} = (A A^{-1}) \mathbf{x} = A(A^{-1} \mathbf{x}) = \mathbf{0},
    $$

    so $\mathbf{x} = \mathbf{0}$ is the only solution of $A^{-1} \mathbf{x} = \mathbf{0}$ and $A^{-1}$ is nonsingular.
Theorem: let $A$ be a nonsingular $n \times n$ matrix; then the system $A\mathbf{x} = \mathbf{b}$ with $\mathbf{x}, \mathbf{b} \in \mathbb{R}^n$ has the solution $\mathbf{x} = A^{-1} \mathbf{b}$.
??? note "Proof:"
    Let $A$ be a nonsingular $n \times n$ matrix, $A^{-1}$ its inverse and $\mathbf{x}, \mathbf{b} \in \mathbb{R}^n$ vectors. If $\mathbf{x} = A^{-1} \mathbf{b}$, then

    $$
    A \mathbf{x} = A (A^{-1} \mathbf{b}) = (A A^{-1}) \mathbf{b} = \mathbf{b},
    $$

    so $A^{-1} \mathbf{b}$ is a solution. Conversely, if $A \mathbf{x} = \mathbf{b}$, then $\mathbf{x} = (A^{-1} A) \mathbf{x} = A^{-1} \mathbf{b}$, so the solution is unique.
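Numerically, $\mathbf{x} = A^{-1}\mathbf{b}$ is usually computed by solving the system directly rather than forming the inverse; both routes agree for a nonsingular $A$. The values below are an assumed example:

```python
import numpy as np

A = np.array([[3., 1.],
              [1., 2.]])    # nonsingular example (det = 5)
b = np.array([9., 8.])

x_inv = np.linalg.inv(A) @ b      # x = A^{-1} b as in the theorem
x_solve = np.linalg.solve(A, b)   # preferred: solve without forming A^{-1}

assert np.allclose(x_inv, x_solve)
assert np.allclose(A @ x_solve, b)   # it indeed solves the system
```

`np.linalg.solve` is generally faster and more numerically stable than explicitly inverting $A$ and multiplying.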
Corollary: the system $A \mathbf{x} = \mathbf{b}$ of $n$ linear equations in $n$ unknowns has a unique solution if and only if $A$ is nonsingular.
??? note "Proof:"
    If $A$ is nonsingular, the above theorem yields the unique solution $\mathbf{x} = A^{-1} \mathbf{b}$. If $A$ is singular, then $A \mathbf{z} = \mathbf{0}$ for some $\mathbf{z} \neq \mathbf{0}$, so any solution $\mathbf{x}$ gives a second solution $\mathbf{x} + \mathbf{z}$; hence the system cannot have a unique solution.
Theorem: let $A$ and $B$ be matrices and $\alpha$ be a scalar. Each of the following statements is valid:

1. $(A^T)^T = A$,
2. $(\alpha A)^T = \alpha A^T$,
3. $(A + B)^T = A^T + B^T$,
4. $(AB)^T = B^T A^T$.
??? note "Proof:"
    Will be added later.
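The transpose rules can likewise be spot-checked; the rectangular shapes below make the order reversal in $(AB)^T = B^T A^T$ visible. The matrices are assumed examples:

```python
import numpy as np

A = np.array([[1., 2., 0.],
              [3., -1., 4.]])   # 2x3 example
B = np.array([[0., 5., 1.],
              [2., 2., -3.]])   # 2x3 example, same shape as A
C = np.array([[1., 0.],
              [2., 1.],
              [0., 3.]])        # 3x2 example, so A C is defined
alpha = 2.5

assert np.allclose(A.T.T, A)
assert np.allclose((alpha * A).T, alpha * A.T)
assert np.allclose((A + B).T, A.T + B.T)
assert np.allclose((A @ C).T, C.T @ A.T)   # note the reversed order
```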