Finished section matrices in linear algebra.
This commit is contained in:
parent 4e6cd51169
commit ed1aeb467d
3 changed files with 164 additions and 1 deletion

@@ -84,6 +84,7 @@ nav:
     - 'Matrices':
         - 'Matrix arithmetic': mathematics/linear-algebra/matrices/matrix-arithmetic.md
         - 'Matrix algebra': mathematics/linear-algebra/matrices/matrix-algebra.md
+        - 'Elementary matrices': mathematics/linear-algebra/matrices/elementary-matrices.md
     - 'Calculus':
         - 'Limits': mathematics/calculus/limits.md
         - 'Continuity': mathematics/calculus/continuity.md

mathematics/linear-algebra/matrices/elementary-matrices.md
@@ -0,0 +1,69 @@

# Elementary matrices

> *Definition*: an *elementary* matrix is a matrix obtained from the identity matrix $I$ by performing exactly one elementary row operation.
>
> 1. An elementary matrix of type 1, $E_1$, is obtained by interchanging two rows of $I$.
> 2. An elementary matrix of type 2, $E_2$, is obtained by multiplying a row of $I$ by a nonzero constant.
> 3. An elementary matrix of type 3, $E_3$, is obtained from $I$ by adding a multiple of one row to another row.

For example, elementary matrices of the three types are given by

$$
E_1 = \begin{pmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1\end{pmatrix}, \qquad E_2 = \begin{pmatrix} 1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 3\end{pmatrix}, \qquad E_3 = \begin{pmatrix}1 & 0 & 3\\ 0 & 1 & 0\\ 0 & 0 & 1\end{pmatrix}.
$$

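As an illustration (a short NumPy sketch, not part of the original notes), left-multiplying an arbitrary matrix $A$ by each of these elementary matrices applies the corresponding row operation to $A$:

```python
import numpy as np

# The three example elementary matrices from above.
E1 = np.array([[0, 1, 0],
               [1, 0, 0],
               [0, 0, 1]])  # type 1: interchange rows 1 and 2 of I
E2 = np.array([[1, 0, 0],
               [0, 1, 0],
               [0, 0, 3]])  # type 2: multiply row 3 of I by 3
E3 = np.array([[1, 0, 3],
               [0, 1, 0],
               [0, 0, 1]])  # type 3: add 3 times row 3 to row 1 of I

A = np.arange(1, 10).reshape(3, 3)  # an arbitrary 3x3 matrix

print(E1 @ A)  # rows 1 and 2 of A interchanged
print(E2 @ A)  # row 3 of A multiplied by 3
print(E3 @ A)  # 3 times row 3 of A added to row 1 of A
```
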
> *Theorem*: if $E$ is an elementary matrix, then $E$ is nonsingular and $E^{-1}$ is an elementary matrix of the same type.

??? note "*Proof*:"

    If $E$ is the elementary matrix of type 1 formed from $I$ by interchanging the $i$th and $j$th rows, then $E$ can be transformed back into $I$ by interchanging these same rows again. Therefore, $EE = I$ and hence $E$ is its own inverse.

    If $E$ is the elementary matrix of type 2 formed by multiplying the $i$th row of $I$ by a nonzero scalar $\alpha$, then $E$ can be transformed into the identity matrix by multiplying either its $i$th row or its $i$th column by $1/\alpha$.

    If $E$ is the elementary matrix of type 3 formed from $I$ by adding $m$ times the $i$th row to the $j$th row, then $E$ can be transformed back into $I$ either by subtracting $m$ times the $i$th row from the $j$th row or by subtracting $m$ times the $j$th column from the $i$th column.

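Continuing the numerical sketch from above (the example matrices are assumptions), each inverse is again elementary and of the same type:

```python
import numpy as np

E1 = np.array([[0., 1, 0], [1, 0, 0], [0, 0, 1]])  # type 1: rows 1 and 2 interchanged
E2 = np.array([[1., 0, 0], [0, 1, 0], [0, 0, 3]])  # type 2: row 3 scaled by 3
E3 = np.array([[1., 0, 3], [0, 1, 0], [0, 0, 1]])  # type 3: 3 times row 3 added to row 1

print(np.linalg.inv(E1))  # equals E1: interchanging the same rows undoes the swap
print(np.linalg.inv(E2))  # type 2 again, with the scalar 3 replaced by 1/3
print(np.linalg.inv(E3))  # type 3 again, with the multiple 3 replaced by -3
```
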
> *Definition*: a matrix $B$ is **row equivalent** to a matrix $A$ if there exists a finite sequence $E_1, E_2, \dots, E_k$ of elementary matrices with $k \in \mathbb{N}$ such that
>
> $$
> B = E_k E_{k-1} \cdots E_1 A.
> $$

It may be observed that row equivalence is a reflexive, symmetric and transitive relation, that is, an equivalence relation.

> *Theorem*: let $A$ be an $n \times n$ matrix, then the following statements are equivalent:
>
> 1. $A$ is nonsingular,
> 2. $A\mathbf{x} = \mathbf{0}$ has only the trivial solution $\mathbf{0}$,
> 3. $A$ is row equivalent to $I$.

??? note "*Proof*:"

    Let $A$ be a nonsingular $n \times n$ matrix and let $\mathbf{\hat x}$ be a solution of $A \mathbf{x} = \mathbf{0}$, then

    $$
    \mathbf{\hat x} = I \mathbf{\hat x} = (A^{-1} A)\mathbf{\hat x} = A^{-1} (A \mathbf{\hat x}) = A^{-1} \mathbf{0} = \mathbf{0},
    $$

    so statement 1 implies statement 2.

    Now suppose $A \mathbf{x} = \mathbf{0}$ has only the trivial solution and let $U$ be the row echelon form of $A$. If one of the diagonal elements of $U$ were 0, the last row of $U$ would consist entirely of zeros. But then $A \mathbf{x} = \mathbf{0}$ would have a nontrivial solution. Thus $U$ must be a strictly triangular matrix with diagonal elements all equal to 1. It then follows that $I$ is the reduced row echelon form of $A$ and hence $A$ is row equivalent to $I$; statement 2 implies statement 3.

    Finally, if $A$ is row equivalent to $I$, there exist elementary matrices $E_1, E_2, \dots, E_k$ with $k \in \mathbb{N}$ such that

    $$
    A = E_k E_{k-1} \cdots E_1 I = E_k E_{k-1} \cdots E_1.
    $$

    Since $E_i$ is invertible for $i \in \{1, \dots, k\}$, the product $E_k E_{k-1} \cdots E_1$ is also invertible, hence $A$ is nonsingular and statement 3 implies statement 1.

If $A$ is nonsingular then $A$ is row equivalent to $I$, hence there exist elementary matrices $E_1, \dots, E_k$ such that

$$
E_k E_{k-1} \cdots E_1 A = I.
$$

Multiplying both sides on the right by $A^{-1}$ yields

$$
E_k E_{k-1} \cdots E_1 = A^{-1},
$$

which provides a method for computing $A^{-1}$: the row operations that reduce $A$ to $I$ transform $I$ into $A^{-1}$.

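The following sketch implements this method (the helper name `inverse_by_row_reduction` is hypothetical, and a nonsingular input is assumed): the three types of row operations reduce the augmented matrix $(A \mid I)$ to $(I \mid A^{-1})$.

```python
import numpy as np

def inverse_by_row_reduction(A):
    """Compute A^{-1} by reducing the augmented matrix (A | I) to (I | A^{-1})."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])  # augmented matrix (A | I)
    for i in range(n):
        # Type 1: interchange rows so the pivot is nonzero (partial pivoting).
        p = i + np.argmax(np.abs(M[i:, i]))
        M[[i, p]] = M[[p, i]]
        # Type 2: scale the pivot row so the pivot equals 1.
        M[i] /= M[i, i]
        # Type 3: add multiples of the pivot row to clear the other entries.
        for j in range(n):
            if j != i:
                M[j] -= M[j, i] * M[i]
    return M[:, n:]  # the right half now holds A^{-1}

A = np.array([[2., 1.], [5., 3.]])
print(inverse_by_row_reduction(A))                              # [[ 3. -1.] [-5.  2.]]
print(np.allclose(inverse_by_row_reduction(A) @ A, np.eye(2)))  # True
```
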
mathematics/linear-algebra/matrices/matrix-algebra.md
@@ -1 +1,94 @@

# Matrix algebra

> *Theorem*: let $A, B$ and $C$ be matrices and $\alpha$ and $\beta$ be scalars. Each of the following statements is valid:
>
> 1. $A + B = B + A$,
> 2. $(A + B) + C = A + (B + C)$,
> 3. $(AB)C = A(BC)$,
> 4. $A(B + C) = AB + AC$,
> 5. $(A + B)C = AC + BC$,
> 6. $(\alpha \beta) A = \alpha(\beta A)$,
> 7. $\alpha (AB) = (\alpha A)B = A (\alpha B)$,
> 8. $(\alpha + \beta)A = \alpha A + \beta A$,
> 9. $\alpha (A + B) = \alpha A + \alpha B$.

??? note "*Proof*:"

    Will be added later.

In the case where an $n \times n$ matrix $A$ is multiplied by itself $k$ times it is convenient to use exponential notation: $AA \cdots A = A^k$ with $k$ factors.

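These laws and the power notation can be checked numerically; a minimal NumPy sketch with random matrices (an illustration, not part of the original notes):

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))
alpha = 2.0

print(np.allclose((A @ B) @ C, A @ (B @ C)))                 # law 3: associativity
print(np.allclose(A @ (B + C), A @ B + A @ C))               # law 4: distributivity
print(np.allclose(alpha * (A @ B), (alpha * A) @ B))         # law 7
print(np.allclose(np.linalg.matrix_power(A, 3), A @ A @ A))  # A^3 = AAA
```
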
> *Definition*: the $n \times n$ **identity matrix** is the matrix $I = (\delta_{ij})$, where
>
> $$
> \delta_{ij} = \begin{cases} 1 &\text{ if } i = j, \\ 0 &\text{ if } i \neq j.\end{cases}
> $$

Multiplying an $n \times n$ matrix $A$ by the identity matrix leaves it unchanged: $AI = IA = A$.

> *Definition*: an $n \times n$ matrix $A$ is said to be **nonsingular** or **invertible** if there exists a matrix $A^{-1}$ such that $AA^{-1} = A^{-1}A = I$. The matrix $A^{-1}$ is said to be a **multiplicative inverse** of $A$.

If $B$ and $C$ are both multiplicative inverses of $A$ then

$$
B = BI = B(AC) = (BA)C = IC = C,
$$

thus a matrix can have at most one multiplicative inverse.

> *Definition*: an $n \times n$ matrix is said to be **singular** if it does not have a multiplicative inverse.

Equivalently, an $n \times n$ matrix $A$ is singular if $A \mathbf{x} = \mathbf{0}$ for some $\mathbf{x} \in \mathbb{R}^n \setminus \{\mathbf{0}\}$. For a nonsingular matrix $A$, $\mathbf{x} = \mathbf{0}$ is the only solution of $A \mathbf{x} = \mathbf{0}$.

> *Theorem*: if $A$ and $B$ are nonsingular $n \times n$ matrices, then $AB$ is also nonsingular and
>
> $$
> (AB)^{-1} = B^{-1} A^{-1}.
> $$

??? note "*Proof*:"

    Let $A$ and $B$ be nonsingular $n \times n$ matrices. Then $B^{-1} A^{-1}$ is a multiplicative inverse of $AB$, since

    $$
    (B^{-1} A^{-1})AB = B^{-1} (A^{-1} A) B = B^{-1} B = I, \\
    AB(B^{-1} A^{-1}) = A (B B^{-1}) A^{-1} = A A^{-1} = I.
    $$

    Hence $AB$ is nonsingular and $(AB)^{-1} = B^{-1} A^{-1}$.

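A quick numerical check of this identity (a NumPy sketch; the random matrices are assumptions, nonsingular with probability one):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# (AB)^{-1} agrees with B^{-1} A^{-1}, but not with A^{-1} B^{-1} in general.
print(np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ np.linalg.inv(A)))  # True
print(np.allclose(np.linalg.inv(A @ B), np.linalg.inv(A) @ np.linalg.inv(B)))  # False in general
```
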
> *Theorem*: let $A$ be a nonsingular $n \times n$ matrix, then its inverse $A^{-1}$ is also nonsingular.

??? note "*Proof*:"

    Let $A$ be a nonsingular $n \times n$ matrix, $A^{-1}$ its inverse and $\mathbf{x} \in \mathbb{R}^n$ a vector. Suppose $A^{-1} \mathbf{x} = \mathbf{0}$, then

    $$
    \mathbf{x} = I \mathbf{x} = (A A^{-1}) \mathbf{x} = A(A^{-1} \mathbf{x}) = \mathbf{0},
    $$

    so $A^{-1} \mathbf{x} = \mathbf{0}$ has only the trivial solution, hence $A^{-1}$ is nonsingular.

> *Theorem*: let $A$ be a nonsingular $n \times n$ matrix, then the solution of the system $A\mathbf{x} = \mathbf{b}$ is $\mathbf{x} = A^{-1} \mathbf{b}$ with $\mathbf{x}, \mathbf{b} \in \mathbb{R}^n$.

??? note "*Proof*:"

    Let $A$ be a nonsingular $n \times n$ matrix, $A^{-1}$ its inverse and $\mathbf{x}, \mathbf{b} \in \mathbb{R}^n$ vectors. Suppose $\mathbf{x} = A^{-1} \mathbf{b}$, then we have

    $$
    A \mathbf{x} = A (A^{-1} \mathbf{b}) = (A A^{-1}) \mathbf{b} = \mathbf{b},
    $$

    so $A^{-1} \mathbf{b}$ is a solution. Moreover, if $A \mathbf{y} = \mathbf{b}$ then $\mathbf{y} = A^{-1} A \mathbf{y} = A^{-1} \mathbf{b}$, so the solution is unique.

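A short NumPy sketch of this theorem (the example system is an assumption for illustration); in practice `np.linalg.solve` is preferred over forming $A^{-1}$ explicitly:

```python
import numpy as np

A = np.array([[2., 1.], [5., 3.]])  # a nonsingular matrix
b = np.array([1., 2.])

x = np.linalg.inv(A) @ b                      # x = A^{-1} b, as in the theorem
print(x)                                      # [ 1. -1.]
print(np.allclose(A @ x, b))                  # True: x solves Ax = b
print(np.allclose(np.linalg.solve(A, b), x))  # same solution, computed more stably
```
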
> *Corollary*: the system $A \mathbf{x} = \mathbf{b}$ of $n$ linear equations in $n$ unknowns has a unique solution if and only if $A$ is nonsingular.

??? note "*Proof*:"

    If $A$ is nonsingular, the above theorem gives the unique solution $\mathbf{x} = A^{-1} \mathbf{b}$. Conversely, if $A$ is singular then $A \mathbf{z} = \mathbf{0}$ for some $\mathbf{z} \neq \mathbf{0}$, and whenever $\mathbf{x}$ solves $A \mathbf{x} = \mathbf{b}$ so does $\mathbf{x} + \mathbf{z}$, hence the solution is not unique.

> *Theorem*: let $A$ and $B$ be matrices and $\alpha$ and $\beta$ be scalars. Each of the following statements valid
|
||||||
|
>
|
||||||
|
> 1. $(A^T)^T = A$,
|
||||||
|
> 2. $(\alpha A)^T = \alpha A^T$,
|
||||||
|
> 3. $(A + B)^T = A^T + B^T$,
|
||||||
|
> 4. $(AB)^T = B^T A^T$.
|
||||||
|
|
||||||
|
??? note "*Proof*:"
|
||||||
|
|
||||||
|
Will be added later.
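In the meantime, statement 4 in particular can be verified numerically; a minimal NumPy sketch with assumed random matrices:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))

# The transpose of a product reverses the order of the factors.
print(np.allclose((A @ B).T, B.T @ A.T))  # True
# Note that A.T @ B.T is not even defined here: shapes (3, 2) and (4, 3) do not align.
```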