
Removed minor mistake in orthogonality section.

Luc Bijl 2024-04-20 14:07:36 +02:00
parent 0003318cb7
commit 021b773c29


@@ -205,7 +205,7 @@ Recall that the system $A \mathbf{x} = \mathbf{b}$ is consistent if and only if
In working with an inner product space $V$, it is generally desirable to have a basis of mutually orthogonal unit vectors.
-> *Definition 3*: the set of vectors $\{\mathbf{v}_i\}_{i=1}^n$ in an inner product space $V$ is **orthogonal** if
+> *Definition 4*: the set of vectors $\{\mathbf{v}_i\}_{i=1}^n$ in an inner product space $V$ is **orthogonal** if
>
> $$
> \langle \mathbf{v}_i, \mathbf{v}_j \rangle = 0,
@@ -215,7 +215,7 @@ In working with an inner product space $V$, it is generally desirable to have a
For example, the standard basis $\{\mathbf{e}_1, \mathbf{e}_2, \mathbf{e}_3\}$ is an orthogonal set in $\mathbb{R}^3$.
-> *Theorem 3*: if $\{\mathbf{v}_i\}_{i=1}^n$ is an orthogonal set of nonzero vectors in an inner product space $V$, then $\{\mathbf{v}_i\}_{i=1}^n$ are linearly independent.
+> *Theorem 4*: if $\{\mathbf{v}_i\}_{i=1}^n$ is an orthogonal set of nonzero vectors in an inner product space $V$, then $\{\mathbf{v}_i\}_{i=1}^n$ are linearly independent.
??? note "*Proof*:"
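
For a concrete check of the theorem above, take for instance the vectors $\mathbf{v}_1 = (1, 1)^T$ and $\mathbf{v}_2 = (1, -1)^T$ in $\mathbb{R}^2$ with the standard inner product. Since

$$
\langle \mathbf{v}_1, \mathbf{v}_2 \rangle = 1 \cdot 1 + 1 \cdot (-1) = 0,
$$

the set $\{\mathbf{v}_1, \mathbf{v}_2\}$ is an orthogonal set of nonzero vectors and is therefore linearly independent.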
@@ -235,7 +235,7 @@ For example the trivial set $\mathbf{e}_1, \mathbf{e}_2, \mathbf{e}_3$ is an ort
We may go even further and consider a set of orthogonal vectors that each have length $1$, that is, a set of unit vectors.
-> *Definition 4*: an **orthonormal** set of vectors is an orthogonal set of unit vectors.
+> *Definition 5*: an **orthonormal** set of vectors is an orthogonal set of unit vectors.
For example, the set $\{\mathbf{u}_i\}_{i=1}^n$ will be orthonormal if and only if
@@ -249,7 +249,7 @@
\delta_{ij} = \begin{cases} 1 &\text{ for } i = j, \\ 0 &\text{ for } i \neq j.\end{cases}
$$
-> *Theorem 4*: let $\{\mathbf{u}_i\}_{i=1}^n$ be an orthonormal basis of an inner product space $V$. If
+> *Theorem 5*: let $\{\mathbf{u}_i\}_{i=1}^n$ be an orthonormal basis of an inner product space $V$. If
>
> $$
> \mathbf{v} = \sum_{i=1}^n c_i \mathbf{u}_i,
@@ -301,7 +301,7 @@ Implying that it is much easier to calculate the coordinates of a given vector w
\mathbf{w} = \sum_{i=1}^n b_i \mathbf{u}_i,
$$
-by theorem 4 we have
+by theorem 5 we have
$$
\langle \mathbf{v}, \mathbf{w} \rangle = \Big\langle \sum_{i=1}^n a_i \mathbf{u}_i, \mathbf{w} \Big\rangle = \sum_{i=1}^n a_i \langle \mathbf{w}, \mathbf{u}_i \rangle = \sum_{i=1}^n a_i b_i.
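
For instance, with the orthonormal basis $\mathbf{u}_1 = \tfrac{1}{\sqrt{2}}(1, 1)^T$, $\mathbf{u}_2 = \tfrac{1}{\sqrt{2}}(1, -1)^T$ of $\mathbb{R}^2$ and the vectors $\mathbf{v} = (3, 1)^T$, $\mathbf{w} = (1, 1)^T$, the coordinates are $a_1 = \langle \mathbf{v}, \mathbf{u}_1 \rangle = 2\sqrt{2}$, $a_2 = \langle \mathbf{v}, \mathbf{u}_2 \rangle = \sqrt{2}$ and $b_1 = \langle \mathbf{w}, \mathbf{u}_1 \rangle = \sqrt{2}$, $b_2 = \langle \mathbf{w}, \mathbf{u}_2 \rangle = 0$, so that

$$
\langle \mathbf{v}, \mathbf{w} \rangle = \sum_{i=1}^2 a_i b_i = 2\sqrt{2} \cdot \sqrt{2} + \sqrt{2} \cdot 0 = 4,
$$

in agreement with the direct computation $\langle \mathbf{v}, \mathbf{w} \rangle = 3 \cdot 1 + 1 \cdot 1 = 4$.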
@@ -335,7 +335,7 @@ Implying that it is much easier to calculate the coordinates of a given vector w
### Orthogonal matrices
-> *Definition 5*: an $n \times n$ matrix $Q$ is an **orthogonal matrix** if
+> *Definition 6*: an $n \times n$ matrix $Q$ is an **orthogonal matrix** if
>
> $$
> Q^T Q = I.
@@ -343,7 +343,7 @@ Implying that it is much easier to calculate the coordinates of a given vector w
The column vectors of an orthogonal matrix form an orthonormal set in $\mathbb{R}^n$, as stated in the following theorem.
-> *Theorem 5*: let $Q = (\mathbf{q}_1, \dots, \mathbf{q}_n)$ be an orthogonal matrix, then $\{\mathbf{q}_i\}_{i=1}^n$ is an orthonormal set.
+> *Theorem 6*: let $Q = (\mathbf{q}_1, \dots, \mathbf{q}_n)$ be an orthogonal matrix, then $\{\mathbf{q}_i\}_{i=1}^n$ is an orthonormal set.
??? note "*Proof*:"
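
A standard example is the rotation matrix

$$
Q = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix},
$$

for which

$$
Q^T Q = \begin{pmatrix} \cos^2\theta + \sin^2\theta & 0 \\ 0 & \sin^2\theta + \cos^2\theta \end{pmatrix} = I,
$$

and whose columns $\mathbf{q}_1 = (\cos\theta, \sin\theta)^T$, $\mathbf{q}_2 = (-\sin\theta, \cos\theta)^T$ indeed form an orthonormal set in $\mathbb{R}^2$.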
@@ -408,7 +408,7 @@
We wish to find a vector $\mathbf{x} \in \mathbb{R}^n$ for which $\|\mathbf{r}(\mathbf{x})\|$ is a minimum. A vector $\mathbf{\hat x}$ that minimizes $\|\mathbf{r}(\mathbf{x})\|$ is a *least squares solution* of the system $A \mathbf{x} = \mathbf{b}$. Note that minimizing $\|\mathbf{r}(\mathbf{x})\|$ is equivalent to minimizing $\|\mathbf{r}(\mathbf{x})\|^2$.
-> *Theorem 6*: let $S$ be a subspace of $\mathbb{R}^m$. For each $\mathbf{b} \in \mathbb{R}^m$, there exists a unique $\mathbf{p} \in S$ that satisfies
+> *Theorem 7*: let $S$ be a subspace of $\mathbb{R}^m$. For each $\mathbf{b} \in \mathbb{R}^m$, there exists a unique $\mathbf{p} \in S$ that satisfies
>
> $$
> \|\mathbf{b} - \mathbf{s}\| > \|\mathbf{b} - \mathbf{p}\|,
@@ -440,7 +440,7 @@
Uniqueness of $\mathbf{\hat x}$ is guaranteed when $A^T A$ is nonsingular, as stated in the following theorem.
-> *Theorem 7*: let $A \in \mathbb{R}^{m \times n}$ be an $m \times n$ matrix with rank $n$, then $A^T A$ is nonsingular.
+> *Theorem 8*: let $A \in \mathbb{R}^{m \times n}$ be an $m \times n$ matrix with rank $n$, then $A^T A$ is nonsingular.
??? note "*Proof*:"
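
To see these results in action, take for instance

$$
A = \begin{pmatrix} 1 & 0 \\ 0 & 1 \\ 1 & 1 \end{pmatrix}, \qquad \mathbf{b} = \begin{pmatrix} 1 \\ 2 \\ 0 \end{pmatrix}.
$$

Here $A$ has rank $2$, so $A^T A$ is nonsingular and the least squares solution is the unique solution of $A^T A \mathbf{\hat x} = A^T \mathbf{b}$, that is

$$
\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix} \mathbf{\hat x} = \begin{pmatrix} 1 \\ 2 \end{pmatrix} \implies \mathbf{\hat x} = \begin{pmatrix} 0 \\ 1 \end{pmatrix}.
$$

The residual $\mathbf{r}(\mathbf{\hat x}) = \mathbf{b} - A \mathbf{\hat x} = (1, 1, -1)^T$ satisfies $A^T \mathbf{r}(\mathbf{\hat x}) = \mathbf{0}$, so it is orthogonal to the column space of $A$, and no other $\mathbf{x}$ yields a smaller $\|\mathbf{r}(\mathbf{x})\|$.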