> *Definition 1*: two subspaces $S$ and $T$ of an inner product space $V$ are **orthogonal** if
>
> $$
> \langle \mathbf{u}, \mathbf{v} \rangle = 0,
> $$
>
> for all $\mathbf{u} \in S$ and $\mathbf{v} \in T$. Orthogonality of $S$ and $T$ may be denoted by $S \perp T$.
The notion of orthogonality is only meaningful in vector spaces equipped with an inner product.
> *Definition 2*: let $S$ be a subspace of an inner product space $V$. The set of all vectors in $V$ that are orthogonal to every vector in $S$ is denoted by $S^\perp$, that is
>
> $$
> S^\perp = \{\mathbf{v} \in V \;|\; \langle \mathbf{v}, \mathbf{u} \rangle = 0 \; \forall \mathbf{u} \in S \}.
> $$
>
> The set $S^\perp$ is called the **orthogonal complement** of $S$.
For example, the subspaces $X = \mathrm{span}(\mathbf{e}_1)$ and $Y = \mathrm{span}(\mathbf{e}_2)$ of $\mathbb{R}^3$ are orthogonal, but they are not orthogonal complements. Indeed, $\mathbf{e}_3$ is orthogonal to every vector in $X$ but does not lie in $Y$, so $Y \subsetneq X^\perp = \mathrm{span}(\mathbf{e}_2, \mathbf{e}_3)$.
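This example can be checked numerically. The following is a small sketch using NumPy (not part of the original text):

```python
import numpy as np

# X = span(e1) and Y = span(e2) as subspaces of R^3.
e1, e2, e3 = np.eye(3)

# X and Y are orthogonal: their spanning vectors have zero scalar product.
assert np.dot(e1, e2) == 0

# Y is not the orthogonal complement of X: e3 is also orthogonal to e1,
# yet e3 does not lie in Y = span(e2) (its third component is nonzero).
assert np.dot(e1, e3) == 0
assert e3[2] != 0 and e2[2] == 0
```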
We may observe that if $S$ and $T$ are orthogonal subspaces of an inner product space $V$, then $S \cap T = \{\mathbf{0}\}$: for $\mathbf{v} \in S \cap T$, orthogonality of $S$ and $T$ gives $\langle \mathbf{v}, \mathbf{v} \rangle = 0$ and hence $\mathbf{v} = \mathbf{0}$.
Additionally, we may observe that if $S$ is a subspace of an inner product space $V$, then $S^\perp$ is also a subspace of $V$. For $\mathbf{u} \in S^\perp$ and $a \in \mathbb{K}$ we have

$$
\langle a \mathbf{u}, \mathbf{v} \rangle = a \langle \mathbf{u}, \mathbf{v} \rangle = a \cdot 0 = 0
$$

for all $\mathbf{v} \in S$, therefore $a \mathbf{u} \in S^\perp$. Similarly, for $\mathbf{u}_1, \mathbf{u}_2 \in S^\perp$ we have

$$
\langle \mathbf{u}_1 + \mathbf{u}_2, \mathbf{v} \rangle = \langle \mathbf{u}_1, \mathbf{v} \rangle + \langle \mathbf{u}_2, \mathbf{v} \rangle = 0
$$

for all $\mathbf{v} \in S$, and hence $\mathbf{u}_1 + \mathbf{u}_2 \in S^\perp$. Therefore $S^\perp$ is a subspace of $V$.
### Fundamental subspaces
Let $V$ be a Euclidean inner product space $V = \mathbb{R}^n$ with its inner product defined by the [scalar product](inner-product-spaces/#euclidean-inner-product-spaces). With this inner product on $V$, the following theorem may be posed.
> *Theorem 1*: let $A$ be an $m \times n$ matrix, then
>
> $$
> N(A) = R(A^T)^\perp,
> $$
>
> and
>
> $$
> N(A^T) = R(A)^\perp,
> $$
>
> for all $A \in \mathbb{R}^{m \times n}$ with $R(A)$ denoting the column space of $A$ and $R(A^T)$ denoting the row space of $A$.
??? note "*Proof*:"
Let $A \in \mathbb{R}^{m \times n}$, let $\vec{\mathbf{a}}_i$ for $i \in \mathbb{N}[i \leq m]$ denote the rows of $A$ and let $\mathbf{a}_j$ for $j \in \mathbb{N}[j \leq n]$ denote the columns of $A$, so that $R(A) = \mathrm{span}(\mathbf{a}_1, \dots, \mathbf{a}_n)$ is the column space of $A$ and $R(A^T) = \mathrm{span}(\vec{\mathbf{a}}_1^T, \dots, \vec{\mathbf{a}}_m^T)$ is the row space of $A$.
For the first equation, let $\mathbf{v} \in R(A^T)^\perp$, then $\vec{\mathbf{a}}_i \mathbf{v} = 0$ for each $i$, so $A \mathbf{v} = \mathbf{0}$ and hence $\mathbf{v} \in N(A)$. Which implies that $R(A^T)^\perp \subseteq N(A)$. Conversely, let $\mathbf{w} \in N(A)$, then $A \mathbf{w} = \mathbf{0}$, so $\vec{\mathbf{a}}_i \mathbf{w} = 0$ for each $i$; since the $\vec{\mathbf{a}}_i^T$ span $R(A^T)$, it follows that $\mathbf{w} \in R(A^T)^\perp$. Which implies that $N(A) \subseteq R(A^T)^\perp$. Therefore $N(A) = R(A^T)^\perp$.
For the second equation, let $\mathbf{v} \in R(A)^\perp$, then $\mathbf{v}^T \mathbf{a}_j = 0$ for each $j$, so $A^T \mathbf{v} = \mathbf{0}$ and hence $\mathbf{v} \in N(A^T)$. Which implies that $R(A)^\perp \subseteq N(A^T)$. Conversely, let $\mathbf{w} \in N(A^T)$, then $A^T \mathbf{w} = \mathbf{0}$, so $\mathbf{w}$ is orthogonal to every column of $A$, and hence $\mathbf{w} \in R(A)^\perp$. Which implies that $N(A^T) \subseteq R(A)^\perp$. Therefore $N(A^T) = R(A)^\perp$.
Theorem 1 is known as the fundamental theorem of linear algebra; it can be used to prove the following theorem.
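Theorem 1 can be verified numerically on an example matrix. The following is a sketch assuming NumPy; the matrix is an arbitrary illustration, and the null-space basis is obtained from the right singular vectors with vanishing singular values:

```python
import numpy as np

# Example matrix of rank 1, so dim N(A) = 3 - 1 = 2.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

# Null-space basis from the SVD: the right singular vectors whose
# singular values vanish span N(A).
U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:].T           # columns span N(A)

# Each basis vector lies in N(A), i.e. it is orthogonal to every
# row of A, confirming N(A) = R(A^T)^perp on this example.
assert np.allclose(A @ null_basis, 0)
assert rank + null_basis.shape[1] == A.shape[1]
```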
> *Theorem 2*: if $S$ is a subspace of the inner product space $V = \mathbb{R}^n$, then
>
> $$
> \dim S + \dim S^\perp = n.
> $$
>
> Furthermore, if $\{\mathbf{v}_i\}_{i=1}^r$ is a basis of $S$ and $\{\mathbf{v}_i\}_{i=r+1}^n$ is a basis of $S^\perp$ then $\{\mathbf{v}_i\}_{i=1}^n$ is a basis of $V$.
??? note "*Proof*:"
If $S = \{\mathbf{0}\}$, then $S^\perp = V$ and
$$
\dim S + \dim S^\perp = 0 + n = n.
$$
If $S \neq \{\mathbf{0}\}$, then let $\{\mathbf{x}_i\}_{i=1}^r$ be a basis of $S$ and define $X \in \mathbb{R}^{r \times n}$ whose $i$th row is $\mathbf{x}_i^T$ for each $i$. Matrix $X$ has rank $r$ and $R(X^T) = S$. Then by theorem 1
$$
S^\perp = R(X^T)^\perp = N(X),
$$
from the [rank nullity theorem](vector-spaces/#rank-and-nullity) it follows that
$$
\dim S^\perp = \dim N(X) = n - r,
$$
and therefore
$$
\dim S + \dim S^\perp = r + n - r = n.
$$
Let $\{\mathbf{v}_i\}_{i=1}^r$ be a basis of $S$ and $\{\mathbf{v}_i\}_{i=r+1}^n$ be a basis of $S^\perp$. Suppose that
$$
c_1 \mathbf{v}_1 + \dots + c_n \mathbf{v}_n = \mathbf{0}.
$$
Let $\mathbf{u} = c_1 \mathbf{v}_1 + \dots + c_r \mathbf{v}_r \in S$ and let $\mathbf{w} = c_{r+1} \mathbf{v}_{r+1} + \dots + c_n \mathbf{v}_n \in S^\perp$. Then
$$
\mathbf{u} + \mathbf{w} = \mathbf{0}
$$
implies $\mathbf{u} = - \mathbf{w}$ and thus both elements must be in $S \cap S^\perp$. However, $S \cap S^\perp = \{\mathbf{0}\}$, therefore $\mathbf{u} = \mathbf{w} = \mathbf{0}$. Since $\{\mathbf{v}_i\}_{i=1}^r$ and $\{\mathbf{v}_i\}_{i=r+1}^n$ are each linearly independent, all coefficients $c_i$ must vanish. Hence $\{\mathbf{v}_i\}_{i=1}^n$ is linearly independent and, consisting of $n$ vectors, forms a basis of $V$.
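Theorem 2 can also be illustrated numerically. In this NumPy sketch (the subspace is an assumed example), a basis of $S$ forms the rows of $X$, so that $S^\perp = N(X)$ and the dimensions can be counted directly:

```python
import numpy as np

# Rows form a basis of a subspace S of R^4, so r = 2 and n = 4.
X = np.array([[1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0]])

# S^perp = N(X); its basis comes from the SVD of X.
U, s, Vt = np.linalg.svd(X)
r = int(np.sum(s > 1e-10))
perp_basis = Vt[r:]                    # rows span S^perp

assert perp_basis.shape[0] == 4 - r    # dim S + dim S^perp = n

# The combined rows are linearly independent, hence a basis of R^4.
combined = np.vstack([X, perp_basis])
assert np.linalg.matrix_rank(combined) == 4
```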
> *Proposition 1*: let $S$ be a subspace of $V$, then $(S^\perp)^\perp = S$.
??? note "*Proof*:"
Will be added later.
> *Proposition 2*: let $A \in \mathbb{R}^{m \times n}$ and $\mathbf{b} \in \mathbb{R}^m$, then either there is a vector $\mathbf{x} \in \mathbb{R}^n$ such that
>
> $$
> A \mathbf{x} = \mathbf{b},
> $$
>
> or there is a vector $\mathbf{y} \in \mathbb{R}^m$ such that
>
> $$
> A^T \mathbf{y} = \mathbf{0} \quad \text{and} \quad \mathbf{y}^T \mathbf{b} \neq 0,
> $$
>
> but not both.
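The two cases of this alternative can be seen numerically. In the following NumPy sketch (the matrices and vectors are assumptions chosen for illustration), the least-squares residual $\mathbf{y} = \mathbf{b} - A\mathbf{x}^*$ lies in $N(A^T)$, and it vanishes exactly when $A\mathbf{x} = \mathbf{b}$ is solvable:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b_solvable = np.array([1.0, 2.0, 3.0])   # lies in R(A)
b_not      = np.array([1.0, 2.0, 4.0])   # does not lie in R(A)

# Case 1: the system Ax = b has a solution.
x, *_ = np.linalg.lstsq(A, b_solvable, rcond=None)
assert np.allclose(A @ x, b_solvable)

# Case 2: the residual y = b - Ax* satisfies A^T y = 0 and y^T b != 0.
x2, *_ = np.linalg.lstsq(A, b_not, rcond=None)
y = b_not - A @ x2
assert np.allclose(A.T @ y, 0)
assert abs(y @ b_not) > 1e-10
```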
In working with an inner product space $V$, it is generally desirable to have a basis of mutually orthogonal unit vectors.
> *Definition 3*: the set of vectors $\{\mathbf{v}_i\}_{i=1}^n$ in an inner product space $V$ is **orthogonal** if
>
> $$
> \langle \mathbf{v}_i, \mathbf{v}_j \rangle = 0,
> $$
>
> whenever $i \neq j$. Then $\{\mathbf{v}_i\}_{i=1}^n$ is said to be an **orthogonal set** of vectors. If in addition $\|\mathbf{v}_i\| = 1$ for all $i$, the set is said to be **orthonormal**.
For example, the standard basis vectors $\mathbf{e}_1, \mathbf{e}_2, \mathbf{e}_3$ form an orthogonal set in $\mathbb{R}^3$.
> *Theorem 3*: if $\{\mathbf{v}_i\}_{i=1}^n$ is an orthogonal set of nonzero vectors in an inner product space $V$, then $\{\mathbf{v}_i\}_{i=1}^n$ are linearly independent.
??? note "*Proof*:"
Suppose that $\{\mathbf{v}_i\}_{i=1}^n$ is an orthogonal set of nonzero vectors in an inner product space $V$ and
$$
c_1 \mathbf{v}_1 + \dots + c_n \mathbf{v}_n = \mathbf{0}.
$$
Taking the inner product of both sides with $\mathbf{v}_j$ gives $c_j \langle \mathbf{v}_j, \mathbf{v}_j \rangle = 0$ by orthogonality, and since $\mathbf{v}_j \neq \mathbf{0}$ it follows that $c_j = 0$ for all $j$. Therefore $\{\mathbf{v}_i\}_{i=1}^n$ is linearly independent.
For an **orthogonal matrix** $Q \in \mathbb{R}^{n \times n}$, that is a matrix satisfying $Q^T Q = I$, we have $\langle Q \mathbf{u}, Q \mathbf{v} \rangle = (Q \mathbf{u})^T Q \mathbf{v} = \mathbf{u}^T Q^T Q \mathbf{v} = \langle \mathbf{u}, \mathbf{v} \rangle$ for all $\mathbf{u}, \mathbf{v} \in \mathbb{R}^n$. In particular, if $\mathbf{u} = \mathbf{v}$ then $\|Q \mathbf{u}\|^2 = \|\mathbf{u}\|^2$ and hence $\|Q \mathbf{u}\| = \|\mathbf{u}\|$: multiplication by an orthogonal matrix preserves the lengths of vectors.
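Length preservation under an orthogonal matrix can be checked numerically. A minimal NumPy sketch, using a $2 \times 2$ rotation as the example $Q$:

```python
import numpy as np

# A rotation matrix is orthogonal: Q^T Q = I.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(Q.T @ Q, np.eye(2))

# Multiplication by Q preserves the length of any vector.
u = np.array([3.0, 4.0])
assert np.isclose(np.linalg.norm(Q @ u), np.linalg.norm(u))
```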
## Orthogonalization process
Let $\{\mathbf{a}_i\}_{i=1}^n$ be a basis of an inner product space $V$. We may use the modified method of Gram-Schmidt to determine the orthonormal basis $\{\mathbf{q}_i\}_{i=1}^n$ of $V$.
Let $\mathbf{q}_1 = \frac{1}{\|\mathbf{a}_1\|} \mathbf{a}_1$ be the first step.
Then we may induce the following step for $i = 2, \dots, n$: subtract from $\mathbf{a}_i$ its components along the already constructed $\mathbf{q}_1, \dots, \mathbf{q}_{i-1}$ and normalize the result,

$$
\mathbf{p}_{i-1} = \sum_{j=1}^{i-1} \langle \mathbf{a}_i, \mathbf{q}_j \rangle \mathbf{q}_j, \qquad \mathbf{q}_i = \frac{\mathbf{a}_i - \mathbf{p}_{i-1}}{\|\mathbf{a}_i - \mathbf{p}_{i-1}\|}.
$$
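The process above can be sketched as code. This is a NumPy implementation of the modified Gram-Schmidt variant, in which each $\mathbf{q}_i$ is removed from all remaining vectors as soon as it is formed; the function name and example matrix are illustrative assumptions:

```python
import numpy as np

def modified_gram_schmidt(a):
    """Orthonormalize the columns of `a` (assumed linearly independent)
    using the modified Gram-Schmidt process."""
    q = np.array(a, dtype=float)
    n = q.shape[1]
    for i in range(n):
        q[:, i] /= np.linalg.norm(q[:, i])              # normalize q_i
        for j in range(i + 1, n):
            # Remove the q_i component from each remaining vector.
            q[:, j] -= (q[:, i] @ q[:, j]) * q[:, i]
    return q

# Columns of A form a basis of R^3.
A = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
Q = modified_gram_schmidt(A)
assert np.allclose(Q.T @ Q, np.eye(3))                   # orthonormal columns
```

Mathematically the result coincides with the classical step given above; the reordering only improves numerical stability.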