# Matrix algebra

Theorem: let $A$, $B$ and $C$ be matrices of conformable dimensions and $\alpha$ and $\beta$ be scalars. Each of the following statements is valid:

  1. $A + B = B + A$,
  2. $(A + B) + C = A + (B + C)$,
  3. $(AB)C = A(BC)$,
  4. $A(B + C) = AB + AC$,
  5. $(A + B)C = AC + BC$,
  6. $(\alpha \beta) A = \alpha(\beta A)$,
  7. $\alpha (AB) = (\alpha A)B = A (\alpha B)$,
  8. $(\alpha + \beta)A = \alpha A + \beta A$,
  9. $\alpha (A + B) = \alpha A + \alpha B$.
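These identities can be checked numerically. The following sketch uses NumPy and arbitrary sample matrices (both are illustrative assumptions, not prescribed by this page) to verify a few of them:

```python
import numpy as np

# Illustrative sample matrices and scalars (an assumption for this sketch).
rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))
alpha, beta = 2.0, -0.5

# 1. Commutativity of addition: A + B = B + A
assert np.allclose(A + B, B + A)
# 3. Associativity of multiplication: (AB)C = A(BC)
assert np.allclose((A @ B) @ C, A @ (B @ C))
# 4. Left distributivity: A(B + C) = AB + AC
assert np.allclose(A @ (B + C), A @ B + A @ C)
# 7. Scalars move through products: alpha(AB) = (alpha A)B = A(alpha B)
assert np.allclose(alpha * (A @ B), (alpha * A) @ B)
assert np.allclose(alpha * (A @ B), A @ (alpha * B))
# 8. (alpha + beta)A = alpha A + beta A
assert np.allclose((alpha + beta) * A, alpha * A + beta * A)
```

Note the use of `np.allclose` rather than exact equality: floating-point arithmetic is not exactly associative, so these laws hold only up to rounding error in finite precision.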

??? note "Proof:"

    Will be added later.

When an $n \times n$ matrix $A$ is multiplied by itself $k$ times, it is convenient to use exponential notation: $AA \cdots A = A^k$.

Definition: the $n \times n$ identity matrix is the matrix $I = (\delta_{ij})$, where

$$
    \delta_{ij} = \begin{cases} 1 &\text{ if } i = j, \\ 0 &\text{ if } i \neq j. \end{cases}
$$

It follows that multiplying an $n \times n$ matrix $A$ by the identity matrix leaves it unchanged: $AI = IA = A$.
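A minimal sketch of this definition, assuming NumPy (`np.eye` builds the matrix $(\delta_{ij})$ directly):

```python
import numpy as np

# The 3x3 identity matrix I = (delta_ij): ones on the diagonal, zeros elsewhere.
I = np.eye(3)

# An arbitrary sample matrix (illustrative only).
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])

# Multiplying by I on either side leaves A unchanged: AI = IA = A.
assert np.allclose(A @ I, A)
assert np.allclose(I @ A, A)
```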

Definition: an $n \times n$ matrix $A$ is said to be nonsingular or invertible if there exists a matrix $A^{-1}$ such that $AA^{-1} = A^{-1}A = I$. The matrix $A^{-1}$ is said to be a multiplicative inverse of $A$.

If $B$ and $C$ are both multiplicative inverses of $A$, then

$$
    B = BI = B(AC) = (BA)C = IC = C,
$$

thus a matrix can have at most one multiplicative inverse.

Definition: an $n \times n$ matrix is said to be singular if it does not have a multiplicative inverse.

Equivalently, an $n \times n$ matrix $A$ is singular if and only if $A \mathbf{x} = \mathbf{0}$ for some nonzero $\mathbf{x} \in \mathbb{R}^n$. For a nonsingular matrix $A$, $\mathbf{x} = \mathbf{0}$ is the only solution of $A \mathbf{x} = \mathbf{0}$.
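The distinction can be illustrated numerically. In the sketch below (NumPy and the sample matrices are assumptions), a matrix with linearly dependent rows admits a nonzero null vector, while a full-rank matrix does not:

```python
import numpy as np

# A singular matrix: the second row is twice the first, so the rows are
# linearly dependent and A x = 0 has a nontrivial solution.
A_singular = np.array([[1.0, 2.0],
                       [2.0, 4.0]])
x = np.array([2.0, -1.0])          # a nonzero vector with A_singular @ x = 0
assert np.allclose(A_singular @ x, 0.0)

# A nonsingular matrix: full rank, so x = 0 is the only solution of A x = 0.
A_nonsingular = np.array([[1.0, 2.0],
                          [3.0, 4.0]])
assert np.linalg.matrix_rank(A_singular) < 2
assert np.linalg.matrix_rank(A_nonsingular) == 2
```

In floating-point practice, singularity is detected via a rank or condition-number estimate with a tolerance rather than an exact test.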

Theorem: if $A$ and $B$ are nonsingular $n \times n$ matrices, then $AB$ is also nonsingular and

$$
    (AB)^{-1} = B^{-1} A^{-1}.
$$

??? note "Proof:"

    Let $A$ and $B$ be nonsingular $n \times n$ matrices. Multiplying $AB$ by $B^{-1} A^{-1}$ on either side gives

    $$
        (B^{-1} A^{-1})AB = B^{-1} (A^{-1} A) B = B^{-1} B = I, \\
        AB(B^{-1} A^{-1}) = A (B B^{-1}) A^{-1} = A A^{-1} = I,
    $$

    so $AB$ is nonsingular and, by uniqueness of the multiplicative inverse, $(AB)^{-1} = B^{-1} A^{-1}$.
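A quick numerical check of this identity, assuming NumPy and two illustrative invertible matrices (note the reversed order of the factors on the right-hand side):

```python
import numpy as np

# Illustrative invertible matrices (both have determinant 1).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.array([[1.0, 3.0],
              [0.0, 1.0]])

lhs = np.linalg.inv(A @ B)                   # (AB)^{-1}
rhs = np.linalg.inv(B) @ np.linalg.inv(A)    # B^{-1} A^{-1}, order reversed
assert np.allclose(lhs, rhs)
```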

Theorem: let $A$ be a nonsingular $n \times n$ matrix. Then its inverse $A^{-1}$ is also nonsingular.

??? note "Proof:"

    Let $A$ be a nonsingular $n \times n$ matrix, $A^{-1}$ its inverse and $\mathbf{x} \in \mathbb{R}^n$ a vector. Suppose $A^{-1} \mathbf{x} = \mathbf{0}$, then

    $$
        \mathbf{x} = I \mathbf{x} = (A A^{-1}) \mathbf{x} = A(A^{-1} \mathbf{x}) = \mathbf{0},
    $$

    so $\mathbf{x} = \mathbf{0}$ is the only solution of $A^{-1} \mathbf{x} = \mathbf{0}$, and hence $A^{-1}$ is nonsingular.

Theorem: let $A$ be a nonsingular $n \times n$ matrix and $\mathbf{b} \in \mathbb{R}^n$. Then the unique solution of the system $A\mathbf{x} = \mathbf{b}$ is $\mathbf{x} = A^{-1} \mathbf{b}$.

??? note "Proof:"

    Let $A$ be a nonsingular $n \times n$ matrix, $A^{-1}$ its inverse and $\mathbf{b} \in \mathbb{R}^n$ a vector. Taking $\mathbf{x} = A^{-1} \mathbf{b}$ gives

    $$
        A \mathbf{x} = A (A^{-1} \mathbf{b}) = (A A^{-1}) \mathbf{b} = \mathbf{b},
    $$

    so $\mathbf{x} = A^{-1} \mathbf{b}$ is a solution. Conversely, if $A \mathbf{x} = \mathbf{b}$, then $\mathbf{x} = A^{-1} A \mathbf{x} = A^{-1} \mathbf{b}$, so the solution is unique.
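A numerical illustration of this theorem, assuming NumPy and an illustrative system. In floating-point practice `np.linalg.solve` is preferred over forming $A^{-1}$ explicitly, but both recover the same unique solution here:

```python
import numpy as np

# Illustrative nonsingular system: 3x + y = 9, x + 2y = 8.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])          # det = 5, so A is nonsingular
b = np.array([9.0, 8.0])

x = np.linalg.solve(A, b)           # solves A x = b directly
assert np.allclose(x, np.linalg.inv(A) @ b)   # agrees with x = A^{-1} b
assert np.allclose(A @ x, b)        # and it really solves the system
assert np.allclose(x, [2.0, 3.0])
```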

Corollary: the system $A \mathbf{x} = \mathbf{b}$ of $n$ linear equations in $n$ unknowns has a unique solution if and only if $A$ is nonsingular.

??? note "Proof:"

    If $A$ is nonsingular, the above theorem gives the unique solution $\mathbf{x} = A^{-1} \mathbf{b}$. Conversely, if $A$ is singular, then $A \mathbf{z} = \mathbf{0}$ for some $\mathbf{z} \neq \mathbf{0}$, so any solution $\mathbf{x}$ yields a second solution $\mathbf{x} + \mathbf{z}$; hence the solution, if one exists, is not unique.

Theorem: let $A$ and $B$ be matrices of conformable dimensions and $\alpha$ be a scalar. Each of the following statements is valid:

  1. $(A^T)^T = A$,
  2. $(\alpha A)^T = \alpha A^T$,
  3. $(A + B)^T = A^T + B^T$,
  4. $(AB)^T = B^T A^T$.

??? note "Proof:"

    Will be added later.
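As with the earlier algebra laws, these transpose rules can be spot-checked numerically. The sketch below (NumPy and the random matrices are assumptions) uses non-square matrices to make the order reversal in rule 4 visible in the shapes:

```python
import numpy as np

# Illustrative non-square matrices: A, B are 3x4 and C is 4x2.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((3, 4))
C = rng.standard_normal((4, 2))
alpha = 3.0

assert np.allclose(A.T.T, A)                    # 1. (A^T)^T = A
assert np.allclose((alpha * A).T, alpha * A.T)  # 2. (alpha A)^T = alpha A^T
assert np.allclose((A + B).T, A.T + B.T)        # 3. (A + B)^T = A^T + B^T
assert np.allclose((A @ C).T, C.T @ A.T)        # 4. (AC)^T = C^T A^T, order reversed
```

Rule 4 is the only order in which the transposed product even makes sense here: $(AC)^T$ is $2 \times 3$, and so is $C^T A^T$, while $A^T C^T$ is not defined.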