Matrix arithmetic

Definitions

Definition: let A be an m \times n matrix given by

A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix}

with a_{ij} referred to as the entries of A, which are scalars, for (i,j) \in \{1, \dots, m\} \times \{1, \dots, n\}. If the entries of A are real, we may write A \in \mathbb{R}^{m \times n}.

This matrix may be denoted more compactly by A = (a_{ij}).
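
Example: as a concrete illustration, consider the 2 \times 3 matrix

A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}

with entries a_{11} = 1, a_{12} = 2, a_{13} = 3, a_{21} = 4, a_{22} = 5 and a_{23} = 6; since all entries are real, A \in \mathbb{R}^{2 \times 3}.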

Definition: let \mathbf{x} be a 1 \times n matrix, referred to as a row vector, given by

\mathbf{x} = (x_1, x_2, \dots, x_n)

with x_i referred to as the entries of \mathbf{x}, with i \in \{1, \dots, n\}. For real entries we may write \mathbf{x} \in \mathbb{R}^n.


Definition: let \mathbf{x} be an n \times 1 matrix, referred to as a column vector, given by

\mathbf{x} = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}

with x_i referred to as the entries of \mathbf{x}, with i \in \{1, \dots, n\}. For real entries the column vector also satisfies \mathbf{x} \in \mathbb{R}^n.

From these two definitions it may be observed that row and column vectors can often be used interchangeably; when both appear, however, it is important to state which is meant. Best practice is to always work with row vectors and take the transpose where a column vector is needed.
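
Example: the row vector \mathbf{x} = (1, 2, 3) and the column vector

\begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix}

contain the same entries; the latter is the transpose \mathbf{x}^T of the former (the transpose is defined below).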

Matrix operations

Definition: two m \times n matrices A and B are said to be equal if a_{ij} = b_{ij} for each (i,j) \in \{1, \dots, m\} \times \{1, \dots, n\}.


Definition: if A is an m \times n matrix and \alpha is a scalar, then \alpha A is the m \times n matrix whose (i,j) entry is \alpha a_{ij} for each (i,j) \in \{1, \dots, m\} \times \{1, \dots, n\}.
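
Example: for the scalar \alpha = 2 we have

2 \begin{pmatrix} 1 & -3 \\ 0 & 4 \end{pmatrix} = \begin{pmatrix} 2 & -6 \\ 0 & 8 \end{pmatrix},

since each entry is simply multiplied by 2.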


Definition: if A = (a_{ij}) and B = (b_{ij}) are both m \times n matrices, then the sum A + B is the m \times n matrix whose (i,j) entry is a_{ij} + b_{ij} for each (i,j) \in \{1, \dots, m\} \times \{1, \dots, n\}.
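
Example: adding two 2 \times 2 matrices entrywise gives

\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} + \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} = \begin{pmatrix} 6 & 8 \\ 10 & 12 \end{pmatrix}.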

If A is an m \times n matrix and \mathbf{x} is a vector in \mathbb{R}^n, then

A \mathbf{x} = x_1 \mathbf{a}_1 + x_2 \mathbf{a}_2 + \dots + x_n \mathbf{a}_n

with A = (\mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_n), where \mathbf{a}_j denotes the jth column vector of A and \mathbf{x} is taken as a column vector.
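
Example: for A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} and \mathbf{x} = (2, -1)^T we obtain

A \mathbf{x} = 2 \begin{pmatrix} 1 \\ 3 \end{pmatrix} - 1 \begin{pmatrix} 2 \\ 4 \end{pmatrix} = \begin{pmatrix} 0 \\ 2 \end{pmatrix},

so A \mathbf{x} is a weighted sum of the column vectors of A, with weights given by the entries of \mathbf{x}.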

Definition: if \mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_n are vectors in \mathbb{R}^m and x_1, x_2, \dots, x_n are scalars, then a sum of the form

x_1 \mathbf{a}_1 + x_2 \mathbf{a}_2 + \dots + x_n \mathbf{a}_n

is said to be a linear combination of the vectors \mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_n.


Theorem: a linear system A \mathbf{x} = \mathbf{b} is consistent if and only if \mathbf{b} can be written as a linear combination of the column vectors of A.

??? note "Proof:"

Will be added later.
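
Example: the system

\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \mathbf{x} = \begin{pmatrix} 5 \\ 11 \end{pmatrix}

is consistent, since \begin{pmatrix} 5 \\ 11 \end{pmatrix} = 1 \begin{pmatrix} 1 \\ 3 \end{pmatrix} + 2 \begin{pmatrix} 2 \\ 4 \end{pmatrix} is a linear combination of the column vectors of A; the weights give the solution \mathbf{x} = (1, 2)^T.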

Transpose matrix

Definition: the transpose of an m \times n matrix A is the n \times m matrix B defined by

b_{ji} = a_{ij},

for j \in \{1, \dots, n\} and i \in \{1, \dots, m\}. The transpose of A is denoted by A^T.
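
Example: for the 2 \times 3 matrix

A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}

the transpose is the 3 \times 2 matrix

A^T = \begin{pmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{pmatrix},

obtained by turning the rows of A into columns.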


Definition: an n \times n matrix A is said to be symmetric if A^T = A.
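
Example: the matrix

A = \begin{pmatrix} 1 & 2 \\ 2 & 3 \end{pmatrix}

is symmetric, since interchanging its rows and columns leaves it unchanged, so A^T = A.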

Hermitian matrix

Definition: the conjugate transpose of an m \times n matrix A is the n \times m matrix B defined by

b_{ji} = \bar a_{ij},

for j \in \{1, \dots, n\} and i \in \{1, \dots, m\}. The conjugate transpose of A is denoted by A^H.


Definition: an n \times n matrix A is said to be Hermitian if A^H = A.
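
Example: the matrix

A = \begin{pmatrix} 2 & 1 + i \\ 1 - i & 3 \end{pmatrix}

is Hermitian: transposing and conjugating each entry returns A, so A^H = A. Note that the diagonal entries of a Hermitian matrix are necessarily real, since a_{ii} = \bar a_{ii}.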

Matrix multiplication

Definition: if A = (a_{ij}) is an m \times n matrix and B = (b_{ij}) is an n \times r matrix, then the product A B = C = (c_{ij}) is the m \times r matrix whose entries are defined by

c_{ij} = \mathbf{a}_i \mathbf{b}_j = \sum_{k=1}^n a_{ik} b_{kj},

where \mathbf{a}_i denotes the ith row vector of A and \mathbf{b}_j the jth column vector of B.
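
Example: for the 2 \times 2 matrices A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} and B = \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} the product is

A B = \begin{pmatrix} 1 \cdot 5 + 2 \cdot 7 & 1 \cdot 6 + 2 \cdot 8 \\ 3 \cdot 5 + 4 \cdot 7 & 3 \cdot 6 + 4 \cdot 8 \end{pmatrix} = \begin{pmatrix} 19 & 22 \\ 43 & 50 \end{pmatrix},

where, for instance, c_{11} is the product of the first row vector of A with the first column vector of B.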