A matrix $A \in \mathbb{R}^{m \times n}$ is an $m$-by-$n$ tuple of elements $a_{ij}$ for $i = 1, \dots, m$ and $j = 1, \dots, n$. The matrix has $m$ rows and $n$ columns, represented as

$$A = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix}.$$

Note that by this definition, a 🏹 Vector can be viewed as a matrix where one dimension is $1$; a column vector is an $n \times 1$ matrix, and a row vector is a $1 \times n$ matrix.

Generalizing to more than two dimensions, we get tensors. Mathematically, a tensor is $T \in \mathbb{R}^{n_1 \times n_2 \times \cdots \times n_k}$. Operations on tensors are usually less standardized, but in many cases, we perform matrix operations on two dimensions of the tensor while ignoring the rest.
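
As a quick sketch of these shapes (assuming NumPy as the numerical library, which this note does not name), a matrix is a 2-D array, column and row vectors are matrices with one dimension equal to $1$, and a tensor adds dimensions while matrix operations act on the last two:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])        # a 2x3 matrix: m = 2 rows, n = 3 columns
print(A.shape)                         # (2, 3)

col = np.array([[1.0], [2.0], [3.0]])  # column vector as a 3x1 matrix
row = np.array([[1.0, 2.0, 3.0]])      # row vector as a 1x3 matrix
print(col.shape, row.shape)            # (3, 1) (1, 3)

# A three-dimensional tensor: a stack of four 2x3 matrices.
T = np.arange(24, dtype=float).reshape(4, 2, 3)
# Matrix multiplication acts on the last two dimensions; the leading
# dimension is carried along untouched by broadcasting.
print((T @ col).shape)                 # (4, 2, 1)
```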

Identity Matrix

The identity matrix $I_n \in \mathbb{R}^{n \times n}$ is a square matrix that has $1$ along the diagonal and $0$ everywhere else. It preserves the equation $I_n x = x$ for any $x \in \mathbb{R}^n$.
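
A tiny NumPy check of this property (a sketch; the vector $x$ is arbitrary):

```python
import numpy as np

I = np.eye(3)                 # 3x3 identity: 1 along the diagonal, 0 elsewhere
x = np.array([2.0, -1.0, 5.0])

print(np.allclose(I @ x, x))  # True: I x = x
```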

Addition and Multiplication

Matrices are added element-by-element. For $A, B \in \mathbb{R}^{m \times n}$,

$$(A + B)_{ij} = a_{ij} + b_{ij}.$$

In matrix multiplication $C = AB$, the element $c_{ij}$ is the sum of products between the $i$th row of $A$ and the $j$th column of $B$. Specifically,

$$c_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj}.$$

Note that this also means matrix multiplication is not commutative: in general, $AB \neq BA$.
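
The sketch below (again assuming NumPy) computes one entry of $C = AB$ directly as a sum of products, checks it against the built-in matrix product, and confirms the non-commutativity noted above:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

C = A @ B                                  # matrix product
# c_ij as the sum of products of the i-th row of A and j-th column of B
c_01 = sum(A[0, k] * B[k, 1] for k in range(A.shape[1]))
print(np.isclose(C[0, 1], c_01))           # True

print(np.allclose(A @ B, B @ A))           # False: AB != BA here
```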

Note

Matrix shapes must be the same for addition, and for multiplication, the inner dimensions must be equal: $A \in \mathbb{R}^{m \times n}$ with $B \in \mathbb{R}^{n \times p}$ gives an $m \times p$ product.
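
A small illustration of these shape rules (the shapes are arbitrary examples):

```python
import numpy as np

A = np.ones((2, 3))       # 2x3
B = np.ones((3, 4))       # 3x4: inner dimensions match
print((A @ B).shape)      # (2, 4) product

try:
    A + B                 # addition requires identical shapes
except ValueError as err:
    print("shape mismatch:", err)
```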

The Hadamard product, often denoted $A \odot B$, performs multiplication element-wise, similar to addition. That is,

$$(A \odot B)_{ij} = a_{ij} b_{ij},$$

where $A$ and $B$ are required to have the same shape.
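
In NumPy (a sketch), the `*` operator on arrays is exactly this element-wise product, distinct from the `@` matrix product:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[10.0, 20.0],
              [30.0, 40.0]])

print(A * B)   # Hadamard product: [[10., 40.], [90., 160.]]
print(A @ B)   # matrix product, a different operation
```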

Inverse and Transpose

The inverse of a square matrix $A \in \mathbb{R}^{n \times n}$ is the matrix $A^{-1}$ satisfying $A A^{-1} = A^{-1} A = I$. Not all matrices have an inverse, but if they do, the inverse is unique.

The transpose of a matrix is the matrix that's flipped across the diagonal. Specifically, if $B = A^\top$, we have $b_{ij} = a_{ji}$. Also, if $A = A^\top$, we call $A$ a symmetric matrix.

For matrix products, $(AB)^{-1} = B^{-1} A^{-1}$ and $(AB)^\top = B^\top A^\top$.
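
A NumPy sketch verifying these identities numerically (the matrices are arbitrary random invertible ones):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(3)))                 # A A^{-1} = I

print(np.allclose(np.linalg.inv(A @ B),
                  np.linalg.inv(B) @ np.linalg.inv(A)))  # (AB)^{-1} = B^{-1} A^{-1}
print(np.allclose((A @ B).T, B.T @ A.T))                 # (AB)^T = B^T A^T
```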

We can compute the inverse using row operations from Gaussian Elimination. Simplifying

$$\left[\, A \mid I \,\right] \;\longrightarrow\; \left[\, I \mid A^{-1} \,\right]$$

gives us the inverse. This immediately gives us the solution to the system $Ax = b$, namely $x = A^{-1} b$.
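
A minimal Gauss-Jordan sketch of this procedure: row-reduce the augmented matrix $[\,A \mid I\,]$ until the left half becomes $I$, at which point the right half is $A^{-1}$. The function name is made up for illustration, it assumes $A$ is invertible, and it is not a replacement for `np.linalg.inv`:

```python
import numpy as np

def inverse_by_row_reduction(A):
    """Row-reduce [A | I] to [I | A^{-1}] with partial pivoting."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])        # augmented matrix [A | I]
    for col in range(n):
        pivot = col + np.argmax(np.abs(M[col:, col]))  # choose the largest pivot
        M[[col, pivot]] = M[[pivot, col]]              # swap rows
        M[col] /= M[col, col]                          # scale pivot row to 1
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]         # eliminate the rest of the column
    return M[:, n:]                                    # right half is A^{-1}

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
b = np.array([1.0, 2.0])
A_inv = inverse_by_row_reduction(A)
print(np.allclose(A_inv, np.linalg.inv(A)))  # True
print(A_inv @ b)                             # x = A^{-1} b solves A x = b
```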

Moore-Penrose Pseudoinverse

If $A$ is not square but has linearly independent columns, we can use the Moore-Penrose pseudoinverse

$$A^+ = (A^\top A)^{-1} A^\top.$$
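
A sketch comparing this formula against NumPy's `pinv` for a tall matrix whose columns are (almost surely) linearly independent:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))                  # tall, full column rank

A_pinv = np.linalg.inv(A.T @ A) @ A.T            # (A^T A)^{-1} A^T
print(np.allclose(A_pinv, np.linalg.pinv(A)))    # True
print(np.allclose(A_pinv @ A, np.eye(3)))        # left inverse: A^+ A = I
```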

If $A$ doesn't have linearly independent columns, we add a small multiple of $I$ to $A^\top A$,

$$A^+ = \lim_{\varepsilon \to 0} \left( A^\top A + \varepsilon I \right)^{-1} A^\top,$$

which can be solved with 📎 Singular Value Decomposition $A = U \Sigma V^\top$,

$$A^+ = V \Sigma^+ U^\top,$$

where $\Sigma^+$ is calculated from $\Sigma$ by taking the reciprocal of its nonzero diagonal elements and then taking the transpose.
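
The same pseudoinverse built explicitly from the SVD, following the recipe above (reciprocal of the nonzero diagonal entries, then transpose); a sketch using NumPy's `svd`:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3))

U, s, Vt = np.linalg.svd(A, full_matrices=True)  # A = U Sigma V^T
Sigma = np.zeros((5, 3))
np.fill_diagonal(Sigma, s)

# Sigma^+: reciprocal of the nonzero diagonal entries, then transpose.
D = np.zeros_like(Sigma)
np.fill_diagonal(D, [1.0 / sv if sv > 1e-12 else 0.0 for sv in s])
Sigma_pinv = D.T                                 # now 3x5

A_pinv = Vt.T @ Sigma_pinv @ U.T                 # A^+ = V Sigma^+ U^T
print(np.allclose(A_pinv, np.linalg.pinv(A)))    # True
```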

Special Matrices

Symmetric Positive Semidefinite (PSD)

A matrix $A$ is symmetric positive semidefinite if, for vector space $V$ and all $x \in V$,

$$x^\top A x \geq 0.$$

If the inequality is strict for all $x \neq 0$, the matrix is symmetric positive definite. This property has a close connection with the 🎳 Inner Product.

Every matrix $A$ can form a PSD matrix $A^\top A$.
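
A sketch checking positive semidefiniteness numerically: the quadratic form $x^\top (A^\top A)\, x$ is non-negative for a random $x$, and equivalently the eigenvalues of $A^\top A$ are non-negative:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 3))
G = A.T @ A                                      # PSD matrix formed from any A

x = rng.standard_normal(3)
print(x @ G @ x >= 0)                            # quadratic form x^T G x >= 0
print(np.all(np.linalg.eigvalsh(G) >= -1e-12))   # eigenvalues >= 0 (up to round-off)
```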

Orthogonal

A square matrix $Q$ is orthogonal if and only if its columns are orthonormal, as defined by the inner product between columns. For an orthogonal matrix $Q$,

$$Q^\top Q = Q Q^\top = I,$$

which implies $Q^{-1} = Q^\top$.

Moreover, transformations by orthogonal matrices preserve lengths,

$$\lVert Q x \rVert_2 = \lVert x \rVert_2,$$

and angles,

$$(Q x)^\top (Q y) = x^\top y.$$
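
A sketch that builds an orthogonal $Q$ from a QR decomposition and checks the properties above: $Q^\top Q = I$, $Q^{-1} = Q^\top$, and preservation of lengths and inner products (hence angles):

```python
import numpy as np

rng = np.random.default_rng(4)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # Q has orthonormal columns

print(np.allclose(Q.T @ Q, np.eye(3)))            # Q^T Q = I
print(np.allclose(np.linalg.inv(Q), Q.T))         # Q^{-1} = Q^T

x = rng.standard_normal(3)
y = rng.standard_normal(3)
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # lengths preserved
print(np.isclose((Q @ x) @ (Q @ y), x @ y))                  # inner products preserved
```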

Phylogeny

The following covers a wider array of matrix types, arranged from top to bottom in a hierarchy.