From a formal perspective, we may use matrix inversion to solve a system of linear equations Ax = b by computing x = A⁻¹b.
From a practical viewpoint, this is hardly advisable, as Gaussian elimination entails much less work. To see why, observe that one can find each column of the inverse matrix by solving the following system of linear equations: Axⱼ = eⱼ, for j = 1, …, n.
Here, eⱼ is a vector whose elements are all zero, except for a 1 in position j: eⱼ = [0, 0, …, 0, 1, 0, …, 0]T. In other words, we should apply Gaussian elimination n times to find the inverse matrix.
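As a minimal illustration of this column-by-column approach, the following sketch builds the inverse one column at a time; the 3 × 3 matrix A is just an arbitrary example of mine, and np.linalg.solve stands in for hand-coded Gaussian elimination:

```python
import numpy as np

# Illustrative (assumed invertible) matrix; any nonsingular A would do.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
n = A.shape[0]

# The j-th column of the inverse solves A x_j = e_j.
A_inv = np.zeros((n, n))
for j in range(n):
    e_j = np.zeros(n)
    e_j[j] = 1.0
    A_inv[:, j] = np.linalg.solve(A, e_j)

# Sanity check: A times its inverse should be (numerically) the identity.
print(np.allclose(A @ A_inv, np.eye(n)))  # True
```

In a careful implementation, A would be factored once (e.g., by LU decomposition) and the factors reused for all n right-hand sides.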
There is another way to compute the inverse of a matrix, based on the cofactors we have defined above.
THEOREM 3.8 Let à be the matrix whose element (i, j) is the (j, i)th cofactor Cⱼᵢ of an invertible matrix A: ãᵢⱼ = Cⱼᵢ. Then A⁻¹ = Ã/det(A).
The matrix à is called the adjoint of A. In fact, Cramer’s rule (3.6) is a consequence of this theorem. Still, computing the inverse of a matrix is a painful process, and we may see why inverse matrices are not often computed explicitly. Nevertheless, the inverse matrix is conceptually relevant, and we should wonder whether we may characterize invertible matrices in some useful manner.
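As a hedged sketch of Theorem 3.8 (not code from the text), the function below builds the adjoint from cofactors and divides by the determinant; the helper name adjoint_inverse and the test matrix are my own choices, and the cost of computing all the minors is precisely why this route is painful in practice:

```python
import numpy as np

def adjoint_inverse(A):
    """Invert A via the adjoint matrix of Theorem 3.8 (didactic only)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    adj = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # Element (i, j) of the adjoint is the (j, i)-th cofactor:
            # (-1)^(j+i) times the determinant of A with row j and column i deleted.
            minor = np.delete(np.delete(A, j, axis=0), i, axis=1)
            adj[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return adj / np.linalg.det(A)

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
print(adjoint_inverse(A))                                  # [[ 3. -1.]
                                                           #  [-5.  2.]]
print(np.allclose(adjoint_inverse(A), np.linalg.inv(A)))   # True
```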
THEOREM 3.9 A square matrix A is invertible (nonsingular) if and only if det(A) ≠ 0.
This should not come as a surprise, considering that when computing an inverse matrix by the adjoint matrix or when solving a system of linear equations by Cramer’s rule, we divide by det(A).
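A small sketch, with a deliberately singular example matrix of my own, illustrates the singular case of Theorem 3.9: the determinant vanishes and the inversion routine refuses to proceed:

```python
import numpy as np

# Singular by construction: the second row is twice the first.
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(np.linalg.det(S))  # 0.0 (up to rounding): S is not invertible

try:
    np.linalg.inv(S)
except np.linalg.LinAlgError as err:
    print("inversion failed:", err)  # "Singular matrix"
```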
Example 3.12 Using Theorem 3.8, we may prove the validity of a handy rule to invert a 2 × 2 matrix: if A = [a, b; c, d] (rows separated by semicolons) and det(A) = ad − bc ≠ 0, then A⁻¹ = (1/(ad − bc)) [d, −b; −c, a].
In plain terms, to invert a 2 × 2 matrix, we must do the following (a numerical check is sketched after the list):
- Swap the two elements on the diagonal.
- Change the sign of the other two elements.
- Divide by the determinant.
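Here is a minimal check of this rule, with an example matrix of my own and a hypothetical helper inv_2x2, against NumPy's general-purpose inverse:

```python
import numpy as np

def inv_2x2(A):
    """Invert a 2 x 2 matrix with the swap/negate/divide rule of Example 3.12."""
    a, b = A[0, 0], A[0, 1]
    c, d = A[1, 0], A[1, 1]
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    # Swap the diagonal elements, change the sign of the off-diagonal
    # elements, and divide everything by the determinant.
    return np.array([[d, -b],
                     [-c, a]]) / det

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
print(inv_2x2(A))                                  # [[ 3. -1.]
                                                   #  [-5.  2.]]
print(np.allclose(inv_2x2(A), np.linalg.inv(A)))   # True
```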
Properties of the determinant
The determinant enjoys several useful properties, which we list here without proof; a numerical spot-check of some of them follows the list.
- For any square matrix A, det(AT) = det(A).
- If two rows (or columns) of A are equal, then det(A) = 0.
- If matrix A has an all-zero row (or column), then det(A) = 0.
- If we multiply the entries of a row (or column) in matrix A by a scalar α to obtain matrix B, then det(B) = α det(A).
- The determinant of a lower triangular, upper triangular, or diagonal matrix is the product of entries on the diagonal.
- The determinant of the product of two square matrices A and B of the same size is the product of the determinants: det(AB) = det(A) det(B).
- If A is invertible, then det(A⁻¹) = 1/det(A).
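The following spot-check exercises a few of these properties on random matrices (drawn with a seed of my choosing; a random Gaussian matrix is invertible with probability one, which the last line implicitly assumes):

```python
import numpy as np

rng = np.random.default_rng(42)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
alpha = 2.5
det = np.linalg.det

print(np.isclose(det(A.T), det(A)))                      # det(A^T) = det(A)
C = A.copy()
C[1, :] *= alpha                                         # scale one row by alpha
print(np.isclose(det(C), alpha * det(A)))                # det scales by the same factor
print(np.isclose(det(A @ B), det(A) * det(B)))           # det(AB) = det(A) det(B)
print(np.isclose(det(np.linalg.inv(A)), 1.0 / det(A)))   # det of the inverse
```

All four prints are expected to show True.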