Consider a matrix $\mathbf{A} \in \mathbb{R}^{m \times n}$. When we multiply a vector $\mathbf{x} \in \mathbb{R}^n$ by this matrix, we get a vector $\mathbf{y} = \mathbf{A}\mathbf{x} \in \mathbb{R}^m$. This suggests that a matrix is more than just an arrangement of numbers: it can be regarded as an operator mapping $\mathbb{R}^n$ to $\mathbb{R}^m$:
$$\mathbf{A} : \mathbb{R}^n \to \mathbb{R}^m, \qquad \mathbf{x} \mapsto \mathbf{A}\mathbf{x}.$$
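As a concrete illustration, here is a minimal NumPy sketch (the $2 \times 3$ matrix and the vector are arbitrary choices of ours) showing a rectangular matrix mapping a vector of $\mathbb{R}^3$ to a vector of $\mathbb{R}^2$:

```python
import numpy as np

# An arbitrary 2x3 matrix: it maps vectors of R^3 to vectors of R^2.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])

x = np.array([1.0, -1.0, 2.0])   # a vector in R^3
y = A @ x                        # the image y = A x, a vector in R^2

print(y)         # [-1.  5.]
print(y.shape)   # (2,)
```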
Given the rules of matrix algebra, it is easy to see that this mapping is linear, in the sense that:
$$\mathbf{A}(\alpha_1 \mathbf{x}_1 + \alpha_2 \mathbf{x}_2) = \alpha_1 \mathbf{A}\mathbf{x}_1 + \alpha_2 \mathbf{A}\mathbf{x}_2.$$
This means that the mapping of a linear combination is just a linear combination of the mappings. Many transformations of real-world entities are (at least approximately) linear.
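The linearity property is easy to check numerically; the following sketch (with coefficients, vectors, and a random matrix chosen arbitrarily for illustration) verifies that the image of a linear combination equals the linear combination of the images:

```python
import numpy as np

rng = np.random.default_rng(42)
A = rng.normal(size=(2, 3))            # an arbitrary 2x3 matrix
x1, x2 = rng.normal(size=3), rng.normal(size=3)
alpha1, alpha2 = 0.5, -2.0             # arbitrary scalars

lhs = A @ (alpha1 * x1 + alpha2 * x2)        # map the linear combination
rhs = alpha1 * (A @ x1) + alpha2 * (A @ x2)  # combine the mapped vectors

print(np.allclose(lhs, rhs))  # True
```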
If we restrict our attention to a square matrix $\mathbf{A} \in \mathbb{R}^{n \times n}$, we see that this matrix corresponds to some mapping from $\mathbb{R}^n$ to itself; in other words, it is a way to transform a vector. What is the effect of an operator transforming a vector with $n$ components into another vector in the same space? There are two possible effects:
- The length (norm) of the vector is changed.
- The vector is rotated.
In general, a transformation will have both effects, but there are specific cases in which only one of them occurs.
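For instance, the following small sketch (with an arbitrary $2 \times 2$ matrix of our own choosing) measures both effects at once: the change in norm, and the angle between the original and the transformed vector:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 1.0]])     # an arbitrary 2x2 matrix
v = np.array([1.0, 1.0])
w = A @ v                      # transformed vector

# Effect 1: the norm changes.
print(np.linalg.norm(v), np.linalg.norm(w))   # about 1.414 vs 3.162

# Effect 2: the vector is rotated (nonzero angle between v and w).
cos_angle = v @ w / (np.linalg.norm(v) * np.linalg.norm(w))
print(np.degrees(np.arccos(cos_angle)))       # about 26.6 degrees
```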
If, for some vector $\mathbf{v}$, the matrix $\mathbf{A}$ has only the first effect, this means that, for some scalar $\lambda$, we have
$$\mathbf{A}\mathbf{v} = \lambda \mathbf{v}.$$
A trivial case in which this happens is when $\mathbf{A} = \lambda \mathbf{I}$, i.e., the matrix is diagonal with all entries on the diagonal equal to $\lambda$ (so that the condition holds for every vector $\mathbf{v}$):
$$\mathbf{A} = \begin{bmatrix} \lambda & 0 & \cdots & 0 \\ 0 & \lambda & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda \end{bmatrix}.$$
Actually, this case is not that interesting, but we will see in Section 3.7 that the condition above may hold for specific scalars (called eigenvalues) and specific vectors (called eigenvectors).
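As a brief preview, the sketch below (the symmetric matrix is just an illustrative choice, and NumPy's generic eigenvalue routine is used) finds scalars and vectors satisfying $\mathbf{A}\mathbf{v} = \lambda \mathbf{v}$ for a matrix that is not a multiple of the identity:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # an arbitrary symmetric 2x2 matrix

eigvals, eigvecs = np.linalg.eig(A)  # columns of eigvecs are eigenvectors

lam = eigvals[0]
v = eigvecs[:, 0]
print(np.allclose(A @ v, lam * v))   # True: A v = lambda v
print(eigvals)                       # eigenvalues 3 and 1 (order may vary)
```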
If a matrix has only the effect of rotating a vector, then it does not change the norm of the vector:
$$\|\mathbf{A}\mathbf{v}\| = \|\mathbf{v}\|.$$
This happens if the matrix A has an important property.
DEFINITION 3.2 (Orthogonal matrix) A square matrix $\mathbf{P}$ is called an orthogonal matrix if $\mathbf{P}^T\mathbf{P} = \mathbf{I}$. Note that this property also implies that $\mathbf{P}^{-1} = \mathbf{P}^T$.
To understand the definition, consider each column vector $\mathbf{p}_j$ of matrix $\mathbf{P}$. The element in row $i$, column $j$ of $\mathbf{P}^T\mathbf{P}$ is just the inner product of $\mathbf{p}_i$ and $\mathbf{p}_j$. Hence, the definition above is equivalent to the following requirement:
$$\mathbf{p}_i^T \mathbf{p}_j = \begin{cases} 1, & i = j, \\ 0, & i \neq j. \end{cases}$$
In other words, the columns of $\mathbf{P}$ form a set of orthogonal vectors. To be more precise, we should say orthonormal, as the inner product of a column with itself is 1, but we will not be that rigorous. Now it is not difficult to see that an orthogonal matrix is a rotation matrix. To see why, let us check the norm of the transformed vector $\mathbf{y} = \mathbf{P}\mathbf{v}$:
$$\|\mathbf{y}\|^2 = \mathbf{y}^T\mathbf{y} = (\mathbf{P}\mathbf{v})^T(\mathbf{P}\mathbf{v}) = \mathbf{v}^T\mathbf{P}^T\mathbf{P}\mathbf{v} = \mathbf{v}^T\mathbf{I}\mathbf{v} = \|\mathbf{v}\|^2,$$
where we have used Property 3.1 for transposition of the product of matrices. Rotation matrices are important in multivariate statistical techniques such as principal component analysis and factor analysis.
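To make this concrete, here is a small sketch (the rotation angle and the test vector are arbitrary choices) that builds a $2 \times 2$ rotation matrix and checks both the defining property $\mathbf{P}^T\mathbf{P} = \mathbf{I}$ and the preservation of the norm:

```python
import numpy as np

theta = np.pi / 6                       # arbitrary rotation angle (30 degrees)
P = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # 2x2 rotation matrix

# Defining property of an orthogonal matrix: P^T P = I.
print(np.allclose(P.T @ P, np.eye(2)))  # True

# The transformation preserves the norm of any vector.
v = np.array([3.0, -4.0])
y = P @ v
print(np.linalg.norm(v), np.linalg.norm(y))  # 5.0  5.0 (up to rounding)
```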