LINEAR SPACES

In the previous sections, we introduced vectors and matrices and defined an algebra for working with them. Now we try to gain a deeper understanding by taking a more abstract view and introducing linear spaces. To prepare for that, let us emphasize a few relevant concepts:

  • Linear combinations. We have defined linear combinations of vectors, but in fact we are not constrained to take linear combinations of finite-dimensional objects like tuples of numbers. For instance, we might work with infinite-dimensional spaces of functions. To see a concrete example, consider monomial functions like mₖ(x) ≡ xᵏ. If we take a linear combination of monomials up to degree n, we get a polynomial:

    $$p(x) = \sum_{k=0}^{n} \alpha_k x^k = \alpha_0 + \alpha_1 x + \alpha_2 x^2 + \cdots + \alpha_n x^n.$$

    We might represent the polynomial of degree n by the vector of its coefficients:

    $$[\alpha_0, \alpha_1, \alpha_2, \ldots, \alpha_n]^{\mathsf{T}}.$$

    Note that we are basically expressing the polynomial using “coordinates” in a reference system represented by monomials. However, we could express the very same polynomial using different building blocks. For instance, how can we express the polynomial p(x) = 2 − 3x + 5x² as a linear combination of e₁(x) = 1, e₂(x) = 1 − x, and e₃(x) = 1 + x + x²? We can ask a similar question with vectors. Given the vector v = [1, 2]ᵀ, how can we express it as a linear combination of the vectors u₁ = [1, 0]ᵀ and u₂ = [1, 1]ᵀ? (Both questions are worked out in the first sketch after this list.) More generally, we have seen that solving a system of linear equations entails expressing the right-hand side as a linear combination of the columns of the coefficient matrix. But when are our building blocks rich enough to represent anything we want? This question leads us to concepts such as linear independence and the basis of a linear space. We will introduce them for simple, finite-dimensional spaces, but they are much more general and far-reaching. Indeed, if we consider polynomials of degree only up to n, we are dealing with a finite-dimensional space, but this is not the case when dealing with general functions.¹⁶
  • Linear mappings. We have seen that when we multiply a vector u ∈ ℝⁿ by a square n × n matrix A, we get another vector v ∈ ℝⁿ, and that matrix multiplication can therefore be regarded as a mapping f from the space of n-dimensional vectors to itself: v = f(u) = Au. Moreover, this mapping is linear in the sense of Eq. (3.10). Viewing matrix multiplication in this way lets us regard the solution of a system of linear equations as a problem of function inversion: if the image of the linear mapping represented by A is the whole space ℝⁿ, then there must be (at least) one vector x such that b = Ax. Put another way, the set of columns of matrix A should be rich enough to express b as a linear combination of them (see the second sketch after this list).
  • Linear spaces. Consider the space ℝⁿ of n-dimensional vectors. On that space, we have defined elementary operations, namely addition and multiplication by a scalar, which allow us to take linear combinations. If we take an arbitrary linear combination of vectors in that space, we get another vector in ℝⁿ. By the same token, a linear combination of polynomials of degree not larger than n yields another such polynomial (see the third sketch below). A linear space is a set, equipped with the operations above, that is closed under linear combinations.¹⁷
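
To make the change-of-coordinates questions in the first item concrete, here is a minimal sketch in Python with NumPy (a library choice made for this illustration, not something the text assumes). In both cases, finding the new coordinates amounts to solving a small linear system whose columns are the building blocks:

```python
import numpy as np

# Vector case: express v = [1, 2]^T in terms of u1 = [1, 0]^T and
# u2 = [1, 1]^T. The coordinates c solve U c = v, where the columns
# of U are u1 and u2.
U = np.array([[1.0, 1.0],
              [0.0, 1.0]])
v = np.array([1.0, 2.0])
c = np.linalg.solve(U, v)
print(c)  # [-1.  2.], i.e., v = -u1 + 2*u2

# Polynomial case: express p(x) = 2 - 3x + 5x^2 in terms of
# e1 = 1, e2 = 1 - x, e3 = 1 + x + x^2. Each basis polynomial
# becomes a column of monomial coefficients (constant, x, x^2),
# and we solve E c = p for the new coordinates.
E = np.array([[1.0,  1.0, 1.0],   # constant terms of e1, e2, e3
              [0.0, -1.0, 1.0],   # x coefficients
              [0.0,  0.0, 1.0]])  # x^2 coefficients
p = np.array([2.0, -3.0, 5.0])
c = np.linalg.solve(E, p)
print(c)  # [-11.   8.   5.], i.e., p = -11*e1 + 8*e2 + 5*e3
```

Both solves succeed because the building blocks are linearly independent; with a singular matrix, np.linalg.solve would raise an error, signaling that the right-hand side may not be representable.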
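
The second item views v = Au as a mapping and the solution of Ax = b as inverting that mapping. The following sketch (again NumPy, with an arbitrary random matrix standing in for A) checks linearity numerically and recovers a preimage when the columns of A span the whole space:

```python
import numpy as np

rng = np.random.default_rng(42)
A = rng.normal(size=(3, 3))  # a square matrix defining f(u) = A @ u
b = rng.normal(size=3)

# Linearity: f(alpha*u + beta*w) == alpha*f(u) + beta*f(w).
u, w = rng.normal(size=3), rng.normal(size=3)
lhs = A @ (2.0 * u + 3.0 * w)
rhs = 2.0 * (A @ u) + 3.0 * (A @ w)
print(np.allclose(lhs, rhs))  # True

# Function inversion: find x with A x = b, which is possible for any b
# when the columns of A span the whole space (A is nonsingular).
x = np.linalg.solve(A, b)
print(np.allclose(A @ x, b))  # True
```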
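
Finally, the closure property in the third item can be seen by identifying each polynomial of degree not larger than n with its coefficient vector: a linear combination of polynomials is the same linear combination of their coefficient vectors, so the result never leaves the space. A tiny sketch, with polynomials chosen here purely for illustration:

```python
import numpy as np

# Coefficient vectors (constant, x, x^2) for polynomials of degree <= 2.
p = np.array([2.0, -3.0, 5.0])   # 2 - 3x + 5x^2
q = np.array([1.0,  0.0, -1.0])  # 1 - x^2
r = 4.0 * p - 2.0 * q            # coefficients of 4p - 2q
print(r)  # [  6. -12.  22.], i.e., 6 - 12x + 22x^2, still of degree <= 2
```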

Linear algebra is the study of linear mappings between linear spaces. Before embarking on a study of linear mappings and linear spaces, we consider a motivating example that generalizes the option pricing model of Section 3.1.

