Author: haroonkhan
-
WHAT IS STATISTICS?
A rather general answer to this question is that statistics is a group of methods used to collect, analyze, present, and interpret data (and possibly to make decisions). We often regard statistics as a branch of mathematics, but this is the result of a relatively recent tendency. From a historical perspective, the term “statistics” stems from…
-
Introduction
Some fundamental concepts of descriptive statistics, like frequencies, relative frequencies, and histograms, have been introduced informally. Here we want to illustrate and expand those concepts in a slightly more systematic way. Our treatment will be rather brief since, within this framework, descriptive statistics is essentially a tool for building some intuition, paving the way for what follows. We introduce basic…
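As a quick illustration of these concepts (not part of the original excerpt), the following Python sketch computes absolute and relative frequencies for a small, made-up sample; the sample values and variable names are illustrative assumptions.

```python
from collections import Counter

# A small, made-up sample of observed values (illustrative only).
sample = [2, 3, 3, 1, 2, 3, 4, 2, 2, 1]

n = len(sample)
frequencies = Counter(sample)                                  # absolute frequencies
rel_frequencies = {v: c / n for v, c in frequencies.items()}   # relative frequencies

for value in sorted(frequencies):
    print(f"value {value}: frequency {frequencies[value]}, "
          f"relative frequency {rel_frequencies[value]:.2f}")

# A histogram is essentially a bar plot of these (relative) frequencies;
# for binned data one would typically call matplotlib's plt.hist(sample).
```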
-
Integrals in multiple dimensions
Definite integrals have been introduced in Section 2.13 as a way to compute the area below the curve corresponding to the graph of a function of one variable. If we consider a function f(x, y) of two variables, there is no reason why we should not consider its surface plot and the volume below the surface, corresponding to a region D on…
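To make the idea concrete, here is a hedged Python sketch (my own example, not the book's): the volume below the surface f(x, y) = x*y over the rectangle [0, 1] × [0, 2], computed numerically with SciPy's dblquad; the integrand and the region are arbitrary choices.

```python
from scipy.integrate import dblquad

# Volume below the surface f(x, y) = x * y over the rectangle
# D = [0, 1] x [0, 2].  dblquad integrates func(y, x), with y as the
# inner variable, so the integrand is written with y first.
volume, abs_err = dblquad(lambda y, x: x * y,
                          0, 1,                 # x limits
                          lambda x: 0,          # lower y limit
                          lambda x: 2)          # upper y limit

print(volume)   # exact value is 1.0
```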
-
Partial derivatives: gradient and Hessian matrix
In Section 2.7 we defined the derivative of a function of a single variable as the limit of an increment ratio: \( f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} \). If we have a function of several variables, we may readily extend the concept above by considering a point and perturbing one variable at a time. We obtain the concept of a partial derivative with respect to a single…
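A minimal numerical sketch under assumptions of my own (the test function, the evaluation point, and the step sizes are arbitrary): perturbing one variable at a time with central finite differences yields approximations of the gradient and the Hessian matrix.

```python
import numpy as np

def f(x):
    # Illustrative function: f(x1, x2) = x1^2 * x2 + x2^3
    return x[0] ** 2 * x[1] + x[1] ** 3

def gradient(f, x, h=1e-6):
    # Perturb one variable at a time: central finite differences.
    g = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        e = np.zeros(len(x)); e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def hessian(f, x, h=1e-4):
    # Second-order partial derivatives, again by central differences.
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        ei = np.zeros(n); ei[i] = h
        for j in range(n):
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
    return H

x0 = np.array([1.0, 2.0])
print(gradient(f, x0))   # analytic gradient (2*x1*x2, x1^2 + 3*x2^2) = (4, 13)
print(hessian(f, x0))    # analytic Hessian [[2*x2, 2*x1], [2*x1, 6*x2]] = [[4, 2], [2, 12]]
```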
-
CALCULUS IN MULTIPLE DIMENSIONS
In this section we extend some concepts that we introduced in the previous sections concerning calculus for functions of one variable. What we really need for what follows is to get an intuitive idea of how some basic concepts are generalized when we consider a function of multiple variables, i.e., a function f(x1, x2, …, xn) = f(x) mapping a…
-
QUADRATIC FORMS
We explore the connections between linear algebra and calculus. This is necessary in order to generalize calculus concepts to functions of several variables; since any interesting management problem involves multiple dimensions, this is a worthy task. The simplest nonlinear function of multiple variables is arguably a quadratic form: \( f(\mathbf{x}) = \sum_{i=1}^{n} \sum_{j=1}^{n} q_{ij} x_i x_j \). Denoting the double sum compactly as \( \mathbf{x}^{\mathsf{T}} \mathbf{Q} \mathbf{x} \) is typically preferred to the explicit summation,…
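As a hedged check (the matrix Q and the vector x below are my own illustrative choices), the explicit double sum and the compact matrix form x^T Q x give the same value:

```python
import numpy as np

Q = np.array([[2.0, 1.0],
              [1.0, 3.0]])       # symmetric matrix of coefficients q_ij
x = np.array([1.0, -2.0])

# Explicit double sum: sum_i sum_j q_ij * x_i * x_j
double_sum = sum(Q[i, j] * x[i] * x[j]
                 for i in range(2) for j in range(2))

# Compact matrix notation: x^T Q x
matrix_form = x @ Q @ x

print(double_sum, matrix_form)   # both evaluate to 10.0
```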
-
EIGENVALUES AND EIGENVECTORS
In Section 3.4.3 we observed that a square matrix is a way to represent a linear mapping from the space of n-dimensional vectors to itself. Such a transformation, in general, entails both a rotation and a change of vector length. If the matrix is orthogonal, then the mapping is just a rotation. It may happen, for a specific vector v and…
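To see the definition at work, here is a short NumPy check (the matrix A is an arbitrary example of mine) that A v = λ v holds for every eigenpair returned by numpy.linalg.eig:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])              # an arbitrary square matrix

eigenvalues, eigenvectors = np.linalg.eig(A)

for k in range(len(eigenvalues)):
    lam = eigenvalues[k]
    v = eigenvectors[:, k]              # k-th eigenvector is the k-th column
    # For an eigenpair, applying the mapping A only rescales v by lambda,
    # with no change of direction.
    print(np.allclose(A @ v, lam * v))  # True for every eigenpair
```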
-
Determinant and matrix inversion
From a formal perspective, we may use matrix inversion to solve a system of linear equations: \( \mathbf{x} = \mathbf{A}^{-1} \mathbf{b} \). From a practical viewpoint, this is hardly advisable, as Gaussian elimination entails much less work. To see why, observe that one can find each column of the inverse matrix by solving the following system of linear equations: \( \mathbf{A} \mathbf{x}_j = \mathbf{e}_j \). Here, vector \( \mathbf{e}_j \) is a…
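A brief sketch of this observation (the matrix and right-hand side below are illustrative assumptions): each column of the inverse is obtained by solving one such system, and for a single system it is cheaper to call a Gaussian-elimination-based solver than to form the inverse explicitly.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
n = A.shape[0]
I = np.eye(n)

# Column j of the inverse solves the linear system A x = e_j,
# where e_j is the j-th column of the identity matrix.
inv_by_columns = np.column_stack(
    [np.linalg.solve(A, I[:, j]) for j in range(n)]
)
print(np.allclose(inv_by_columns, np.linalg.inv(A)))   # True

# To solve A x = b, prefer solve(A, b) over inv(A) @ b:
b = np.array([5.0, 5.0])
print(np.linalg.solve(A, b))
```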
-
DETERMINANT
The determinant of a square matrix is a function mapping square matrices into real numbers, and it is an important theoretical tool in linear algebra. Actually, it was investigated before the introduction of the matrix concept. In Section 3.2.3 we have seen that determinants can be used to solve systems of linear equations by Cramer’s rule. Another…
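As a small worked example (the numbers are mine, not from the text), Cramer's rule for a 2×2 system replaces one column of the coefficient matrix with the right-hand side and takes a ratio of determinants:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

det_A = np.linalg.det(A)

# Cramer's rule: x_j = det(A_j) / det(A), where A_j is A with its
# j-th column replaced by the right-hand side b.
x = np.empty(2)
for j in range(2):
    A_j = A.copy()
    A_j[:, j] = b
    x[j] = np.linalg.det(A_j) / det_A

print(x)                        # [1. 3.]
print(np.linalg.solve(A, b))    # same result
```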
-
Matrix rank
In this section we explore the link between a basis of a linear space and the possibility of finding a unique solution of a system of linear equations Ax = b, where \( \mathbf{A} \in \mathbb{R}^{m \times n} \), \( \mathbf{x} \in \mathbb{R}^{n} \), and \( \mathbf{b} \in \mathbb{R}^{m} \). Here, n is the number of variables and m is the number of equations; in most cases, we have m = n, but we may try to generalize a bit. Recall that…
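A short NumPy illustration under assumed example matrices: when a square A has full rank, Ax = b has a unique solution, while a rank-deficient matrix does not.

```python
import numpy as np

# Full-rank case: rank equals the number of variables, unique solution.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([1.0, 1.0])
print(np.linalg.matrix_rank(A))      # 2 -> columns form a basis of R^2
print(np.linalg.solve(A, b))         # unique solution

# Rank-deficient case: the second row is a multiple of the first.
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.matrix_rank(B))      # 1 -> B x = b has no unique solution
```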