Category: Advanced Regression Models
-
Polynomial regression
A good starting point is polynomial regression. When facing a clearly nonlinear data pattern, like the one in Fig. 16.3(a), we may try to come up with a suitable approximation of the nonlinear function relating the variables. In principle, polynomials provide us with an arbitrary degree of flexibility. Let us take a closer look at a model of polynomial…
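As a concrete sketch (with synthetic data, not the dataset behind Fig. 16.3), fitting polynomials of increasing degree by least squares shows how a quadratic term captures curvature that a straight line misses:

```python
import numpy as np

# Illustrative example: data generated from a quadratic function plus noise
# (the coefficients below are assumptions for this sketch).
rng = np.random.default_rng(42)
x = np.linspace(-3, 3, 50)
y = 1.0 + 2.0 * x - 0.5 * x**2 + rng.normal(0, 0.3, size=x.size)

# np.polyfit returns least-squares coefficients, highest degree first.
coef1 = np.polyfit(x, y, deg=1)   # straight line: underfits the curvature
coef2 = np.polyfit(x, y, deg=2)   # quadratic: matches the generating model

def sse(coef):
    """Sum of squared residuals of a fitted polynomial."""
    return float(np.sum((np.polyval(coef, x) - y) ** 2))

print(sse(coef1), sse(coef2))  # the quadratic fit has the smaller error
```

Note that raising the degree always reduces the in-sample error, which is exactly why this flexibility must be balanced against the risk of overfitting.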
-
A GLANCE AT NONLINEAR REGRESSION
Logistic regression introduces a nonlinear transformation to account for the qualitative nature of the response variable. But even when considering a quantitative response, we may be forced to account for nonlinearity. Figure 16.3 shows two examples. Given these introductory examples, we should be convinced that sometimes a nonlinear regression model is warranted. Unfortunately, this may be a sort…
-
A digression: logit and probit choice models
The concepts behind logistic regression and the logit function have also been proposed as a tool to model brand choice in marketing applications. Since choice models are a good way to see integrated use of decision and statistical models, we outline the approach in this section. Consider an individual who chooses between two brands. Ideally,…
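A minimal sketch of the resulting binary logit choice probability, assuming each brand has a deterministic utility component v (the function name and utility values below are illustrative assumptions, not taken from the text):

```python
import math

def logit_choice_prob(v_a: float, v_b: float) -> float:
    """Binary logit: P(choose A) = exp(v_a) / (exp(v_a) + exp(v_b)),
    equivalently the logistic function of the utility difference v_a - v_b."""
    return 1.0 / (1.0 + math.exp(-(v_a - v_b)))

# The brand with the higher deterministic utility is chosen more often,
# but not with certainty, because of the random utility component.
print(logit_choice_prob(1.0, 0.0))   # about 0.731
print(logit_choice_prob(0.0, 0.0))   # 0.5: indifference between equal utilities
```

Only the utility difference matters, which is the characteristic feature of binary logit models.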
-
LOGISTIC REGRESSION
Consider the following questions: All of these questions could be addressed by building a statistical model, possibly a regression model, but they have a troubling feature in common: The response variable is either 0 or 1, where we interpret 0 as “it did not happen” and 1 as “it happened.” So far, we have considered…
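For such a 0/1 response, a plain linear model is inappropriate. A sketch of fitting a logistic regression by gradient ascent on the average log-likelihood follows (synthetic data; the parameter values and learning rate are assumptions for the example, and a real analysis would rely on a statistical package):

```python
import numpy as np

# Generate a binary response whose success probability follows a logistic
# model with (assumed) intercept -0.5 and slope 2.0.
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
p_true = 1.0 / (1.0 + np.exp(-(-0.5 + 2.0 * x)))
y = rng.binomial(1, p_true)                # response is either 0 or 1

X = np.column_stack([np.ones(n), x])       # design matrix with intercept
beta = np.zeros(2)
for _ in range(3000):
    mu = 1.0 / (1.0 + np.exp(-X @ beta))   # fitted probabilities
    beta += 0.1 * X.T @ (y - mu) / n       # gradient of the average log-likelihood

print(beta)  # roughly (-0.5, 2.0), up to sampling error
```

The gradient has the same form as in OLS, residuals times regressors, but the residuals are differences between the 0/1 outcomes and fitted probabilities.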
-
Using regression for forecasting and explanation purposes
We have seen that omitting variables may result in biased estimates, or even in debatable models where significant coefficients are associated with regressor variables that may even have no real impact on the response variable. However, a rather cynical point of view could be that, as long as the model does a good job at…
-
Testing a multiple regression model
To investigate the statistical validity of a multiple regression model, the first step is to check the variance of the estimators. In this case, we have multiple estimators, so we should check their covariance matrix. Using Eq. (16.4), we see that Cov(b̂) = σ²(XᵀX)⁻¹. The familiar assumptions about errors, in the multivariate case, can be expressed as E[ε] = 0 and Cov(ε) = σ²I, i.e., the…
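The standard OLS result Cov(b̂) = s²(XᵀX)⁻¹, with s² the unbiased estimate of the residual variance, can be sketched on synthetic data as follows (the data and dimensions are illustrative assumptions):

```python
import numpy as np

# Synthetic regression with an intercept and k = 2 regressors.
rng = np.random.default_rng(1)
n, k = 100, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta_true = np.array([1.0, 2.0, -1.0])
y = X @ beta_true + rng.normal(0, 0.5, size=n)

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y               # OLS coefficient estimates
resid = y - X @ b
s2 = resid @ resid / (n - k - 1)    # unbiased estimate of the error variance
cov_b = s2 * XtX_inv                # estimated covariance matrix of b
se = np.sqrt(np.diag(cov_b))        # standard errors of the coefficients
print(se)
```

The square roots of the diagonal entries are the standard errors reported by statistical software, on which t-tests for the individual coefficients are based.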
-
Selecting explanatory variables: collinearity
When selecting variables, there are a few issues and tradeoffs involved. At first sight, we should aim at finding the model with the largest R² coefficient and, arguably, the more variables we include, the better the model we obtain. However, the following examples show that subtle difficulties may be encountered. Example 16.1 (Omitted variables and bias) Let us consider the sample…
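One way to see the effect of collinearity is to compare the relevant diagonal entry of (XᵀX)⁻¹, which scales the variance of a coefficient estimator, when a second regressor is unrelated to the first versus nearly a copy of it. The following sketch uses synthetic data (all values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200
x1 = rng.normal(size=n)
x2_indep = rng.normal(size=n)                 # unrelated to x1
x2_coll = x1 + rng.normal(0, 0.05, size=n)    # almost a copy of x1

def var_factor(x2):
    """Diagonal entry of (X'X)^{-1} for x1's coefficient: proportional
    to the variance of its OLS estimator."""
    X = np.column_stack([np.ones(n), x1, x2])
    return np.diag(np.linalg.inv(X.T @ X))[1]

# With a nearly collinear regressor, the variance factor is inflated by
# orders of magnitude, making the coefficient estimate very unreliable.
print(var_factor(x2_indep), var_factor(x2_coll))
```

This is the rationale behind variance inflation factors: collinearity does not bias the estimates, but it can make their variances explode.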
-
BUILDING, TESTING, AND USING MULTIPLE LINEAR REGRESSION MODELS
The least-squares approach to estimating parameters of a multiple regression model is a fairly straightforward extension of simple linear regression. What is not so easy is the extension of the statistical testing procedures, which present more variants when multiple variables are involved. Nevertheless, the necessary intuition for understanding what commercially available statistical software tools offer…
-
MULTIPLE LINEAR REGRESSION BY LEAST SQUARES
Running a linear regression with multiple explanatory variables is a rather straightforward extension of what we did for simple linear regression, especially if we assume fixed, deterministic values of the regressors. The underlying statistical model is Y = β₀ + β₁x₁ + ⋯ + β_q x_q + ε. We avoid using α to denote the constant term, so that we may group the parameters into a vector. The model is estimated on the basis…
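On synthetic data, the least-squares estimate b = (XᵀX)⁻¹Xᵀy can be obtained by solving the normal equations directly (a sketch; the coefficient values are illustrative assumptions):

```python
import numpy as np

# Synthetic data from an assumed model y = 3 + 1.5*x1 - 2*x2 + noise.
rng = np.random.default_rng(3)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 3.0 + 1.5 * x1 - 2.0 * x2 + rng.normal(0, 0.1, size=n)

# Stack the regressors, with a column of ones for the constant term,
# and solve the normal equations (X'X) b = X'y.
X = np.column_stack([np.ones(n), x1, x2])
b = np.linalg.solve(X.T @ X, X.T @ y)
print(b)  # approximately [3.0, 1.5, -2.0]
```

In practice one would use a numerically safer routine such as `np.linalg.lstsq`, but the normal equations make the structure of the OLS solution explicit.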
-
Introduction
We extend the simple linear regression concepts introduced earlier. The first quite natural idea is building a linear regression model involving more than one regressor. Finding the parameters by ordinary least squares (OLS) is a rather straightforward exercise, as we see in Section 16.1. What is much less straightforward is the statistical side of the…