Multiple regression models a response as a linear combination of several predictor variables. Because the predictor variables, and even the response, can be non-linear functions of the observations, the model is very versatile. We show how interactions, quadratic terms, categorical variables, sinusoidal functions, and past values of the response can be used as predictor variables.
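As a minimal sketch of this idea, the following constructs a design matrix whose columns include a quadratic term, a dummy variable for a two-level category, and an interaction, then fits the coefficients by least squares. The data and coefficient values are synthetic, chosen only for illustration.

```python
import numpy as np

# Sketch: the model is linear in the coefficients even though some
# columns of the design matrix are non-linear functions of the data.
rng = np.random.default_rng(1)
n = 50
x = rng.uniform(0, 4, n)        # continuous observation
d = rng.integers(0, 2, n)       # 0/1 indicator for a two-level category

# Noiseless synthetic response with quadratic and interaction effects
y = 2.0 + 3.0 * x - 0.5 * x**2 + 1.5 * d + 0.8 * x * d

# Columns: intercept, x, quadratic term, dummy variable, interaction
X = np.column_stack([np.ones(n), x, x**2, d, x * d])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)   # recovers [2.0, 3.0, -0.5, 1.5, 0.8]
```

With noiseless data the least-squares fit recovers the generating coefficients exactly, up to floating-point precision; with noisy data the same code returns estimates of them.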
Apart from prediction, regression models can provide empirical evidence that supports, or suggests, explanations of the relationships between the predictor variables and the response.
There are many variants of the standard multiple regression model. We show how to fit models that are not linear in the unknown coefficients using the principle of least squares. We consider two models for a discrete response, logistic regression and Poisson regression, both of which are examples of the generalized linear model. The generalized least squares algorithm is included as an exercise.
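To illustrate one of these generalized linear models, the following is a hedged sketch of fitting a Poisson regression with a log link by iteratively reweighted least squares. The data are synthetic, and in practice a library routine (for example statsmodels' GLM) would typically be used instead of this hand-rolled loop.

```python
import numpy as np

# Sketch: Poisson regression (GLM with log link) fitted by
# iteratively reweighted least squares on synthetic count data.
rng = np.random.default_rng(2)
n = 200
x = rng.uniform(0, 2, n)
X = np.column_stack([np.ones(n), x])
true_beta = np.array([0.5, 1.2])          # illustrative values
y = rng.poisson(np.exp(X @ true_beta))    # Poisson counts

beta = np.zeros(2)
for _ in range(25):                        # IRLS iterations
    mu = np.exp(X @ beta)                  # fitted means under log link
    W = mu                                 # Poisson working weights
    z = X @ beta + (y - mu) / mu           # working response
    WX = X * W[:, None]
    beta = np.linalg.solve(X.T @ WX, X.T @ (W * z))
print(beta)   # close to true_beta for a sample of this size
```

Each IRLS step is itself a weighted least-squares fit, which is why the generalized linear model sits naturally alongside the regression methods above.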