Chapter 3: The Multiple Linear Regression Model
Regression and Analysis of Variance
By Peter Goos and Ellen Vandervieren
Professor of Bio-Science Engineering
University of Leuven
Associate Professor in Statistics and Mathematics Education
University of Antwerp
A previous chapter of the book demonstrated that the simple linear regression model is very useful. In some cases, however, the simple linear model can be improved upon by considering additional explanatory variables. In other words, it is sometimes possible to explain more of the variation in the response variable by including more than one explanatory variable in the regression model. In this chapter, we show how to generalize the simple linear regression model, involving just one explanatory variable, to a multiple linear regression model, involving more than one explanatory variable.
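To fix ideas, the model this chapter develops can be sketched as follows: with k quantitative explanatory variables, the multiple linear regression model for observation i takes the form

```latex
Y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_k x_{ik} + \varepsilon_i,
\qquad i = 1, \ldots, n,
```

where the betas are unknown parameters and the epsilon terms are random errors. The exact notation and assumptions are introduced in the chapter itself; this preview is only meant to show how the single explanatory variable of the simple model is replaced by several.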
The focus in this chapter is on the use of multiple quantitative explanatory variables. It is possible to include qualitative explanatory variables in multiple regression models as well. We deal with qualitative (both nominal and ordinal) explanatory variables in Chapter 5.
Learning objectives of this chapter include:
- You understand the goal of multiple linear regression.
- You know how to interpret the parameters in a multiple linear regression model.
- You are familiar with the concepts of main effects, interaction effects and quadratic effects.
- You understand the principle of least squares regression and you can derive the least squares estimator for multiple linear regression.
- You are familiar with the assumptions behind the statistical inference in the context of multiple linear regression. You can derive the properties of the least squares estimator which form the basis for statistical inference.
- You can interpret the elements of a variance-covariance matrix.
- You know how to test the significance of the parameters in a multiple linear regression model.
- You know how to test hypotheses concerning one or more linear combinations of model parameters.
- You know how to make predictions using a multiple linear regression model.
- You can evaluate the quality of a multiple linear regression model using the (adjusted) coefficient of determination, various information criteria and the global F-test.
- You are familiar with the techniques to verify whether the assumptions behind the multiple linear regression model hold.
- You understand the mathematical derivations in the technical appendices.
- You are able to calculate the least squares regression model manually for small, well-structured data sets.
- You are able to compute the least squares regression model with the JMP software for any given data set.
- You can interpret the multiple linear regression output produced by JMP, including graphs such as the prediction profiler and the surface profiler.
- You can reconstruct most pieces of the JMP output for multiple linear regression.
- You know how to conduct tailor-made hypothesis tests for one or more linear combinations of model parameters in JMP.
- You can build an analysis of variance (ANOVA) table corresponding to a given multiple linear regression model.
- You can interpret residual plots made to check the model assumptions.
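Several of the objectives above concern the least squares estimator, which for a model matrix X and response vector y is the solution of the normal equations X'X b = X'y. As a minimal illustrative sketch (not the JMP workflow used in this book), the computation can be carried out with NumPy; the data set below is invented purely for illustration:

```python
import numpy as np

# Invented toy data: response y and two explanatory variables x1 and x2.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
y = np.array([6.1, 7.9, 12.2, 13.8, 18.0])

# Model matrix X: a column of ones for the intercept, then one column
# per explanatory variable.
X = np.column_stack([np.ones_like(x1), x1, x2])

# Least squares estimator: solve the normal equations (X'X) b = X'y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Fitted values and residuals.
y_hat = X @ beta_hat
residuals = y - y_hat

print("estimates (intercept, x1, x2):", beta_hat)
```

A defining property of the least squares fit, derived in the chapter, is that the residuals are orthogonal to every column of the model matrix, so X'e is a vector of zeros; this is a convenient manual check for small, well-structured data sets.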