This unit covers linear regression with multiple predictors. Many new ideas emerge when we add more predictors. Importantly, each regression coefficient must be interpreted as an average difference with all other predictors held fixed. The relationships among the predictors therefore play an important role, and after this unit you should understand how adding or removing a predictor changes the remaining coefficients. We will talk about the joint sampling distribution of the regression coefficients and about collinearity. Emphasis will be placed on different ways of understanding these ideas (linear algebra vs. geometry) and, as always, on experimentation with numerical simulations. We will also introduce categorical predictors, which allow us to handle qualitative data, and show how a common method, analysis of variance, can be framed as a regression model.
Material:
Regression with multiple predictors, covariance matrix, Simpson’s paradox, collinearity, $F$-tests, categorical predictors and dummy variables, analysis of variance
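
As a preview of the kind of numerical experiment this unit has in mind, the sketch below simulates the joint sampling distribution of two regression coefficients when the predictors are correlated. The sample size, correlation level, and true coefficients are illustrative assumptions, not values from the course; the point is only that stronger collinearity inflates the spread of the estimates and makes them negatively correlated with each other.

```python
# Minimal simulation sketch (illustrative settings, not course data):
# repeatedly fit y = b0 + b1*x1 + b2*x2 + noise with corr(x1, x2) = rho
# and look at the resulting spread and correlation of (b1_hat, b2_hat).
import numpy as np

rng = np.random.default_rng(0)

def simulate_coefs(rho, n=100, n_sim=2000, beta=(1.0, 2.0, -1.0)):
    """Return n_sim simulated (b1_hat, b2_hat) least-squares estimates."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    coefs = np.empty((n_sim, 2))
    for s in range(n_sim):
        X = rng.multivariate_normal([0.0, 0.0], cov, size=n)
        y = beta[0] + X @ np.array(beta[1:]) + rng.normal(size=n)
        Xd = np.column_stack([np.ones(n), X])       # design matrix with intercept
        b = np.linalg.lstsq(Xd, y, rcond=None)[0]   # least-squares fit
        coefs[s] = b[1:]
    return coefs

for rho in (0.0, 0.9):
    c = simulate_coefs(rho)
    print(f"rho = {rho}: sd(b1) = {c[:, 0].std():.3f}, "
          f"sd(b2) = {c[:, 1].std():.3f}, corr(b1, b2) = {np.corrcoef(c.T)[0, 1]:.2f}")
```

Running this should show roughly similar coefficient standard deviations but a strongly negative correlation between the two estimates once `rho` is large, which is the linear-algebra fact $\operatorname{Var}(\hat\beta) = \sigma^2 (X^\top X)^{-1}$ seen through simulation.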