Math 50 Linear regression modeling

Unit 4

Linear regression with multiple predictors

This unit covers linear regression with multiple predictors. Many new ideas emerge when we add more predictors. Most importantly, regression coefficients must now be interpreted as average differences in the outcome with all other predictors held fixed. The relationships among the predictors therefore play a very important role, and after this unit you should understand how adding or removing a predictor influences the remaining coefficients. We will talk about the joint sampling distribution of the regression coefficients and about collinearity. Emphasis will be placed on different ways of understanding these ideas (linear algebra versus geometry) and, as always, on experimentation with numerical simulations. We will also introduce categorical predictors, which allow us to handle qualitative data, and discuss a common method, analysis of variance, as a regression model.
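
As a quick illustration of why coefficients must be read with the other predictors held fixed, here is a minimal numerical simulation. It is a sketch only: it assumes Python with numpy, and the variable names and coefficient values are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Two correlated predictors (illustrative values chosen for this sketch)
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(scale=0.6, size=n)   # x2 overlaps with x1
y = 1.0 * x1 + 2.0 * x2 + rng.normal(size=n)    # true coefficients: 1 and 2

# Model A: regress y on x1 alone
XA = np.column_stack([np.ones(n), x1])
slope_alone = np.linalg.lstsq(XA, y, rcond=None)[0][1]

# Model B: regress y on both predictors
XB = np.column_stack([np.ones(n), x1, x2])
slope_adjusted = np.linalg.lstsq(XB, y, rcond=None)[0][1]

print(slope_alone)     # about 2.6: x1's slope absorbs part of x2's effect
print(slope_adjusted)  # about 1.0: average difference in y per unit of x1, with x2 held fixed
```

The gap between the two slopes comes entirely from the correlation between x1 and x2; changing the 0.8 above to 0 makes the two fits agree.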

Material:

Concepts

Regression with multiple predictors, covariance matrix, Simpson’s paradox, collinearity, $F$-tests, categorical predictors and dummy variables, analysis of variance

Things to practice

  • Derive the linear system that relates the regression coefficients to the covariances (generalizing what I do in class); see the covariance sketch after this list
  • Interpret the output of a fitted model with multiple predictors (including qualitative predictors)
  • Predict how adding a predictor to a model will change the regression coefficient of an existing predictor (given knowledge of how the predictors are related)
  • Identify symptoms of collinearity
  • Translate ANOVA to regression language; see the ANOVA sketch after this list
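
The covariance sketch below is a minimal numerical check (assuming Python with numpy; the data and names are made up) that the slopes solve a linear system built from sample covariances: the covariance matrix of the predictors times the slope vector equals the vector of covariances between each predictor and the response.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Three correlated predictors and a response (illustrative values for this sketch)
X = rng.normal(size=(n, 3)) @ np.array([[1.0, 0.3, 0.0],
                                        [0.0, 1.0, 0.5],
                                        [0.0, 0.0, 1.0]])
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(size=n)

# Covariance form of the normal equations: Cov(X) @ slopes = Cov(X, y)
C = np.cov(X, y, rowvar=False)                  # covariance matrix of (x1, x2, x3, y)
slopes_cov = np.linalg.solve(C[:3, :3], C[:3, 3])

# Ordinary least squares with an intercept, for comparison
X1 = np.column_stack([np.ones(n), X])
slopes_ols = np.linalg.lstsq(X1, y, rcond=None)[0][1:]

print(np.allclose(slopes_cov, slopes_ols))      # True: the two solutions agree
```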
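
The ANOVA sketch below illustrates the translation in the last practice item: a one-way ANOVA is a regression on dummy variables, and its $F$ statistic equals the regression $F$-test comparing the dummy-variable model to the intercept-only model. This assumes Python with numpy and scipy; the group sizes and means are made up for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Three groups with different means (illustrative values for this sketch)
groups = [rng.normal(loc=mu, size=30) for mu in (0.0, 0.5, 1.0)]
y = np.concatenate(groups)
labels = np.repeat([0, 1, 2], 30)

# Regression formulation: intercept plus dummy (indicator) variables for groups 1 and 2
X = np.column_stack([np.ones_like(y),
                     (labels == 1).astype(float),
                     (labels == 2).astype(float)])
beta, rss_full, *_ = np.linalg.lstsq(X, y, rcond=None)
rss_full = float(rss_full[0])                      # residual sum of squares, full model
rss_null = float(np.sum((y - y.mean()) ** 2))      # intercept-only model

# F-test comparing the two models: drop q = 2 dummies, n - p residual degrees of freedom
n, q, p = y.size, 2, 3
F_reg = ((rss_null - rss_full) / q) / (rss_full / (n - p))

F_anova, _ = stats.f_oneway(*groups)
print(np.isclose(F_reg, F_anova))                  # True: the ANOVA F equals the regression F
```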

Wikipedia References