I explain here how multiple linear regression differs from what we learned in the previous chapter. It’s easy stuff, but it’s also important to make sure you understand exactly how the variables in this chapter differ from those in simple linear regression (there are several independent variables instead of just one!).
Most exam questions from this chapter will include a computer output of the solution, but very often there will be some values missing. I'll show you here how to play "Fill in the blanks", multiple-regression style. This is probably the most common exam question from this chapter.
In a multiple regression model we have more than one independent variable, and so there is more than one slope. Should we test each one individually? Watch this lesson where I explain why the F-test needs to be introduced to replace multiple t-tests.
The sample regression equation is determined here using the output provided in the question.
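If you'd like to see where the coefficients in that output come from, here is a minimal sketch using numpy's least-squares solver. The data are hypothetical (constructed so the fit works out to round numbers), not from any actual exam question:

```python
import numpy as np

# Hypothetical data: y with two independent variables x1 and x2,
# built so that y = 1 + 1.5*x1 + 0.5*x2 exactly.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([3.5, 4.5, 7.5, 8.5, 11.5, 12.5])

# Design matrix with a column of ones for the intercept b0.
X = np.column_stack([np.ones_like(x1), x1, x2])

# Least-squares coefficient estimates [b0, b1, b2].
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(b, 3))  # approximately [1.0, 1.5, 0.5]
```

On an exam you would simply read these coefficients off the printed output; the point here is just that they are the least-squares solution.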
In part (b) the question asks: is the model valid, and why? To answer this, we make use of the ANOVA table provided in the output. This is the F-test of validity. I go through all of the steps required for full marks in a hypothesis test, including writing the hypotheses, looking up the rejection region in the F-tables, and using the test statistic to decide whether or not to reject the null.
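The arithmetic behind the F statistic can be sketched in a few lines. The sums of squares and sample sizes below are hypothetical placeholders, not values from any real output:

```python
# Hypothetical ANOVA-table values (not from a real question):
SSR, SSE = 120.0, 30.0   # regression and error sums of squares
k, n = 2, 25             # number of predictors, number of observations

MSR = SSR / k            # mean square regression
MSE = SSE / (n - k - 1)  # mean square error
F = MSR / MSE            # test statistic for H0: all slopes = 0
print(round(F, 1))  # 44.0
```

You would then compare F against the critical value from the F-tables with k and n − k − 1 degrees of freedom; a test statistic in the rejection region means we reject the null and conclude the model is valid.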
In many exam questions, you won’t be asked directly for the coefficient of determination. You’ll need to rely on your understanding of what it measures to know that it’s what you’re being asked to report. I discuss here what the coefficient of determination and the adjusted coefficient of determination mean in a multiple regression context.
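Both quantities come straight from the ANOVA sums of squares. A quick sketch, again with hypothetical numbers:

```python
# Hypothetical sums of squares (same placeholders as the F-test example):
SSR, SSE = 120.0, 30.0
SST = SSR + SSE          # total sum of squares
k, n = 2, 25             # predictors, observations

r2 = SSR / SST                                        # coefficient of determination
adj_r2 = 1 - (SSE / (n - k - 1)) / (SST / (n - 1))    # adjusted for model size
print(round(r2, 3), round(adj_r2, 3))  # 0.8 0.782
```

The adjusted version penalizes adding predictors, which is why it's the one to quote when comparing models with different numbers of independent variables.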
Even though the F-test replaces performing multiple t-tests to determine whether or not a model is valid, the t-tests still have an important function. They are used to tell us WHICH of the independent variables are related to y.
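Each individual t statistic is just the coefficient divided by its standard error, both of which are printed in the output. A one-line sketch with hypothetical values:

```python
# Hypothetical slope estimate and its standard error from the output:
b1, se_b1 = 2.5, 0.5

t = b1 / se_b1   # test statistic for H0: beta1 = 0
print(t)  # 5.0
```

Compare each t against the critical value with n − k − 1 degrees of freedom; slopes whose t-tests reject the null are the variables linearly related to y.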
Interpreting the coefficients (intercept, slope 1, slope 2, etc.) requires one simple change compared with the interpretations made in simple linear regression.
Individual tests of slope are used on three independent variables to see which are ‘important’ to the model.
Regression equations are built with one purpose: making predictions! First, we build a regression equation from the coefficients listed in the output. Next, the question setup gives us values that we can plug into our equation in order to make a prediction.
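Plugging values into the fitted equation looks like this. The coefficients and the x-values below are hypothetical, standing in for whatever the question's output and setup provide:

```python
# Hypothetical fitted equation: y-hat = 1.0 + 1.5*x1 + 0.5*x2
b0, b1, b2 = 1.0, 1.5, 0.5

# Values supplied in the question setup (hypothetical):
x1, x2 = 4.0, 3.0

y_hat = b0 + b1 * x1 + b2 * x2
print(y_hat)  # 8.5
```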
Multicollinearity is new in this chapter. It’s a violation of one of the basic assumptions made when performing linear regression: that the independent variables are truly independent of each other. There are many ways to check for multicollinearity; the most likely method to appear on an exam is shown here: the correlation matrix.
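A correlation matrix is easy to produce from the raw data. In this sketch the three variables are hypothetical, with x2 deliberately built to track x1 closely so the problem shows up:

```python
import numpy as np

# Hypothetical data for three independent variables.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.1, 3.9, 6.2, 7.8, 10.1])  # roughly 2*x1: strongly correlated
x3 = np.array([5.0, 3.0, 6.0, 2.0, 4.0])

# Pairwise correlation matrix (1s on the diagonal).
R = np.corrcoef([x1, x2, x3])
print(np.round(R, 2))
```

An off-diagonal correlation close to +1 or −1 (here r between x1 and x2) flags a pair of "independent" variables that largely carry the same information, which is exactly the multicollinearity problem.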
Test your understanding of Chapter 17: Multiple Regression. Each question is accompanied by a mini-video lecture showing you how I decided which solution was the correct one.