Evaluating a multiple linear regression equation based on a given technology output
Howdy! I'm Professor Curtis of Aspire Mountain Academy here with more statistics homework help. Today, we're going to learn how to evaluate a multiple linear regression equation based on a given technology output. Here's our problem statement: Consider the correlation between heights of fathers and mothers and the heights of their sons. Refer to the accompanying technology output. Should the multiple regression equation be used for predicting the height of a son based on the height of his father and mother? Why or why not?
OK, the first thing we’re going to do is take a look at this technology output. So I’m going to click on this icon here, and out comes the technology output. It looks very similar to what you would see if you were actually making the model in StatCrunch. The advantage of this is that the model has already been made, so all we have to do is evaluate the output to see if it’s something we want to use or not.
There are two main things you want to look at. The first is the P-value. And when you’re looking at the P-value, don’t look over here at the parameter estimates table. You want to look at the P-value not of an individual parameter but of the model as a whole. And the P-value for the model as a whole is found here in the ANOVA table. So we’ve got a P-value that is practically zero. It’s hard to get a P-value better than that, so the P-value looks absolutely excellent.
The other thing you want to check is the R-squared, or more appropriately, the adjusted R-squared value. Looking down at the bottom of our output, we see that our adjusted R-squared value (0.3552) is not something I would consider all that grand. However, adjusted R-squared is most useful for comparing models against one another, and we've only got one model here. So the main thing we want to focus on is the P-value.
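For reference, adjusted R-squared is just R-squared with a penalty for the number of predictors, which is why it's the fairer statistic when comparing models of different sizes. A minimal sketch (the numbers plugged in below are illustrative assumptions, not values from the actual output):

```python
# Adjusted R-squared penalizes R-squared for each predictor added,
# so adding a useless predictor can lower it even though plain
# R-squared never decreases.
def adjusted_r_squared(r2, n, k):
    """r2 = ordinary R-squared, n = sample size, k = number of predictors."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# Illustrative values (assumed): R^2 = 0.5, n = 12 observations, k = 2 predictors
print(round(adjusted_r_squared(0.5, 12, 2), 4))  # -> 0.3889, lower than 0.5
```

The gap between R-squared and adjusted R-squared shrinks as the sample size grows relative to the number of predictors.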
So the P-value itself looked pretty good. Let's go through our answer options to see which one best matches what we saw with the technology output. Since we're definitely going to use this model, the only answer option that starts with "Yes" is answer option B. But before we select that, let's go through the other answer options to make sure we don't want to select them.
Answer option A says “No, because the P-value for the Intercept is not very low.” Well, we’re not looking at P-values for a particular model parameter. We want the P-value for the model as a whole, so that’s not going to work.
Answer option C says, "No, because the R-squared and adjusted R-squared values are not very high." And while that's true, this answer option doesn't say anything about the P-value, and the P-value is what you want to focus on, especially when you have just one model. So this isn't going to work for us.
Answer option D says, “No, because the P-value for Father is smaller than the P-value for Mother.” Again, these are P-values for individual parameters of the model, and the only one we want to look at is the one for the model as a whole. So this isn’t going to work for us. The answer option we do want is answer option B. Excellent!
And that's how we do it at Aspire Mountain Academy. Feel free to leave your comments below and let us know how good a job we did or how we can improve. And if your stats teacher is just boring or doesn't want to help you learn stats, then go to aspiremountainacademy.com, where you can learn more about accessing our lecture videos or provide feedback on what you’d like to see. Thanks for watching! We’ll see you in the next video!
Frustrated with a particular MyStatLab/MyMathLab homework problem? No worries! I'm Professor Curtis, and I'm here to help.