Least-Squares Regression
Find and interpret the least-squares regression line (LSRL) and make predictions.
The Regression Line
The least-squares regression line (LSRL) is the line $\hat{y} = a + bx$ that minimizes the sum of squared residuals, $\sum (y_i - \hat{y}_i)^2$, where:
- $\hat{y}$ = predicted value of $y$
- $b$ = slope
- $a$ = y-intercept
Computing the LSRL
Slope: $b = r \cdot \dfrac{s_y}{s_x}$

Y-intercept: $a = \bar{y} - b\bar{x}$

The LSRL always passes through the point $(\bar{x}, \bar{y})$.
Interpreting the Slope
"For each additional [unit of x], the predicted [y variable] changes by [b] [units of y]."
Example: If the LSRL is $\hat{y} = a + 0.8x$, where $x$ = study hours and $y$ = exam score: "For each additional hour of studying, the predicted exam score increases by 0.8 points."
Interpreting the Y-Intercept
"When [x variable] = 0, the predicted [y variable] is [a] [units]."
Caution: The y-intercept often doesn't have practical meaning (e.g., 0 hours of studying may not make sense in context).
Making Predictions
Substitute the $x$-value into the equation $\hat{y} = a + bx$ to get the predicted value $\hat{y}$.
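As a sketch, prediction is a single substitution. The coefficients below ($a = 58.3$, $b = 4.3$) are hypothetical, not values from the text:

```python
# Predict y-hat for a given x using a fitted LSRL.
# Coefficients are hypothetical, for illustration only.
def predict(x, a=58.3, b=4.3):
    """Return the predicted value y-hat = a + b*x."""
    return a + b * x

y_hat = predict(4)   # predicted exam score after 4 hours of studying
```

Remember that `predict` only gives trustworthy answers for $x$-values inside the range of the original data, which is the point of the extrapolation warning below.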
Extrapolation
Extrapolation = predicting for $x$-values outside the range of the observed data. This is dangerous because the linear pattern may not continue beyond the data.
Residuals
- Residual = actual $-$ predicted = $y - \hat{y}$
- Positive residual: actual > predicted (point above the line)
- Negative residual: actual < predicted (point below the line)
- $\sum (y_i - \hat{y}_i) = 0$ for the LSRL
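The bullet points above can be verified directly. This sketch uses hypothetical data and a line fitted to it; the sign of each residual tells you whether the point lies above or below the line, and the residuals sum to zero (up to rounding):

```python
# Compute residuals for a fitted LSRL and check they sum to ~0.
# Data and coefficients are hypothetical.
x = [1, 2, 3, 4, 5]
y = [62, 68, 71, 75, 80]
a, b = 58.3, 4.3   # LSRL fitted to this data

residuals = [yi - (a + b * xi) for xi, yi in zip(x, y)]
# positive residual -> point above the line, negative -> below

total = sum(residuals)   # ~0 for the least-squares line
```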
Properties of LSRL
- Minimizes $\sum (y_i - \hat{y}_i)^2$
- Passes through $(\bar{x}, \bar{y})$
- Sum of residuals = 0
- $b$ and $r$ have the same sign
- Regression toward the mean: for each standard deviation $x$ is above $\bar{x}$, the predicted $y$ is only $r$ standard deviations above $\bar{y}$, so predicted values are pulled toward the mean
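The "least squares" property itself can be sanity-checked by comparing the fitted line's sum of squared residuals against nearby alternative lines. A sketch, using the same hypothetical data and coefficients as before:

```python
# Check that the LSRL's sum of squared errors (SSE) beats nearby lines.
# Data and the fitted coefficients (58.3, 4.3) are hypothetical.
x = [1, 2, 3, 4, 5]
y = [62, 68, 71, 75, 80]

def sse(a, b):
    """Sum of squared residuals for the line y-hat = a + b*x."""
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

lsrl = sse(58.3, 4.3)   # the least-squares fit for this data
# Perturbing the intercept or slope can only increase the SSE:
for da, db in [(1, 0), (-1, 0), (0, 0.5), (0, -0.5)]:
    assert lsrl <= sse(58.3 + da, 4.3 + db)
```

No other line through this data achieves a smaller SSE, which is exactly what "least squares" means.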
AP Tip: When interpreting slope, always use "predicted" — not "will increase by." The relationship is an estimate, not a guarantee.