Simple Linear Regression - R-square
- Shu Fai Cheung
This demonstration illustrates the relation between R-square and the different sums of squares (SS).
The initial positions of the ten points are based on the sample dataset in the handout.

There are two lines. One is a horizontal line at the mean of y: a model that uses only the mean to predict y. The sum of squared residuals from this line is SSTotal. The other is the OLS line, the "best" line with the smallest possible sum of squared residuals. The sum of squared residuals from this line is SSResidual. This SS is equal to or smaller than SSTotal.

If the OLS line is better than the mean (that is, has a smaller SS), we then ask how much better. The pink downward arrow on the right represents this improvement:

SSModel = SSTotal - SSResidual

R-square is then defined as the proportion of reduction in SS: SSModel / SSTotal.

Move the data points and see how SSTotal, SSResidual, SSModel, and R-square change. You can reset the points to the sample dataset by refreshing this page. You can also use the mouse wheel to zoom in and out of the graphs.
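The relations above can be sketched in a few lines of Python. The x and y values below are illustrative placeholders, not the actual sample dataset from the handout:

```python
# Placeholder data, NOT the handout's sample dataset.
x = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
y = [2.1, 2.9, 3.2, 4.8, 5.1, 5.9, 7.2, 7.8, 9.1, 9.9]

n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n

# OLS slope and intercept
sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
sxx = sum((xi - mean_x) ** 2 for xi in x)
b1 = sxy / sxx
b0 = mean_y - b1 * mean_x

# SSTotal: sum of squared residuals from the horizontal line at mean(y)
ss_total = sum((yi - mean_y) ** 2 for yi in y)

# SSResidual: sum of squared residuals from the OLS line
ss_residual = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))

# SSModel: the reduction in SS achieved by using the OLS line
ss_model = ss_total - ss_residual

# R-square: proportion of reduction in SS
r_square = ss_model / ss_total
```

Whatever data you use, SSResidual can never exceed SSTotal, so R-square always falls between 0 and 1, exactly as the arrows in the graph show.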