# Simple Linear Regression - R-square

- Author:
- Shu Fai Cheung

- Topic:
- Linear Regression, R-square

This demonstration illustrates the relation between R-square and the different sums of squares (SS).

The initial positions of the ten points are based on the sample dataset in the handout. There are two lines.
One is a horizontal line at the mean of y. This is the model that uses only the mean to predict y. The sum of squared residuals from this line is SS_{Total}.

The other is the OLS line, the "best" line with the smallest possible sum of squared residuals. The sum of squared residuals from this line is SS_{Residual}. This SS is equal to or smaller than SS_{Total}.

If the OLS line is better than the mean (smaller in SS), then we ask *how much better*. The pink downward arrow on the right represents *how much better*: SS_{Model} = SS_{Total} - SS_{Residual}.

R-square is then defined as the *proportion of reduction* in SS: SS_{Model} / SS_{Total}.

Move the data points and see how SS_{Total}, SS_{Residual}, SS_{Model}, and R-square change. You can reset the points to the sample dataset by refreshing this page. You can also use the mouse wheel to zoom in and out of the graphs.
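The decomposition above can be sketched numerically. This is a minimal illustration, assuming a made-up 10-point dataset (the handout's actual sample data is not reproduced here), with the OLS line fitted via NumPy:

```python
import numpy as np

# Hypothetical 10-point dataset for illustration only; the demonstration's
# actual sample data may differ.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0])
y = np.array([2.1, 2.9, 3.6, 4.8, 5.1, 6.3, 6.8, 8.2, 8.9, 9.7])

# Mean-only model: predict every y by the mean of y.
# Its sum of squared residuals is SS_Total.
ss_total = np.sum((y - y.mean()) ** 2)

# OLS line: slope and intercept minimizing the sum of squared residuals.
slope, intercept = np.polyfit(x, y, 1)
y_hat = intercept + slope * x
ss_residual = np.sum((y - y_hat) ** 2)

# How much better the OLS line is than the mean-only model.
ss_model = ss_total - ss_residual

# R-square: the proportion of SS_Total that the model removes.
r_square = ss_model / ss_total
print(f"SS_Total={ss_total:.3f}, SS_Residual={ss_residual:.3f}, "
      f"SS_Model={ss_model:.3f}, R-square={r_square:.3f}")
```

Because the OLS line minimizes the sum of squared residuals, `ss_residual` can never exceed `ss_total`, so `r_square` always falls between 0 and 1.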