Least squares fitting is a mathematical procedure for finding the best-fitting curve to a given set of points by minimizing the sum of the squares of the offsets (“the residuals”) of the points from the curve. In other words, the least-squares solution minimizes the sum of the squared errors made in every single equation.

The function least.squares() in the animation package provides an illustration of least squares in univariate linear regression.

Let’s assume that the relationship between x and y is a linear one.

Let $y = a + bx$ be the equation of the best-fit line to the data. We need to determine the values of both the slope $b$ and the intercept $a$. Assuming each data point carries equal weight, so that every point has exactly the same error associated with it, we find $a$ and $b$ by minimizing the sum of the squares of the deviations of the actual values $y_i$ from the line’s calculated values $a + bx_i$.
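To make the objective concrete, here is a minimal sketch (using simulated data, not any data set from the animation package) of the residual sum of squares viewed as a function of the intercept and slope; minimizing it numerically with optim() recovers the least-squares line.

set.seed(1)
x <- 1:20
y <- 2 + 0.5 * x + rnorm(20)  # simulated data: true intercept 2, slope 0.5
## residual sum of squares for a candidate line with intercept par[1] and slope par[2]
rss <- function(par) sum((y - (par[1] + par[2] * x))^2)
## numerical minimization of the RSS gives the least-squares estimates
optim(c(0, 0), rss)$par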

The formulas for $a$ and $b$ are:

$$b = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}, \qquad a = \bar{y} - b\,\bar{x}$$
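As a quick check (again with hypothetical simulated data), the closed-form estimates can be computed directly and compared with the coefficients returned by lm():

set.seed(1)
x <- 1:20
y <- 2 + 0.5 * x + rnorm(20)
## slope and intercept from the least-squares formulas
b <- sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))^2)
a <- mean(y) - b * mean(x)
c(intercept = a, slope = b)
## the same estimates via R's built-in linear model fit
coef(lm(y ~ x))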

Here is a simple demonstration of the meaning of least squares in univariate linear regression.

With slope changing

library(animation)
ani.options(interval = 0.3)
par(mar = c(4, 4, 0.5, 0.1), mgp = c(2, 0.5, 0), tcl = -0.3)
## slope changing
least.squares()

With intercept changing

library(animation)
ani.options(interval = 0.3)
par(mar = c(4, 4, 0.5, 0.1), mgp = c(2, 0.5, 0), tcl = -0.3)
## intercept changing
least.squares(ani.type = "intercept")

