
Least Squares Approximation

The sum of the squared residuals (errors) measures how well a model fits a set of n data points; the least squares approach chooses the model that minimizes it:

S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left(y_{i,\text{measured}} - y_{i,\text{model}}\right)^2
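
As a quick illustration, S_r can be computed directly from measured values and model predictions. A minimal sketch, with made-up numbers:

```python
# Computing the sum of squared residuals S_r for hypothetical
# measured values and model predictions (made-up data).
measured = [2.1, 3.9, 6.2, 7.8]  # y_i, measured
model    = [2.0, 4.0, 6.0, 8.0]  # y_i, model

S_r = sum((y_m - y_hat) ** 2 for y_m, y_hat in zip(measured, model))
print(S_r)  # 0.01 + 0.01 + 0.04 + 0.04 ≈ 0.10
```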

Linear regression

The least squares approach applied to finding the best approximating straight line.

Let the best least squares line to a collection of data points \{(x_i, y_i)\}_{i=1}^n be:

y = a_1 x + a_0

a_1 and a_0 can be found by solving the normal equations:

a_1 \sum_{i=1}^n x_i^2 + a_0 \sum_{i=1}^n x_i = \sum_{i=1}^n x_i y_i

a_1 \sum_{i=1}^n x_i + a_0 \sum_{i=1}^n 1 = \sum_{i=1}^n y_i
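
Since this is just a 2×2 linear system, it can be solved directly. A minimal sketch in plain Python, with made-up data roughly following y = 2x:

```python
# Fitting y = a1*x + a0 by solving the two normal equations above.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 4.1, 5.9, 8.2]  # made-up data, roughly y = 2x

n   = len(xs)
Sx  = sum(xs)                              # sum of x_i
Sxx = sum(x * x for x in xs)               # sum of x_i^2
Sy  = sum(ys)                              # sum of y_i
Sxy = sum(x * y for x, y in zip(xs, ys))   # sum of x_i * y_i

# Solve the 2x2 system by Cramer's rule:
#   a1*Sxx + a0*Sx = Sxy
#   a1*Sx  + a0*n  = Sy
det = Sxx * n - Sx * Sx
a1  = (Sxy * n - Sx * Sy) / det
a0  = (Sxx * Sy - Sx * Sxy) / det

print(a1, a0)  # slope = 2.01, intercept = 0.05
```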

Polynomial regression

The least squares approach applied to finding the best approximating polynomial.

Let the best least squares polynomial to a collection of data points \{(x_i, y_i)\}_{i=1}^m be:

P_n(x) = \sum_{i=0}^{n} a_i x^i

Here n < m - 1, so that there are more data points than unknown coefficients.

The constants a_i can be found by solving the normal equations below, obtained by minimizing S_r with respect to each coefficient:

\sum_{k=0}^n \left( a_k \sum_{i=1}^m x_i^{j+k} \right) = \sum_{i=1}^m y_i x_i^{j} \quad \text{where } j = 0, 1, \dots, n
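
This is an (n+1)×(n+1) linear system in a_0, ..., a_n. A minimal sketch of building and solving it with numpy, for a quadratic (n = 2) fit to m = 5 made-up data points:

```python
# Building and solving the polynomial normal equations above.
import numpy as np

xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([1.1, 2.9, 9.2, 19.1, 32.8])  # made-up data, roughly y = 2x^2 + 1

n = 2  # polynomial degree; here n < m - 1 holds (2 < 4)

# A[j][k] = sum_i x_i^(j+k),  b[j] = sum_i y_i * x_i^j
A = np.array([[np.sum(xs ** (j + k)) for k in range(n + 1)]
              for j in range(n + 1)])
b = np.array([np.sum(ys * xs ** j) for j in range(n + 1)])

a = np.linalg.solve(A, b)  # coefficients a_0, a_1, ..., a_n
print(a)
```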