Scientific Computing
General Least Squares
Polynomial Least Squares
• Polynomial Least Squares: We assume that the class of functions is the class of all polynomials of degree less than or equal to n:

      F = { f(x) = c0 + c1 x + c2 x^2 + ... + cn x^n }

• We want to minimize

      Σ_{i=0}^{m} [ (c0 + c1 xi + c2 xi^2 + ... + cn xi^n) − yi ]^2

  where (x0, y0), ..., (xm, ym) are the data points.
Matrix Formulation of Polynomial Least Squares
• To minimize

      Σ_{i=0}^{m} [ (c0 + c1 xi + c2 xi^2 + ... + cn xi^n) − yi ]^2

• Let

      A = [ 1  x0  x0^2  ...  x0^n
            1  x1  x1^2  ...  x1^n
            ...................
            1  xm  xm^2  ...  xm^n ],    c = [ c0  c1  c2  ...  cn ]^t

• Then, we want to find the vector c that minimizes the length squared of the error vector Ac − y (or y − Ac). That is, minimize

      ||Ac − y||^2 = Σ_{i=0}^{m} [ (c0 + c1 xi + c2 xi^2 + ... + cn xi^n) − yi ]^2
Matrix Formulation of Polynomial Least Squares
• By a calculation similar to the one we did for Linear Least Squares, we get the Normal Equation:

      At Ac = At y

  The solution c = [c0 c1 ... cn]^t gives the constants for the best polynomial fit f* to the data:

      f*(x) = c0 + c1 x + c2 x^2 + ... + cn x^n
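The normal-equation recipe can be sketched in Python with NumPy (a translation of the Matlab workflow these slides use; the function name polyfit_normal is our own):

```python
import numpy as np

def polyfit_normal(x, y, n):
    """Best-fit polynomial of degree n via the normal equation At A c = At y."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Vandermonde matrix: row i is [1, xi, xi^2, ..., xi^n]
    A = np.vander(x, n + 1, increasing=True)
    # Solve At A c = At y (fine for small, well-conditioned problems)
    return np.linalg.solve(A.T @ A, A.T @ y)

# Data sampled exactly from 1 - 2x + 3x^2 should be recovered (up to rounding)
x = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
y = 1 - 2 * x + 3 * x**2
print(polyfit_normal(x, y, 2))   # close to [1, -2, 3]
```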
Matrix Formulation of Polynomial Least Squares
• Definition: The matrix

      A = [ 1  x0  x0^2  ...  x0^n
            1  x1  x1^2  ...  x1^n
            ...................
            1  xm  xm^2  ...  xm^n ]

  is called a Vandermonde Matrix. A Vandermonde matrix is a matrix whose columns (or rows) are successive powers of an independent variable.
Polynomial Least Squares Example
• Problem: Find the best quadratic polynomial fit to the data

      x | -1.0  -0.5  0.0  0.5  1.0
      y |  1.0   0.5  0.0  0.5  2.0

• Normal Equation: At Ac = At y, where

      A = [ 1  x0  x0^2
            1  x1  x1^2
            1  x2  x2^2
            1  x3  x3^2
            1  x4  x4^2 ],    c = [ c0  c1  c2 ]^t
Polynomial Least Squares Example
      x | -1.0  -0.5  0.0  0.5  1.0
      y |  1.0   0.5  0.0  0.5  2.0

      A = [ 1  -1.0  1.0
            1  -0.5  0.25
            1   0.0  0.0
            1   0.5  0.25
            1   1.0  1.0 ]
Polynomial Least Squares Example
      At A = [ 5.0  0.0  2.5
               0.0  2.5  0.0
               2.5  0.0  2.125 ]

      At y = [ 4.0  1.0  3.25 ]^t
Polynomial Least Squares Example
• So, At Ac = At y becomes

      [ 5.0  0.0  2.5
        0.0  2.5  0.0
        2.5  0.0  2.125 ] c = [ 4.0  1.0  3.25 ]^t

• We could use Gaussian Elimination or use Matlab:

      D = [5 0 2.5; 0 2.5 0; 2.5 0 2.125];
      y = [4 1 3.25]';
      c = D \ y

      c = 0.0857
          0.4000
          1.4286
Polynomial Least Squares Example
• So, the best quadratic (degree-2) fit to the data is the polynomial 0.0857 + 0.4 x + 1.4286 x^2.
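As a quick check, the 3x3 system from this example can be solved in Python with NumPy, mirroring the Matlab `D \ y` step:

```python
import numpy as np

# Normal-equation system At A c = At y from the worked example
D = np.array([[5.0, 0.0, 2.5],
              [0.0, 2.5, 0.0],
              [2.5, 0.0, 2.125]])
rhs = np.array([4.0, 1.0, 3.25])

c = np.linalg.solve(D, rhs)
print(c)   # approximately [0.0857, 0.4, 1.4286]
```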
Class Project
• Write a Matlab function that will take a vector of x-values and a vector of y-values and will return the vector of coefficients for the best quadratic fit to the data.
Class Project 2
• Exercise 9.15 in Pav: Write a Matlab function that will find the coefficients (a, b) for the function a e^x + b e^-x that best approximates a set of data {xi, yi}. Your function should have as input the x and y vectors and should output the vector c = (a, b). Use the Normal Equation to solve this problem.
• Test your method on the data x = (-1, -0.5, 0, 0.5, 1), y = (1.194, 0.43, 0.103, 0.322, 1.034).
• Graph the data and your best-fit function.
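One way to approach this exercise (a sketch, not the official solution; the function name expfit is our own) is to build a two-column design matrix whose columns are e^x and e^-x and apply the same normal-equation machinery:

```python
import numpy as np

def expfit(x, y):
    """Least-squares coefficients (a, b) for a*e^x + b*e^(-x) via the normal equation."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Design matrix: one column per basis function
    A = np.column_stack([np.exp(x), np.exp(-x)])
    return np.linalg.solve(A.T @ A, A.T @ y)

x = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
y = np.array([1.194, 0.43, 0.103, 0.322, 1.034])
a, b = expfit(x, y)
print(a, b)
```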
QR and Least Squares
• Given the normal equation At Ac = At y, it is often the case that solving it directly is numerically unstable.
• For example, consider the matrix

      A = [ 1    1
            eps  0
            0    eps ]

  Then the matrix

      At A = [ 1 + eps^2   1
               1           1 + eps^2 ]

  becomes (numerically) singular if eps is less than the square root of the machine epsilon.
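This loss of information is easy to demonstrate in Python: with eps = 1e-9, which is below the square root of double-precision machine epsilon (about 1.49e-8), the term 1 + eps^2 rounds to exactly 1.0 and At A collapses to rank 1:

```python
import numpy as np

eps = 1e-9   # smaller than sqrt(machine epsilon) ~ 1.49e-8 for float64
A = np.array([[1.0, 1.0],
              [eps, 0.0],
              [0.0, eps]])

AtA = A.T @ A
# 1 + eps**2 = 1 + 1e-18 rounds to exactly 1.0 in double precision,
# so AtA becomes [[1, 1], [1, 1]], which is singular.
print(AtA)
print(np.linalg.matrix_rank(AtA))   # 1
```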
QR and Least Squares
• To resolve this problem, assume we have carried out a QR factorization of A (this can be done for any matrix, square or rectangular).
• Then, At Ac = At y becomes

      (QR)t (QR) c = (QR)t y
      Rt Qt Q R c = Rt Qt y
      Rt R c = Rt Qt y

  Note: A = QR = Q [ Rn
                     0  ]
  where Rn is an nxn matrix.

Claim: To solve the least squares problem, it is enough to find the solution c to

      Rn c = Qt y

(where Qt y denotes the first n entries of the full vector Qt y).
QR and Least Squares
Proof: The least squares solution is the c that minimizes ||Ac − y||^2. For A = QR = Q [ Rn ; 0 ] we have, since multiplying by the orthogonal matrix Qt preserves lengths,

      ||Ac − y||^2 = ||QRc − y||^2
                   = ||Rc − Qt y||^2
                   = ||Rn c − (Qt y)_(1..n)||^2 + ||(Qt y)_(n+1..m)||^2

The second term does not depend on c, so the minimum is attained exactly when Rn c = (Qt y)_(1..n). QED

Note: Once Rn and Q are determined, it is relatively straightforward to solve Rn c = Qt y for c, since Rn is upper triangular (back-substitution).
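The QR route can be sketched with NumPy's reduced QR factorization (np.linalg.qr returns Q with orthonormal columns and a square upper-triangular R, so R plays the role of Rn above). Applying it to the quadratic example from earlier recovers the same coefficients:

```python
import numpy as np

x = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
y = np.array([1.0, 0.5, 0.0, 0.5, 2.0])

A = np.vander(x, 3, increasing=True)   # columns 1, x, x^2

# Reduced QR: Q is 5x3 with orthonormal columns, R is 3x3 upper triangular
Q, R = np.linalg.qr(A)

# Solve R c = Qt y (back-substitution; np.linalg.solve suffices here)
c = np.linalg.solve(R, Q.T @ y)
print(c)   # approximately [0.0857, 0.4, 1.4286]
```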