The Islamic University of Gaza, Faculty of Engineering
Civil Engineering Department
Numerical Analysis ECIV 3306
Chapter 17
Least Square Regression
CURVE FITTING: Part 5
Describes techniques to fit curves (curve fitting) to discrete data to obtain intermediate estimates.
There are two general approaches for curve fitting:
• Least-squares regression: the data exhibit a significant degree of scatter. The strategy is to derive a single curve that represents the general trend of the data.
• Interpolation: the data are very precise. The strategy is to pass a curve or a series of curves through each of the points.
Introduction
In engineering, two types of applications are encountered:
– Trend analysis. Predicting values of the dependent variable; may include extrapolation beyond data points or interpolation between data points.
– Hypothesis testing. Comparing existing mathematical model with measured data.
Mathematical Background
• Arithmetic mean. The sum of the individual data points (yi) divided by the number of points (n).
• Standard deviation. The most common measure of a spread for a sample.
$\bar{y} = \frac{\sum y_i}{n}, \quad i = 1, \ldots, n$

$s_y = \sqrt{\frac{S_t}{n-1}}, \quad S_t = \sum (y_i - \bar{y})^2$
Mathematical Background (cont’d)
• Variance. Representation of spread by the square of the standard deviation:

$s_y^2 = \frac{\sum (y_i - \bar{y})^2}{n-1}$  or  $s_y^2 = \frac{\sum y_i^2 - \left(\sum y_i\right)^2 / n}{n-1}$

• Coefficient of variation. Normalizes the standard deviation by the mean, quantifying the relative spread of the data:

$\text{c.v.} = \frac{s_y}{\bar{y}} \times 100\%$
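These background quantities translate directly into a short sketch (the data and variable names below are illustrative, not part of the slides):

```python
# Sketch of the background statistics for a small sample.
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]   # illustrative data
n = len(y)

y_bar = sum(y) / n                         # arithmetic mean
S_t = sum((yi - y_bar) ** 2 for yi in y)   # total sum of squares about the mean
s_y = (S_t / (n - 1)) ** 0.5               # standard deviation
variance = s_y ** 2                        # variance
cv = s_y / y_bar * 100                     # coefficient of variation, in percent

print(y_bar, s_y, cv)
```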
Least Squares Regression (Chapter 17)
Linear Regression
Fitting a straight line to a set of paired observations: $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$.

$y = a_0 + a_1 x + e$

$a_1$ - slope
$a_0$ - intercept
$e$ - error, or residual, between the model and the observations
Linear Regression: Residual
Linear Regression: Question
How to find a0 and a1 so that the error would be minimum?
Linear Regression: Criteria for a “Best” Fit

One criterion would be to minimize the sum of the residual errors:

$\min \sum_{i=1}^{n} e_i = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)$

This is inadequate: errors of opposite sign cancel (for example, $e_1 = -e_2$ sums to zero even for a poor line).
Linear Regression: Criteria for a “Best” Fit

Another criterion would be to minimize the sum of the absolute values of the residuals:

$\min \sum_{i=1}^{n} |e_i| = \sum_{i=1}^{n} |y_i - a_0 - a_1 x_i|$
Linear Regression: Criteria for a “Best” Fit

A third criterion is the minimax strategy: minimize the maximum residual:

$\min \left( \max_{1 \le i \le n} |y_i - a_0 - a_1 x_i| \right)$
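A tiny numeric check (with made-up points, not from the slides) shows why the first criterion fails: residuals of opposite sign cancel, so a poor line can score as well as a perfect one.

```python
# Made-up data lying exactly on the line y = x.
xs = [1.0, 2.0, 3.0]
ys = [1.0, 2.0, 3.0]

def residual_sum(a0, a1):
    """Sum of raw residuals e_i = y_i - (a0 + a1*x_i)."""
    return sum(y - (a0 + a1 * x) for x, y in zip(xs, ys))

perfect = residual_sum(0.0, 1.0)   # the true line y = x
bad = residual_sum(2.0, 0.0)       # the horizontal line y = 2
print(perfect, bad)                # both sums are 0.0
```

The criterion cannot distinguish the two lines, which motivates squaring the residuals instead.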
Linear Regression: Least Squares Fit
$\min S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2$

$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_{i,\text{measured}} - y_{i,\text{model}})^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2$
Yields a unique line for a given set of data.
Linear Regression: Least Squares Fit
$\min S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2$
The coefficients a0 and a1 that minimize Sr must satisfy the following conditions:
$\frac{\partial S_r}{\partial a_0} = 0 \qquad \frac{\partial S_r}{\partial a_1} = 0$

Carrying out the differentiation:

$\frac{\partial S_r}{\partial a_0} = -2 \sum (y_i - a_0 - a_1 x_i) = 0$

$\frac{\partial S_r}{\partial a_1} = -2 \sum (y_i - a_0 - a_1 x_i)\, x_i = 0$

which simplify to:

$\sum y_i = n a_0 + a_1 \sum x_i$

$\sum x_i y_i = a_0 \sum x_i + a_1 \sum x_i^2$
Linear Regression: Determination of a0 and a1
$n a_0 + a_1 \sum x_i = \sum y_i$

$a_0 \sum x_i + a_1 \sum x_i^2 = \sum x_i y_i$

Two equations with two unknowns; they can be solved simultaneously.
Linear Regression: Determination of a0 and a1
$a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left(\sum x_i\right)^2}$

$a_0 = \bar{y} - a_1 \bar{x}$
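These closed-form expressions translate directly into code (a minimal sketch; the function name is mine):

```python
def linear_fit(xs, ys):
    """Least-squares intercept a0 and slope a1 for paired observations."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    a1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    a0 = sy / n - a1 * sx / n   # a0 = y_bar - a1 * x_bar
    return a0, a1
```

For example, `linear_fit([0, 1, 2], [1, 3, 5])` recovers the exact line y = 1 + 2x.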
Error Quantification of Linear Regression
• Total sum of the squares around the mean for the dependent variable, y, is St
• Sum of the squares of residuals around the regression line is Sr
$S_t = \sum_{i=1}^{n} (y_i - \bar{y})^2$

$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2$
Error Quantification of Linear Regression
• St-Sr quantifies the improvement or error reduction due to describing data in terms of a straight line rather than as an average value.
$r^2 = \frac{S_t - S_r}{S_t}$
r2: coefficient of determination
r : correlation coefficient
Error Quantification of Linear Regression
• For a perfect fit: Sr = 0 and r = r2 = 1, signifying that the line explains 100 percent of the variability of the data.
• For r = r2 = 0, Sr = St and the fit represents no improvement.
Least Squares Fit of a Straight Line: Example
Fit a straight line to the x and y values in the following Table:
xi    yi     xi·yi    xi²
1     0.5     0.5      1
2     2.5     5.0      4
3     2.0     6.0      9
4     4.0    16.0     16
5     3.5    17.5     25
6     6.0    36.0     36
7     5.5    38.5     49
Σ:   28    24.0   119.5   140

$\sum x_i = 28, \quad \sum y_i = 24.0, \quad \sum x_i y_i = 119.5, \quad \sum x_i^2 = 140$

$\bar{x} = \frac{28}{7} = 4 \qquad \bar{y} = \frac{24}{7} = 3.428571$
Least Squares Fit of a Straight Line: Example (cont’d)
$a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left(\sum x_i\right)^2} = \frac{7(119.5) - 28(24)}{7(140) - (28)^2} = 0.8392857$

$a_0 = \bar{y} - a_1 \bar{x} = 3.428571 - 0.8392857(4) = 0.07142857$

y = 0.07142857 + 0.8392857 x
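The worked example can be reproduced in a few lines (a sketch re-deriving the tabulated sums):

```python
# Data from the example table.
xs = [1, 2, 3, 4, 5, 6, 7]
ys = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]
n = len(xs)

sx, sy = sum(xs), sum(ys)                       # 28, 24.0
sxy = sum(x * y for x, y in zip(xs, ys))        # 119.5
sxx = sum(x * x for x in xs)                    # 140

a1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)  # slope
a0 = sy / n - a1 * sx / n                       # intercept
print(a0, a1)
```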
Least Squares Fit of a Straight Line: Example (Error Analysis)
xi    yi     (yi − ȳ)²   (yi − a0 − a1xi)²
1     0.5     8.5765       0.1687
2     2.5     0.8622       0.5625
3     2.0     2.0408       0.3473
4     4.0     0.3265       0.3265
5     3.5     0.0051       0.5896
6     6.0     6.6122       0.7972
7     5.5     4.2908       0.1993
Σ:   28    24.0    22.7143    2.9911

$S_t = \sum (y_i - \bar{y})^2 = 22.7143 \qquad S_r = \sum e_i^2 = 2.9911$

$r^2 = \frac{S_t - S_r}{S_t} = \frac{22.7143 - 2.9911}{22.7143} = 0.868 \qquad r = \sqrt{0.868} = 0.932$
Least Squares Fit of a Straight Line: Example (Error Analysis)
• The standard deviation (quantifies the spread around the mean):

$s_y = \sqrt{\frac{S_t}{n-1}} = \sqrt{\frac{22.7143}{7-1}} = 1.9457$

• The standard error of estimate (quantifies the spread around the regression line):

$s_{y/x} = \sqrt{\frac{S_r}{n-2}} = \sqrt{\frac{2.9911}{7-2}} = 0.7735$

Because $s_{y/x} < s_y$, the linear regression model has a good fit.
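Both spread measures for the example can be checked quickly (a sketch using the S_t and S_r values computed above):

```python
# Spread around the mean vs. spread around the regression line,
# using S_t and S_r from the worked example.
S_t, S_r, n = 22.7143, 2.9911, 7

s_y = (S_t / (n - 1)) ** 0.5    # standard deviation
s_yx = (S_r / (n - 2)) ** 0.5   # standard error of estimate
print(s_y, s_yx, s_yx < s_y)    # s_yx < s_y: the line improves on the mean
```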
Algorithm for linear regression
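The algorithm can be sketched as a single function (naming and return signature are my own); it combines the normal-equation solution with the error quantification from the previous slides:

```python
def linregr(xs, ys):
    """Fit y = a0 + a1*x by least squares; return (a0, a1, r2)."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)

    a1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)   # slope
    a0 = sy / n - a1 * sx / n                        # intercept

    y_bar = sy / n
    S_t = sum((y - y_bar) ** 2 for y in ys)                    # spread about the mean
    S_r = sum((y - a0 - a1 * x) ** 2 for x, y in zip(xs, ys))  # spread about the line
    r2 = (S_t - S_r) / S_t                                     # coefficient of determination
    return a0, a1, r2
```

On the worked example above it reproduces a0 ≈ 0.0714, a1 ≈ 0.8393, and r² ≈ 0.868.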
Linearization of Nonlinear Relationships
• Linear regression assumes that the relationship between the dependent and independent variables is linear; this is not always the case.
• However, a few types of nonlinear functions can be transformed into linear regression problems:
– The exponential equation
– The power equation
– The saturation-growth-rate equation
Linearization of Nonlinear Relationships
1. The exponential equation:

$y = a_1 e^{b_1 x}$

$\ln y = \ln a_1 + b_1 x$

y* = a0 + a1 x, with y* = ln y, a0 = ln a1, a1 = b1
Linearization of Nonlinear Relationships
2. The power equation:

$y = a_2 x^{b_2}$

$\log y = \log a_2 + b_2 \log x$

y* = a0 + a1 x*, with y* = log y, x* = log x, a0 = log a2, a1 = b2
Linearization of Nonlinear Relationships
3. The saturation-growth-rate equation:

$y = a_3 \frac{x}{b_3 + x}$

$\frac{1}{y} = \frac{1}{a_3} + \frac{b_3}{a_3} \cdot \frac{1}{x}$

y* = 1/y, x* = 1/x, a0 = 1/a3, a1 = b3/a3
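All three transformations reduce to the same straight-line fit. A sketch with illustrative, noise-free parameters (the helper `fit_line` and all parameter values are mine, chosen so each model is recovered exactly):

```python
import math

def fit_line(xs, ys):
    """Least-squares (intercept, slope) for paired data."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    a1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    return sy / n - a1 * sx / n, a1

xs = [1.0, 2.0, 3.0, 4.0, 5.0]

# 1. Exponential y = a1*exp(b1*x): fit ln(y) against x.
ys = [2.0 * math.exp(0.5 * x) for x in xs]
c0, c1 = fit_line(xs, [math.log(y) for y in ys])
a_exp, b_exp = math.exp(c0), c1          # recovers 2.0 and 0.5

# 2. Power y = a2*x**b2: fit log(y) against log(x).
ys = [3.0 * x ** 1.5 for x in xs]
c0, c1 = fit_line([math.log10(x) for x in xs], [math.log10(y) for y in ys])
a_pow, b_pow = 10 ** c0, c1              # recovers 3.0 and 1.5

# 3. Saturation growth y = a3*x/(b3 + x): fit 1/y against 1/x.
ys = [4.0 * x / (2.0 + x) for x in xs]
c0, c1 = fit_line([1 / x for x in xs], [1 / y for y in ys])
a_sat = 1 / c0                           # recovers 4.0
b_sat = c1 * a_sat                       # recovers 2.0
```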
Example
Fit the power equation

$y = a_2 x^{b_2}$

to the data in the following table:

xi    yi
1     0.5
2     1.7
3     3.4
4     5.7
5     8.4
Σ:   15    19.7
$\log y = \log a_2 + b_2 \log x$

Let $Y^* = \log y$, $X^* = \log x$, $a_0 = \log a_2$, and $a_1 = b_2$; then

$Y^* = a_0 + a_1 X^*$
Example
Xi Yi X*i=Log(X) Y*i=Log(Y) X*Y* X*^2
1 0.5 0.0000 -0.3010 0.0000 0.0000
2 1.7 0.3010 0.2304 0.0694 0.0906
3 3.4 0.4771 0.5315 0.2536 0.2276
4 5.7 0.6021 0.7559 0.4551 0.3625
5 8.4 0.6990 0.9243 0.6460 0.4886
Sum 15 19.700 2.079 2.141 1.424 1.169
$a_1 = \frac{n \sum X_i^* Y_i^* - \sum X_i^* \sum Y_i^*}{n \sum X_i^{*2} - \left(\sum X_i^*\right)^2} = \frac{5(1.424) - (2.079)(2.141)}{5(1.169) - (2.079)^2} = 1.75$

$a_0 = \bar{Y}^* - a_1 \bar{X}^* = 0.4282 - 1.75(0.41584) = -0.300$
Linearization of Nonlinear Functions: Example
$\log y = -0.300 + 1.75 \log x$

$y = 0.50\, x^{1.75}$
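The log-log fit can be reproduced directly (a sketch using base-10 logarithms, as in the table):

```python
import math

# Data from the example table.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0.5, 1.7, 3.4, 5.7, 8.4]

# Transform to log-log space and fit a straight line.
X = [math.log10(x) for x in xs]
Y = [math.log10(y) for y in ys]
n = len(X)

sx, sy = sum(X), sum(Y)
sxy = sum(a * b for a, b in zip(X, Y))
sxx = sum(a * a for a in X)

a1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)   # slope b2
a0 = sy / n - a1 * sx / n                        # intercept log10(a2)
a2 = 10 ** a0                                    # back-transformed coefficient
print(a1, a0, a2)
```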