PHYSICS 2CL – SPRING 2009 Physics Laboratory: Electricity and Magnetism, Waves and Optics


Prof. Leonid Butov (for Prof. Oleg Shpyrko)
oshpyrko@physics.ucsd.edu
Mayer Hall Addition (MHA) 3681, ext. 4-3066
Office Hours: Mondays, 3PM-4PM
Lecture: Mondays, 2:00 p.m. – 2:50 p.m., York Hall 2722
Course materials via webct.ucsd.edu (including these lecture slides, manual, schedules, etc.)

Today’s Plan:

Chi-squared test, least-squares fitting

Next week: Review Lecture (Prof. Shpyrko is back)

Long-term course schedule

Schedule available on WebCT

Week  Date     Lecture Topic                                  Experiment
 1    Mar. 30  Introduction                                   NO LABS
 2    Apr. 6   Error propagation; Oscilloscope; RC circuits   0
 3    Apr. 13  Normal distribution; RLC circuits              1
 4    Apr. 20  Statistical analysis, t-values                 2
 5    Apr. 27  Resonant circuits                              3
 6    May 4    Review of Expts. 4, 5, 6 and 7                 4, 5, 6 or 7
 7    May 11   Least squares fitting, χ² test                 4, 5, 6 or 7
 8    May 18   Review Lecture                                 4, 5, 6 or 7
 9    May 25   No Lecture (UCSD Holiday: Memorial Day)        NO LABS, Formal Reports Due
10    June 1   Final Exam                                     NO LABS

Labs Done This Quarter

0. Using lab hardware & software
1. Analog Electronic Circuits (resistors/capacitors)
2. Oscillations and Resonant Circuits (1/2)
3. Resonant Circuits (2/2)
4. Refraction & Interference with Microwaves
5. Magnetic Fields
6. LASER Diffraction and Interference
7. Lenses and the Human Eye

This week’s lab(s), 3 out of 4

LEAST SQUARES FITTING (Ch. 8)

Purpose:
1) Agreement with theory?
2) Parameters

[Figure: data points y = f(x) with fitted line y(x) = Bx]

LINEAR FIT: y(x) = A + Bx

A – intercept with y axis
B – slope, where B = tan θ (θ is the angle the line makes with the x axis)

[Figure: data points (x1, y1) … (x6, y6) with fitted line; intercept A and slope angle θ marked]

LINEAR FIT: y(x) = A + Bx

[Figure: the six data points (x1, y1) … (x6, y6) with two candidate lines, y = −2 + 2x and y = 9 + 0.8x]

LINEAR FIT: y(x) = A + Bx — Assumptions:

1) δxj << δyj; treat δxj ≈ 0
2) the yj are normally distributed
3) σj is the same for all yj

[Figure: the data points (x1, y1) … (x6, y6) with the candidate lines y = −2 + 2x and y = 9 + 0.8x]

LINEAR FIT: y(x) = A + Bx

[Figure: data points with fit curve yfit(x); residuals y3 − yfit3 and y4 − yfit4 marked]

The sum Σj [yj − yfitj]² measures the quality of the fit.

Method of linear regression, aka the least-squares fit…

LINEAR FIT: y(x) = A + Bx

[Figure: residuals y3 − (A + Bx3) and y4 − (A + Bx4) measured from the true-value line]

Minimize Σj [yj − (A + Bxj)]².

Method of linear regression, aka the least-squares fit…
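As a minimal sketch (plain Python, no fitting library), the line fit above can be written out directly: setting the derivatives of Σj [yj − (A + Bxj)]² with respect to A and B to zero gives the standard closed-form solution.

```python
# Closed-form unweighted least-squares line fit, minimizing
# sum_j [y_j - (A + B*x_j)]**2 over the intercept A and slope B.

def linear_fit(xs, ys):
    """Return (A, B) minimizing the sum of squared residuals y - (A + B*x)."""
    n = len(xs)
    sx = sum(xs)
    sy = sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    delta = n * sxx - sx * sx          # common denominator
    A = (sxx * sy - sx * sxy) / delta  # intercept
    B = (n * sxy - sx * sy) / delta    # slope
    return A, B

# Example: points lying exactly on y = -2 + 2x (one of the candidate lines
# in the slides) are recovered exactly.
xs = [1, 2, 3, 4, 5, 6]
ys = [-2 + 2 * x for x in xs]
A, B = linear_fit(xs, ys)
print(A, B)  # → -2.0 2.0
```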

What about error bars? Not all data points are created equal!

[Figure: data points with error bars of different sizes]

Weight-adjusted average

Reminder: typically the average value of x is given as:

  x̄ = (x1 + x2 + … + xN) / N

Sometimes we want to weigh data points with some “weight factors” w1, w2, etc.:

  x̄ = (w1·x1 + w2·x2 + … + wN·xN) / (w1 + w2 + … + wN) = Σ wi·xi / Σ wi

You already KNOW this – e.g. your grade:

  GRADE = [20%·FINAL + 20%·Formal Report + 12%·(LAB1 + … + LAB5)] / 100%

(Weights: 20 for the Final Exam, 20 for the Formal Report, and 12 for each of 5 labs – lowest lab score gets dropped.)
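The grade formula above is just the weighted average with fixed weights. A small sketch (the scores below are made-up numbers, purely for illustration):

```python
# Weighted average: sum(w_i * x_i) / sum(w_i).
# Weights follow the course scheme: 20 (final), 20 (formal report),
# 12 for each of the 5 counted labs.  Scores are hypothetical.

def weighted_average(values, weights):
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

scores  = [85, 90, 70, 80, 75, 95, 88]   # final, formal report, 5 labs (after drop)
weights = [20, 20, 12, 12, 12, 12, 12]   # sums to 100
grade = weighted_average(scores, weights)
print(grade)  # → 83.96
```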

More precise data points should carry more weight!
Idea: weigh the points with ~ the inverse of their error bar.

[Figure: data points with different error bars]

How do we average values with different uncertainties? Weight-adjusted average:

Student A measured resistance 100 ± 1 Ω (x1 = 100 Ω, σ1 = 1 Ω)
Student B measured resistance 105 ± 5 Ω (x2 = 105 Ω, σ2 = 5 Ω)

  x̄ = (w1·x1 + w2·x2 + … + wN·xN) / (w1 + w2 + … + wN) = Σ wi·xi / Σ wi

with “statistical” weights wi = 1/σi². Or in this case calculate for i = 1, 2:

  w1 = 1/σ1²,  w2 = 1/σ2²

BOTTOM LINE: More precise measurements get weighed more heavily!
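Applying the statistical weights wi = 1/σi² to the two resistance measurements above (the uncertainty of the weighted mean, 1/√Σwi, is the standard companion formula):

```python
# Weighted average of 100±1 ohm and 105±5 ohm with w_i = 1/sigma_i**2.

x = [100.0, 105.0]     # measured resistances, ohms
sigma = [1.0, 5.0]     # their uncertainties, ohms
w = [1.0 / s**2 for s in sigma]              # w1 = 1, w2 = 0.04
xbar = sum(wi * xi for wi, xi in zip(w, x)) / sum(w)
sigma_bar = (1.0 / sum(w)) ** 0.5            # uncertainty of the weighted mean
print(round(xbar, 2), round(sigma_bar, 2))   # → 100.19 0.98
```

The precise measurement (±1 Ω) dominates: the result sits much closer to 100 Ω than to 105 Ω.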

How good is the agreement between theory and data?

χ² TEST FOR FIT (Ch. 12)

  χ² = Σ(j=1..N) [ (yj − f(xj)) / σj ]²

[Figure: data points with error bars compared to the theory curve y(x)]

χ² TEST FOR FIT (Ch. 12)

Reduced chi-squared:

  χ̃² = χ² / d,  where χ² = Σ(j=1..N) [ (yj − f(xj)) / σj ]²

  d = N − c: # of degrees of freedom
  N = # of data points
  c = # of constraints (# of parameters calculated from the data)

For a good fit, expect χ̃² ≈ 1.
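A sketch of the test with made-up data: compare measurements with known error bars against a model and form the reduced χ². Here the line y = 2x is taken as fixed theory (no parameters fitted from the data), so c = 0 and d = N.

```python
# chi2 = sum over points of ((y_j - f(x_j)) / sigma_j)**2; data are synthetic.

def chi2(xs, ys, sigmas, f):
    return sum(((y - f(x)) / s) ** 2 for x, y, s in zip(xs, ys, sigmas))

xs     = [1.0, 2.0, 3.0, 4.0, 5.0]
ys     = [2.1, 3.9, 6.2, 7.8, 10.1]     # made-up measurements
sigmas = [0.1] * 5                      # same error bar on every point
f = lambda x: 2.0 * x                   # model fixed by theory, c = 0

c2 = chi2(xs, ys, sigmas, f)
d = len(xs)                             # degrees of freedom: N - c = 5
chi2_reduced = c2 / d
print(round(c2, 3), round(chi2_reduced, 3))  # → 11.0 2.2
```

Here χ̃² ≈ 2.2, noticeably above 1, so these (synthetic) data would sit in some tension with the model.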

LEAST SQUARES FITTING

[Figure: data points with the fit line; residuals y3 − (A + Bx3) and y4 − (A + Bx4) measured from the true value]

1. Write χ² = Σ(j=1..N) [ (yj − f(xj)) / σj ]²
2. Minimize: ∂χ²/∂A = 0, ∂χ²/∂B = 0
3. Solve: A in terms of xj, yj; B in terms of xj, yj, …
4. Calculate χ0²
5. Calculate χ̃0² = χ0² / d
6. Determine the probability to obtain χ̃² ≥ χ̃0²

The same procedure applies to data (xj, yj) and any fit function y = f(x), e.g.

  y(x) = A + Bx + Cx² + exp(−Dx) + ln(Ex) + …

Usually a computer program (for example Origin) can minimize χ² as a function of the fitting parameters (a multi-dimensional landscape) by the method of steepest descent.

Think about rolling a bowling ball in some energy landscape until it settles at the lowest point.

[Figure: χ² over the fitting-parameter space; the best fit is the lowest χ²]

Sometimes the fit gets “stuck” in a local minimum.

Solution? Give it a “kick” by resetting one of the fitting parameters and trying again.

Example: fitting data points to y = A·cos(Bx)

[Figures: a “perfect” fit vs. a fit “stuck” in a local minimum of the χ² landscape]
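One simple way to implement the “kick”/restart idea for y = A·cos(Bx) is to restart from every point of a coarse grid in B and keep the best result; for each trial B, the optimal A follows from linear least squares. A sketch on synthetic, noise-free data (generated with A = 2, B = 1.5):

```python
import math

# chi2 (here a plain sum of squares, equal sigmas) has many local minima
# in B for y = A*cos(B*x); a coarse scan over B avoids getting stuck.

A_true, B_true = 2.0, 1.5
xs = [0.1 * i for i in range(60)]
ys = [A_true * math.cos(B_true * x) for x in xs]

def sum_sq(A, B):
    return sum((y - A * math.cos(B * x)) ** 2 for x, y in zip(xs, ys))

def best_A(B):
    # For fixed B the model is linear in A: A = sum(y*c) / sum(c*c).
    c = [math.cos(B * x) for x in xs]
    return sum(y * ci for y, ci in zip(ys, c)) / sum(ci * ci for ci in c)

# Restart at each grid value of B; keep the (A, B) with the smallest misfit.
A_fit, B_fit = min(((best_A(B), B) for B in [0.05 * k for k in range(1, 100)]),
                   key=lambda ab: sum_sq(*ab))
```

A single descent started from, say, B = 4 would settle in the wrong valley; the grid of restarts lands on the global minimum near B = 1.5.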

Next on PHYS 2CL:

Monday, May 18,  Review Lecture

Recommended