o Lecture (2 hours)/week

Saturday, g1: (period 1)

g2: (period 2)

o Lab./Sec. (2 hours)/week

Saturday, g1: (period 4)

Wednesday, g2: (period 3)


This course introduces the principles of instrumentation and measurements.

It explores the working principles of DC & AC meters, oscilloscopes and signal generators, as well as the operation and application of various sensors and transducers.

o Introduce the fundamentals of measurements and instrumentation

o Explain the working principle of DC & AC meters and measurements

o Discuss the operation of oscilloscope and signal generator

o Describe the working principle of various sensors and transducers

o Explain the methodology of signal conditioning and data acquisition


Able to:

o Explain the fundamentals of measurements and instrumentation

o Explain the working principle of DC & AC meters

o Discuss the operation of the oscilloscope and signal generator

o Describe the working principle of various sensors and transducers


• Part 1 – Measurements

– DC Measurement

– AC Measurement

– Oscilloscope

– Signal generator


• Part 2 – Instrumentation

– Signal conditioning

– Signal transmission

– Sensors

o Northrop R.B., Introduction to Instrumentation and Measurement, 2nd Ed., CRC Press, 2005

o Morris A.S., Measurement and Instrumentation Principles, Butterworth-Heinemann, 2001

o Kalsi H.S., Electronic Instrumentation, 2nd Ed., Tata McGraw-Hill, 2004


• Distribution
  – Final exam: 40
  – Mid-term exam: 20
  – Term activity: 40
      Quizzes (4): 10
      Laboratory: 20
      Attendance, Res. (Lec./Tut.): 10


Introduction to Instrumentation and Measurements


Process of comparing an unknown quantity with an accepted standard quantity

Estimation of the magnitude of some attribute of an object relative to a unit of measurement


Measurement standards

Measurement errors

Accuracy vs. precision

Measurement Uncertainty


Based on the definitions of the seven fundamental SI units of measurement

Categorized into four:

  International standards (SI)

  Primary standards

  Secondary (transfer) standards

  Working standards


Quantity             Symbol   Unit       Unit Symbol
Length               l        meter      m
Mass                 m        kilogram   kg
Time                 t        second     s
Temperature          T        kelvin     K
Electric current     I        ampere     A
Amount of substance  n        mole       mol
Luminous intensity   Iv       candela    cd


Quantity        Symbol   Unit      Unit Abbrev.
Voltage (emf)   V        volt      V
Charge          Q        coulomb   C
Resistance      R        ohm       Ω
Capacitance     C        farad     F
Inductance      L        henry     H

• The electrical units above are derived from the standard unit of measure for electric current


• Deviation of a reading from the expected value of the measured variable

• Extent of measurement error must be stated with the measurement

• Error in measurement is expressed as absolute error or percentage of error


Absolute error (e)

  The difference between the expected (Yn) and the measured (Xn) value of a variable:

  e = Yn − Xn

Percentage of error

  Percent error = (Yn − Xn) / Yn × 100
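As a quick check of these two definitions, a short Python sketch (function names are illustrative, not from the course material):

```python
# Quick check of the error definitions above.
def absolute_error(expected, measured):
    """e = Yn - Xn"""
    return expected - measured

def percent_error(expected, measured):
    """Percent error = (Yn - Xn) / Yn * 100"""
    return (expected - measured) / expected * 100

# An expected 100-V reading measured as 98 V:
print(absolute_error(100, 98))           # 2
print(round(percent_error(100, 98), 2))  # 2.0
```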


• Divided into four categories:

–Gross Errors

–Systematic Errors

–Random Errors

–Limiting Errors

Gross Errors

  Generally the fault of the person using the measuring instrument, e.g. incorrect reading, incorrect recording, incorrect use

  Avoidable; must be identified and minimized, if not eliminated


Systematic Errors

  Probable causes:

    Instrument errors

    Environmental effects

    Observational errors

  Causes should be identified and corrected


Random Errors

o Generally an accumulation of a large number of small inherent causes

o Should be statistically analyzed and reduced

o Call for a more accurate and precise instrument


Limiting Errors

o A manufacturing limitation on the accuracy of an instrument

o Stated as a percentage of full-scale deflection

o Increases as the measured value falls below full-scale deflection


Example:

A 300-V voltmeter is specified to be accurate within ±2% at full scale. Calculate the limiting error when the instrument is used to measure a 120-V source.

The magnitude of the limiting error is

  (2/100) × 300 = 6 V

Therefore, the limiting error at 120 V is

  (6/120) × 100 = 5%

(reading < full scale, so the limiting error increases)
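The same arithmetic in a few lines of Python (variable names are illustrative):

```python
# The worked example above: a 300-V voltmeter, accurate to +/-2% of
# full-scale deflection, used to read a 120-V source.
full_scale = 300.0
accuracy_percent = 2.0
reading = 120.0

limiting_error_volts = accuracy_percent / 100 * full_scale      # 6 V
limiting_error_percent = limiting_error_volts / reading * 100   # 5 %
print(limiting_error_volts, limiting_error_percent)
```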


Accuracy

The degree of exactness of a measurement compared to the expected value

Relative accuracy:

  A = 1 − |(Yn − Xn) / Yn|

Accuracy vs. Precision

• Precision

– A measure of consistency, or repeatability of measurements

  Precision = 1 − |Xn − X̄n| / X̄n

Xn = the value of the nth measurement

X̄n = the average of the set of n measurements

The expected value of the voltage across a resistor is 5.0 V. However, measurement yields a value of 4.9 V. Calculate:

a) absolute error (0.1)

b)% error (2%)

c) relative accuracy (0.98)

d) % accuracy (98%)
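The answers can be reproduced with a short Python sketch (variable names are illustrative):

```python
# Reproducing answers (a)-(d) above.
expected = 5.0   # Yn
measured = 4.9   # Xn

abs_error = expected - measured               # (a) ~ 0.1
pct_error = abs_error / expected * 100        # (b) ~ 2 %
rel_accuracy = 1 - abs(abs_error / expected)  # (c) ~ 0.98
pct_accuracy = rel_accuracy * 100             # (d) ~ 98 %
print(abs_error, pct_error, rel_accuracy, pct_accuracy)
```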


• Probability that a reading falls within an interval that contains the true value

• Confidence level for the margin of error

• Statistically determined

• Reflects instrument imprecision
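A minimal sketch of how such an interval can be statistically determined, assuming normally distributed readings and a 95% confidence level (both assumptions for illustration; the sample readings are illustrative):

```python
import math

# 95% confidence interval for the mean of repeated readings.
# Assumption: readings are normally distributed, and the normal
# z-value 1.96 is used for simplicity.
readings = [50.1, 49.7, 49.6, 50.2]
n = len(readings)
mean = sum(readings) / n
# Sample standard deviation (n - 1 in the denominator, since n < 30):
s = math.sqrt(sum((x - mean) ** 2 for x in readings) / (n - 1))
half_width = 1.96 * s / math.sqrt(n)
print(f"mean = {mean:.1f}, 95% interval = +/-{half_width:.3f}")
```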

o Mean value / arithmetic mean

o Deviation

o Average deviation (D)

o Standard deviation (S)


x̄ = (x1 + x2 + x3 + … + xn) / n = (1/n) Σ xi

n = total number of pieces of data

xi = the value of the ith measurement

The difference between each piece of data and the arithmetic mean


  dn = xn − x̄

* Note:

  dtot = d1 + d2 + … + dn = 0

The average deviation indicates the precision of a measuring instrument:

  - high D → low precision

  - low D → high precision


  D = (|d1| + |d2| + … + |dn|) / n

The degree to which the values vary about the average value

For n < 30:

  S = sqrt( Σ (xi − x̄)² / (n − 1) ) = sqrt( Σ di² / (n − 1) )

For n > 30:

  S = sqrt( Σ di² / n )

For the following data:

x1 = 50.1

x2 = 49.7

x3 = 49.6

x4 = 50.2

compute:

(a) The arithmetic mean (49.9)

(b) The deviation of each value (0.2, −0.2, −0.3, 0.3)

(c) The algebraic sum of the deviations (0)

(d) The average deviation (0.25)

(e) The standard deviation (0.294)
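The answers can be reproduced with a short Python sketch, using the n − 1 form of the standard deviation since n < 30:

```python
import math

# Reproducing answers (a)-(e) for the data set above.
data = [50.1, 49.7, 49.6, 50.2]
n = len(data)

mean = sum(data) / n                                 # (a) ~ 49.9
deviations = [x - mean for x in data]                # (b)
algebraic_sum = sum(deviations)                      # (c) ~ 0
avg_deviation = sum(abs(d) for d in deviations) / n  # (d) ~ 0.25
std_deviation = math.sqrt(                           # (e) ~ 0.294
    sum(d * d for d in deviations) / (n - 1))
print(mean, deviations, algebraic_sum, avg_deviation, std_deviation)
```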


Calibration

• The process of establishing the relation between the indication of a measuring instrument and the value of a measurement standard

• Traceability to international standards

• Calibration improves accuracy
