Slide 1

SCREENING TESTS

Dr. Khanchit Limpakarnjanarat

Thailand MOPH – US CDC Collaboration (TUC)

Slide 2

SCREENING TESTS

• Settings: ANC, health check-up, patients with fever, surveillance, other
• Primary prevention may be the best approach to prevent disease occurrence and/or epidemics.
• Two possible approaches to early diagnosis:
  – Depends on awareness of warning signs
  – Active detection of disease in asymptomatic cases

Slide 3

Know the Seven Warning Signs of Cancer…

•Appearance of a lump in a breast or elsewhere

•A change in a mole or wart

•A sore that doesn't heal

•Indigestion or difficulty swallowing

•Nagging cough or hoarseness

•Unusual bleeding

•Persistent respiratory problems

[American Cancer Society]

Slide 4

CAGE screening for alcoholism

• Have you ever felt you should Cut down on your drinking?

• Have people Annoyed you by criticizing your drinking?

• Have you ever felt bad or Guilty about your drinking?

• Have you ever had a drink first thing in the morning to steady your nerves or to get rid of a hangover (Eye-opener)?

Slide 5

The CAGE questions for alcohol abuse

Positive answers to the 4 CAGE questions vs. alcohol abuse:

                            Alcohol abuse
                            YES            NO             Total
3 or 4 positive answers     60 (True +)    1 (False +)      61
2, 1 or 0                   57 (False -)   400 (True -)    457
Total                       117            401             518

Sens. = 60/117 = 0.51     Spec. = 400/401 = 0.998
PVP = 60/61 = 0.98        PVN = 400/457 = 0.88

Sackett DL. A primer on the precision and accuracy of the clinical examination. JAMA 1992;267(19):2638-2644.
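The four figures on this slide follow directly from the four cells of the table. A minimal Python sketch reproducing them (cell values taken from the slide; variable names are mine):

```python
# 2 x 2 cells from the CAGE example (Sackett, JAMA 1992)
a, b, c, d = 60, 1, 57, 400        # true +, false +, false -, true -

sensitivity = a / (a + c)          # 60/117  = 0.51
specificity = d / (b + d)          # 400/401 = 0.998
pvp = a / (a + b)                  # 60/61   = 0.98
pvn = d / (c + d)                  # 400/457 = 0.88

print(f"Sens={sensitivity:.2f}  Spec={specificity:.3f}  PVP={pvp:.2f}  PVN={pvn:.2f}")
```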

Slide 6

Schema relating path of detection to outcome [Dr. Maureen Henderson]

[Diagram, two panels: A. CURRENT SITUATION and B. FUTURE PROJECTION. Each panel shows the elements self-referral, diagnosis, care for chronic disease, surveillance, and recovery.]

Source: Mausner & Bahn, Epidemiology: An Introductory Text, Chapter 9 - Screening in the detection of disease

Slide 7

Screening test

The screening test is the basic tool of a screening program and must be thoroughly understood. Since screening is designed to be applied to large groups of people, screening tests should be easy to use, rapid, and inexpensive. They should also be able to be carried out largely by technicians.

Slide 8

Screening Test

Definition: PRESUMPTIVE identification of unrecognized disease or defect by the application of tests, examinations, or other procedures which can be applied rapidly to sort out apparently well persons who probably have a disease from those who probably do not.

A screening test is not intended to be diagnostic. Persons with positive or suspicious findings must be referred to their physicians for diagnosis and necessary treatment.

[Commission on Chronic Illness, 1951]

Slide 9

Goal of Screening Test

• To reduce morbidity or mortality from the disease among the people screened by early treatment of the cases discovered. (Clinical Medicine)

• To help guide preventive and control measures in general or specific populations. (Epidemiology and Public Health)

Slide 10

Flow diagram for a mass screening test

APPARENTLY WELL POPULATION TO BE TESTED (well persons plus those with undiagnosed disease)
  → SCREENING TEST
    → Negatives (normal): persons presumed to be free of the disease under study → rescreen at prescribed interval
    → Positives (abnormal): persons presumed to have the disease or be at increased risk in future
      (includes both positives with disease present and positives with no disease)
      → DIAGNOSTIC PROCEDURES
        → Disease or risk factor present → THERAPEUTIC INTERVENTION
        → Disease or risk factor absent → rescreen at prescribed interval

Slide 11

PURPOSES OF SCREENING

• DIAGNOSIS
• IDENTIFY TOXIC CHEMICAL AGENTS
• ESTIMATE MAGNITUDE OF DISEASE OR PUBLIC HEALTH CONDITIONS
• IDENTIFICATION OF PEOPLE AT HIGH RISK

Slide 12

PURPOSE OF SCREENING (1)

• DIAGNOSIS: Series of tests performed on a symptomatic patient for whom a diagnosis has not yet been established.

• Example: Patients with hematuria may need UA, urine culture, cystoscopy, bladder biopsy, several types of X-rays, several blood chemistry studies

Slide 13

PURPOSE OF SCREENING (2)

• IDENTIFY TOXIC CHEMICAL AGENTS: Chemical agents may be screened by means of laboratory tests or epidemiologic surveillance in order to identify those substances likely to be toxic.

• Example: Pb poisoning surveillance by Pb screening among children

Slide 14

PURPOSE OF SCREENING (3)

• ESTIMATE MAGNITUDE OF DISEASE OR PUBLIC HEALTH CONDITIONS: Some screening procedures can be used to estimate the prevalence of various conditions, which may lead to disease control objectives. The major methodologic problem in this area is the relationship between 'detected' prevalence and the underlying 'true' prevalence, e.g., sample size, sampling techniques.

• Example: Serologic testing, GenProbe testing, cervical Pap smear, tuberculin skin test, CXR

Slide 15

PURPOSE OF SCREENING (4)

• IDENTIFICATION OF PEOPLE AT HIGH RISK: People at high risk are those who do not yet have the disease. The link between screening for a risk factor and a disease may not be sharp.

• Example: Identify smokers; identify drinkers by the MAST test; hypertension (HT) may be a risk factor for CVD or may itself be early detection of disease

Slide 16

Types of Screening Programs

• Selective screening – tests specific people at risk for the disease

• Mass screening – tests large numbers of people

Slide 17

Selective screening

• Selective screening: Tests are used to detect a specific disease among people who are at risk of having the disease.
  – Single disease: e.g., CXR for pneumoconiosis in coal miners, or FBS for evidence of DM in diabetic patients' relatives
  – Multiphasic screening program: e.g., ANC in pregnant women

Slide 18

Mass screening

• Mass screening: Large numbers of people are tested for the presence of a disease or condition without specific emphasis on their individual risk of having the disease or condition.
  – Single disease: e.g., cervical pathology for cancer of the cervix, mammography for breast cancer
  – Multiphasic screening program: e.g., biochemical profile in a community survey

Slide 19

Lead time and screening test

• Lead time is the time interval from detection by a screening test to the time at which diagnosis would have been made without that screening.

• The length of the lead time interval may vary from person to person (short and long lead times).

• The importance of lead time lies in disease control: early detection and early treatment can prevent spread of disease and disability among affected persons. A screening test is valuable when it reduces severe morbidity and mortality.

Slide 20

Measurements used in screening tests

• Validity – the test is able to differentiate the presence or absence of disease

• Yield – the amount of unrecognized disease brought to diagnosis

• Reliability – consistent results when the test is performed more than once

Slide 21

Creation of 2 x 2 table: initial step for calculation

                           DISEASE "Gold standard"
                           Yes               No
TEST "Screening"   +       a (True pos)      b (False pos)
                   -       c (False neg)     d (True neg)

Slide 22

VALIDITY

• Validity is the degree to which a test is capable of differentiating the presence or absence of the disease concerned.

SENSITIVITY = ability of the test to detect people who actually have the disease (true positives / all persons with the disease)

SPECIFICITY = ability of the test to identify correctly people who actually do not have the disease (true negatives / all persons without the disease)

Slide 23

Validity of screening test

1. Sensitivity = proportion of subjects with disease who have a positive screening test
               = a / (a + c)  or  TP / (TP + FN)

2. Specificity = proportion of subjects without disease who have a negative screening test
               = d / (b + d)  or  TN / (FP + TN)

3. Accuracy of the test = (a + d) / (a + b + c + d)
                        = (TP + TN) / total screened

Slide 24

YIELD

• Yield is the amount of previously unrecognized disease which is diagnosed and brought to treatment as a result of the screening

PREDICTIVE VALUE POSITIVE (PVP) is the likelihood that an individual with a positive test has the disease

PREDICTIVE VALUE NEGATIVE (PVN) is the likelihood that an individual with a negative test does not have the disease

Slide 25

Yield of screening test

• Predictive Value Positive (PVP) = a / (a + b)  or  TP / (TP + FP)

• Predictive Value Negative (PVN) = d / (c + d)  or  TN / (TN + FN)
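The validity and yield measures on slides 21–25 can be collected in one small helper. A minimal Python sketch (the function name and layout are mine, not part of the deck):

```python
def screening_metrics(tp, fp, fn, tn):
    """Validity and yield measures from a 2 x 2 screening table
    (a = tp, b = fp, c = fn, d = tn; disease status by the gold standard)."""
    total = tp + fp + fn + tn
    return {
        "sensitivity": tp / (tp + fn),       # a / (a + c)
        "specificity": tn / (fp + tn),       # d / (b + d)
        "accuracy":    (tp + tn) / total,    # (a + d) / (a + b + c + d)
        "pvp":         tp / (tp + fp),       # a / (a + b)
        "pvn":         tn / (tn + fn),       # d / (c + d)
    }

# Example: the CAGE table from slide 5
print(screening_metrics(tp=60, fp=1, fn=57, tn=400))
```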

Slide 26

RELIABILITY (Precision)

• Reliability is consistency of results when the test is performed more than once on the same individual under the same conditions. It is also called ‘Repeatability’

Slide 27

Reliability / Precision / Repeatability

Reliability = Number of agreed positives / Number positive on either occasion
            = a / (a + b + c)

Slide 28

Trade-off point / Cut-off point
"Criterion of Positivity"

Slide 29

Real situation of a screening test

[Figure: overlapping frequency distributions of test values for persons without disease and persons with disease, divided at the trade-off point into regions a, b, c, and d. Persons with disease = a + c; persons without disease = b + d.]

Slide 30

Hypothetical best screening test

[Figure: frequency distributions of test values for healthy and sick persons that do not overlap at the trade-off point, leaving only cells a and d.]

Slide 31

Shifting of trade-off point A

[Figure: overlapping distributions for healthy and sick persons with the trade-off point shifted to position A, changing the sizes of regions a, b, c, and d.]

Slide 32

Setting of trade-off point A on sensitivity and specificity of HIV EIA assay

[Figure: HIV-free and HIV-positive population distributions with the cut-off set at point A (to the left of B); negative test to the left, positive test to the right. This low cut-off yields FALSE POSITIVES: high SEN / low SPEC.]

Slide 33

Shifting of trade-off point B

[Figure: overlapping distributions for healthy and sick persons with the trade-off point shifted to position B, changing the sizes of regions a, b, c, and d.]

Slide 34

Setting of trade-off point B on sensitivity and specificity of HIV EIA assay

[Figure: HIV-free and HIV-positive population distributions with the cut-off set at point B (to the right of A); negative test to the left, positive test to the right. This high cut-off yields FALSE NEGATIVES: low SEN / high SPEC.]
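The trade-off sketched on slides 29–34 can also be shown numerically. A sketch assuming two normally distributed test-value populations (the means, SDs, and cut-offs below are illustrative only, not data from the deck):

```python
from statistics import NormalDist

# Hypothetical test-value distributions (illustrative values only)
healthy = NormalDist(mu=1.5, sigma=0.5)   # persons without disease
sick    = NormalDist(mu=2.5, sigma=0.5)   # persons with disease

for cutoff in (1.8, 2.0, 2.2):            # "positive" = value above the cut-off
    sens = 1 - sick.cdf(cutoff)           # diseased persons above the cut-off
    spec = healthy.cdf(cutoff)            # non-diseased persons below the cut-off
    print(f"cut-off {cutoff}: sens={sens:.2f}, spec={spec:.2f}")
```

Lowering the cut-off (toward point A) raises sensitivity at the cost of specificity, and raising it (toward point B) does the opposite, which matches the labels on slides 32 and 34.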

Slide 35

Correlation of SCREENING TEST vs. GOLD STANDARD

True results  = True positive (a) and True negative (d)
False results = False positive (b) and False negative (c)

                           DISEASE "Gold standard"
                           Yes               No
TEST "Screening"   +       a (True pos)      b (False pos)
                   -       c (False neg)     d (True neg)

Slide 36

Validity of screening test

1. Sensitivity = proportion of subjects with disease who have a positive screening test
               = a / (a + c)  or  TP / (TP + FN)

2. Specificity = proportion of subjects without disease who have a negative screening test
               = d / (b + d)  or  TN / (FP + TN)

Slide 37

Specificity should be increased relative to sensitivity:

• When a false positive result can harm the patient physically, emotionally, or financially, e.g., HIV infection

• When the cost or risk associated with further diagnostic techniques is substantial, e.g., breast cancer, for which the definitive diagnostic evaluation of a positive screening test is a biopsy

Slide 38

Sensitivity should be increased at the expense of specificity:

• When the penalty associated with missing a case is high, such as when the disease is serious and definitive treatment exists, e.g., PKU

• When the disease can spread, e.g., syphilis

• When subsequent diagnostic evaluations of positive screening tests are associated with minimal cost and risk, e.g., a series of B.P. readings to ascertain HT

Slide 39

PROBLEMS WITH SCREENING TESTS

1. Lack of information on negative tests: prostate-specific antigen (PSA) for prostate cancer

2. Lack of information on the non-diseased: MRI to diagnose prolapsed disk

3. Lack of standards for disease
   • Consequence of imperfect standards: diagnosis of gallstones by U/S vs. cholecystogram

Clinical Epidemiology - KKU

Slide 40

Table: Sensitivity and specificity of blood sugar to diagnose DM [Public Health Service, US, 1960]

Blood sugar level, 2 hrs post-meal (mg%)   Sensitivity (%)   Specificity (%)
 70                                          98.6               8.8
 80                                          97.1              25.5
 90                                          94.3              47.6
100                                          88.6              69.8
110                                          85.7              84.1
120                                          71.4              92.5
130                                          64.3              96.9
140                                          57.1              99.4
150                                          50.0              99.6
160                                          47.1              99.8
170                                          42.9             100.0
180                                          38.6             100.0
190                                          34.3             100.0
200                                          27.1             100.0
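Each cut-off in the table gives one (sensitivity, specificity) pair, which is exactly the input for the ROC curve on the next slide. A minimal sketch that turns the table into ROC coordinates (values copied from the table; the layout is mine):

```python
# (cut-off mg%, sensitivity %, specificity %) from the table above
rows = [
    (70, 98.6, 8.8),   (80, 97.1, 25.5),  (90, 94.3, 47.6),  (100, 88.6, 69.8),
    (110, 85.7, 84.1), (120, 71.4, 92.5), (130, 64.3, 96.9), (140, 57.1, 99.4),
    (150, 50.0, 99.6), (160, 47.1, 99.8), (170, 42.9, 100.0), (180, 38.6, 100.0),
    (190, 34.3, 100.0), (200, 27.1, 100.0),
]

# ROC coordinates: x = 1 - specificity (false positive rate), y = sensitivity
for cutoff, sens, spec in rows:
    print(f"{cutoff:>3} mg%:  FPR = {100 - spec:5.1f}%   TPR = {sens:5.1f}%")
```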

Slide 41

ROC of accuracy of the blood sugar test (2 hours post-meal) to diagnose DM [Public Health Service, Diabetes Program Guide, Publ. No. 506, Washington DC, US Government Printing Office, 1960]

[Figure: ROC curve plotting sensitivity (%) [true positives] against 1 - specificity (%), with the chosen diagnosis point marked.]

Slide 42

Combination of tests

To enhance the sensitivity or specificity of the screening test:

• Tests in series: a person is called "positive" if he tests positive on all of a series of tests, and "negative" if he tests negative on any. This enhances the SPECIFICITY of the testing.

• Tests in parallel: a person is labeled "positive" if he tests positive on any of the tests, and "negative" if he tests negative on all. This enhances the SENSITIVITY of the testing.
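Under the common simplifying assumption that the component tests err independently of one another (an assumption of this sketch, not a statement from the deck), the combined sensitivity and specificity can be written down directly:

```python
from math import prod

def combined(sens, spec, mode):
    """Net sensitivity and specificity of several tests combined,
    assuming the tests err independently of one another."""
    if mode == "series":      # positive = ALL tests positive
        return prod(sens), 1 - prod(1 - sp for sp in spec)
    if mode == "parallel":    # positive = ANY test positive
        return 1 - prod(1 - se for se in sens), prod(spec)
    raise ValueError(mode)

# Two hypothetical tests, each 90% sensitive and 90% specific
print(combined([0.9, 0.9], [0.9, 0.9], "series"))    # ~(0.81, 0.99): specificity up
print(combined([0.9, 0.9], [0.9, 0.9], "parallel"))  # ~(0.99, 0.81): sensitivity up
```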

Slide 43

MULTIPLE TESTS CONCEPT

Serial testing:    A + and B + and C +   →  positive = all tests positive
                   (sensitivity down, specificity up)

Parallel testing:  A + or B + or C +     →  positive = any test positive
                   (sensitivity up, specificity down)

Slide 44

Perinatal HIV Outcome Monitoring System (PHOMS)

• Criteria for diagnosis of HIV status in children = uninfected
  – HIV antibody negative at least 1 time, in any age group (serial or parallel)
  – PCR negative at least 2 times at different intervals, and the last test must be after 2 months of age (serial or parallel)

Serial to increase specificity
Parallel to increase sensitivity

Slide 45

Perinatal HIV Outcome Monitoring System (PHOMS)

• Criteria for diagnosis of HIV status in children = infected
  – HIV antibody positive at least 2 times with different techniques, age > 18 months (serial or parallel)
  – PCR positive at least 2 times at different intervals, in any age group (serial or parallel)

Serial to increase specificity
Serial to increase specificity

Slide 46

Yield of screening test

• Predictive Value Positive (PVP) is the likelihood that an individual with a positive test has the disease
  PVP = a / (a + b)  or  TP / (TP + FP)

• Predictive Value Negative (PVN) is the likelihood that an individual with a negative test does not have the disease
  PVN = d / (c + d)  or  TN / (TN + FN)

These measurements are useful to the M.D., especially PVP.

Slide 48

Binomial Mathematical Model

If   p     = prevalence of disease
     sens. = sensitivity of test
     spec. = specificity of test

Then:
     PVP = p(sens) / [ p(sens) + (1 - p)(1 - spec) ]

     PVN = (1 - p)(spec) / [ (1 - p)(spec) + p(1 - sens) ]

PV (yield) can be affected by prevalence and specificity, and is only slightly affected by sensitivity.
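These two expressions are Bayes' theorem applied to the screening table; a minimal sketch (function names are mine):

```python
def pvp(prev, sens, spec):
    """Predictive value positive from prevalence, sensitivity, specificity."""
    return prev * sens / (prev * sens + (1 - prev) * (1 - spec))

def pvn(prev, sens, spec):
    """Predictive value negative from prevalence, sensitivity, specificity."""
    return (1 - prev) * spec / ((1 - prev) * spec + prev * (1 - sens))
```

The next two slides apply exactly this calculation to two populations that differ only in prevalence.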

Slide 49

Results of a screening test in two different populations: sensitivity = .99, specificity = .99

Population A (prevalence = 100,000 / 1,000,000 = 0.10)

           Disease+     Disease-      Total
Test+        99,000        9,000    108,000
Test-         1,000      891,000    892,000
Total       100,000      900,000  1,000,000

PVP = TP / (TP + FP) = 99,000 / (99,000 + 9,000) = .917 = 91.7%

PVP = (p)(sens) / [(p)(sens) + (1 - spec)(1 - p)] = (.1)(.99) / [(.1)(.99) + (1 - .99)(1 - .1)] = .917

Slide 50

Results of a screening test in two different populations: sensitivity = .99, specificity = .99

Population B (prevalence = 1,000 / 1,000,000 = 0.001)

           Disease+     Disease-      Total
Test+           990        9,990     10,980
Test-            10      989,010    989,020
Total         1,000      999,000  1,000,000

PVP = TP / (TP + FP) = 990 / (990 + 9,990) = .090 = 9.0%

PVP = (p)(sens) / [(p)(sens) + (1 - spec)(1 - p)] = (.001)(.99) / [(.001)(.99) + (1 - .99)(1 - .001)] = .090
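A short check of both populations using the Bayes expression from slide 48 (the helper here is self-contained; default arguments are the slide's sensitivity and specificity):

```python
def pvp(prev, sens=0.99, spec=0.99):
    return prev * sens / (prev * sens + (1 - prev) * (1 - spec))

print(f"Population A (prev 10%):  PVP = {pvp(0.10):.3f}")    # 0.917
print(f"Population B (prev 0.1%): PVP = {pvp(0.001):.3f}")   # 0.090
```

The test is identical in both populations; only the prevalence changes, yet PVP drops from 91.7% to 9.0%.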

Slide 51

Relationship between prevalence of disease and predictive value, with sensitivity and specificity held constant at 95% [adapted from Vecchio, 1966]

[Figure: predictive value (%) vs. prevalence of disease (%); the predictive value of a positive test rises steeply as prevalence increases, while the predictive value of a negative test declines.]

Slide 52

PVP as a function of prevalence, sens = .99

[Figure: predictive value of a positive test (%) vs. prevalence of disease (%) for specificity = .99, .90, and .80; PVP rises with prevalence and is higher at higher specificity.]

Slide 53

PVN as a function of prevalence, spec = .99

[Figure: predictive value of a negative test (%) vs. prevalence of disease (%) for sensitivity = .99, .90, and .80; PVN falls as prevalence rises and is higher at higher sensitivity.]

Slide 54

Reliability / Precision / Repeatability

Reliability = Number of agreed positives / Number positive on either occasion
            = a / (a + b + c)

Slide 55

Screening of breast cancer by mammography

           Cancer+    Cancer-     Total
Test+           31        108       139
Test-           24     20,048    20,072
Total           55     20,156    20,211

Reliability = a / (a + b + c) = 31 / (31 + 24 + 108) = 31 / 163 = 19.0%
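A one-line check of the figure above (cell labels as in the 2 x 2 layout on slide 21):

```python
a, b, c = 31, 108, 24              # a = positive in both, b and c = positive in only one
print(f"{a / (a + b + c):.1%}")    # 19.0%
```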

Slide 56

Four sources of variability

• Biological variation (specimens)

• Variation due to the test method or measurement (test)

• Intra-observer variation (examiner)

• Inter-observer variation (examiners)

Slide 57

Increase reliability through:

• Standardization of procedures

• Intensive training of observers

• Periodic quality control

• Use of 2 or more observers making independent observations

Slide 58

Reliability and Validity of Instruments

A – reliable and valid          C – reliable but not valid
B – not reliable but valid      D – not reliable and not valid

[Figure: four frequency distributions of measurements (A–D) plotted against the true value X, illustrating each combination of reliability and validity.]

Slide 59

2 x 2 table of screening tests, definitions and formulas

                          Disease
                          present              absent
Test   positive     a  True positive      b  False positive     a + b
       negative     c  False negative     d  True negative      c + d
                    a + c                 b + d                 a + b + c + d

Sensitivity = a / (a + c)                 Specificity = d / (b + d)
+PV = a / (a + b)                         -PV = d / (c + d)
Accuracy   = (a + d) / (a + b + c + d)
Prevalence = (a + c) / (a + b + c + d)

Slide 60

Example 1: Screening of breast cancer by clinical examination

                  Breast cancer
                  present    absent     Total
Clin exam +           34        156       190
Clin exam -           21     20,000    20,021
Total                 55     20,156    20,211

Sensitivity = 34/55 x 100 = 61.8%          Specificity = 20,000/20,156 x 100 = 99.2%
PVP = 34/190 x 100 = 17.9%                 PVN = 20,000/20,021 x 100 = 99.9%

Slide 61

Example 2: Screening of breast cancer by mammography

                  Breast cancer
                  present    absent     Total
Mammography +         31        108       139
Mammography -         24     20,048    20,072
Total                 55     20,156    20,211

Sensitivity = 31/55 x 100 = 56.4%          Specificity = 20,048/20,156 x 100 = 99.5%
PVP = 31/139 x 100 = 22.3%                 PVN = 20,048/20,072 x 100 = 99.9%

Slide 62

Example 3: Screening of breast cancer by clinical examination followed by mammography

                  Breast cancer
                  present    absent     Total
Mammography +         19          1        20
Mammography -         15        155       170
Total                 34        156       190

PVP = 19/20 x 100 = 95%
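Examples 1–3 together illustrate testing in series: mammography is applied only to the 190 women who were positive on clinical examination, and PVP rises from 17.9% to 95%. A short check using the counts from the two tables (variable names are mine):

```python
# Stage 1: clinical examination (slide 60) – the 190 test-positives are referred on
tp1, fp1 = 34, 156                 # cancer / no cancer among the 190 positives
# Stage 2: mammography applied to those 190 women (slide 62)
tp2, fp2 = 19, 1                   # cancer / no cancer among the 20 positives

print(f"PVP, clinical exam alone:   {tp1 / (tp1 + fp1):.1%}")   # 17.9%
print(f"PVP, both tests in series:  {tp2 / (tp2 + fp2):.0%}")   # 95%
```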

Slide 63

Example of prevalence and PV
Given: Sensitivity = 95%, Specificity = 95%

Suppose prevalence = 10%

           True+     True-     Total
Test +       950       450     1,400
Test -        50     8,550     8,600
Total      1,000     9,000    10,000

PVP = 950/1,400 = 67.9%
PVN = 8,550/8,600 = 99.4%

Suppose prevalence = 20%

           True+     True-     Total
Test +     1,900       400     2,300
Test -       100     7,600     7,700
Total      2,000     8,000    10,000

PVP = 1,900/2,300 = 82.6%
PVN = 7,600/7,700 = 98.7%

Slide 64

Breast cancer mortality at different times after start of follow-up, screened group (mammography) vs. controls

                   No. of women     No. of deaths (from start of follow-up)
                   with cancer      5 years     10 years     18 years
Screened group         307              39           95          126
Control group          310              63          133          163

% difference                          38.1         28.6         22.7

Source: Shapiro 1989
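The "% difference" row is the relative reduction in breast-cancer deaths in the screened group compared with the controls; a quick check, assuming that definition (which the slide does not spell out):

```python
screened = {5: 39, 10: 95, 18: 126}     # deaths by years of follow-up
control  = {5: 63, 10: 133, 18: 163}

for yrs in (5, 10, 18):
    reduction = (control[yrs] - screened[yrs]) / control[yrs]
    print(f"{yrs:>2} years: {reduction:.1%} fewer deaths")   # 38.1%, 28.6%, 22.7%
```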

Slide 65

SimpliRED vs. WB and EIA vs. WB

SimpliRED vs. WB:
                  WB+      WB-
SimpliRED +       362        1
SimpliRED -         0    2,133
Sensitivity = 100%, Specificity = 99.95%

EIA vs. WB:
                  WB+      WB-
EIA +             362        9
EIA -               0    2,125
Sensitivity = 100%, Specificity = 99.58%

Slide 66

Conclusion of SimpliRED and EIA

• SimpliRED is as sensitive and specific as EIA, but more expensive

• It had excellent correlation with the gold standard WB

• It provided rapid, accurate and on-site HIV status identification

• It required no equipment and minimal training

• For this study, it saved unnecessary CD4 testing of HIV samples

Slide 68

Principles of good screening programs (1)

1. The condition being sought is an important health problem for the individual and the community. Since screening requires the commitment of large amounts of money, manpower, and other resources, screening should be undertaken only when it has the potential to lead to a significant decrease in rates of disability or death or both

Slide 69

Principles of good screening programs (2)

2. There is an acceptable form of treatment for patients with recognizable disease. The goal of screening is to prevent disability or death or both; if there is no generally accepted treatment, it is premature to embark on a screening program.

3. The natural history of the condition, including its development from latent to declared disease, is adequately understood. This is perhaps the most crucial of all the criteria in determining the feasibility of screening.

Slide 70

Principles of good screening programs

4. There is a recognizable latent or early symptomatic stage

5. There is a suitable screening test or examination for detecting the disease at the latent or early symptomatic stage, and this test is acceptable to the population

Slide 71

Principles of good screening programs

6. The facilities required for diagnosis and treatment of patients revealed by the screening program are available. Many screening programs have had little effect because planning for them did not include adequate and effective mechanisms for follow-up of positives.

7. There is an agreed policy on whom to treat as patients.

8. Treatment at the pre-symptomatic, borderline stage of a disease favorably influences its course and prognosis.

Slide 72

Principles of good screening programs

9. The cost of the screening program (including the cost of diagnosis and treatment) is economically balanced in relation to possible expenditure on medical care as a whole.

10. Case finding is a continuing process, not a "once and for all" project. Some conditions, e.g., phenylketonuria, need be screened for only once, early in life; others should be monitored repeatedly. When repeated screening is necessary, empirical studies are needed to determine the optimal interval between screenings.

[Wilson and Jungner, WHO 1968]

Slide 73

Criteria for instituting a screening program

Disease        - Serious
               - High prevalence of pre-clinical stage
               - Natural history understood
               - Long period between first signs and overt disease

Pre-test       - Sensitive and specific
               - Simple and cheap
               - Safe and acceptable
               - Reliable

Diagnosis      - Facilities are adequate
and treatment  - Effective, acceptable, and safe treatment available

Slide 78

Questions ?

Slide 79

Guideline for patient care (Exercise: AI screening test)

Screening criteria:
- Fever with cough or sore throat
- History of sick/dead poultry in the past 7 days
- History of pneumonia exposure in the past 10 days
- Lives in a village with sick/dead poultry in the past 14 days

[Flow chart: Patient → OPD reception → isolated OPD (PE / history / lab); known cause → specific Rx; unknown cause → CXR, rapid test, viral study. Subsequent management (antiviral treatment, admission, symptomatic Rx, home care as appropriate) is guided by CXR findings, rapid test result, and symptom severity, with notification to surveillance.]

Slide 80

Slide 81

Examples of Rapid Tests

• Directigen Flu A + B
• FLU OIA
• QuickVue Influenza Test
• ZstatFlu

Slide 82

Factors Affecting Rapid Test Performance

• Test factors
  – Sensitivity: proportion of positive tests by gold standard that are also positive by the screening test (true positives)
  – Specificity: proportion of negative tests by gold standard that are also negative by the screening test (true negatives)
  – Prevalence: proportion of the tested population with influenza

• Other factors
  – Type and quality of respiratory specimen
  – Day of illness when specimen was obtained
  – Compliance with test procedures
  – Interpretation of result

Slide 83

Summary of Published Data on Performance of Rapid Influenza Tests*

• Sensitivity: median = 70-75%
• Specificity: median = 90-99%
• Calculated under ideal conditions
• Most data are from children with influenza A (H1N1) or A (H3N2)
• Sensitivity to detect influenza A > B
• No published data on H5N1

*Uyeki T. Pediatr Infect Dis J 2003;22:164-77.

Slide 84

New Uses for Rapid Tests in Thailand

• Febrile respiratory illness outbreak investigation

• Research
  – Outpatient disease burden
  – Seasonality
  – Cost of illness

• Expanded human influenza surveillance

• H5N1 avian influenza clinical management

Slide 85

Seasonality of Outpatient Influenza Using Rapid Tests, 2003-2004

[Figure: bar chart of the proportion of rapid tests positive by month, August through July; monthly proportions range from 0% to 45%.]