MSA Example: Attribute or Categorical Data
All Rights Reserved, Juran Institute, Inc. MSA for Continuous Processes 2 .PPT
MSA Operational Definitions
Accuracy: Overall agreement of the measured value with the true value (which may be an “expert” value). Bias plus precision.
Attribute Data: Discrete qualitative data.
Attribute Measurement System: Compares parts to a specific set of criteria and accepts the item if the criteria are satisfied.
Bias: A systematic difference from the true value. Revealed in the differences in averages from the true value.
Precision: Variation in the measurement process.
R&R: Repeatability and Reproducibility. Two elements of precision.
Repeatability: The variation observed when the same operator measures the same item repeatedly with the same device.
Reproducibility: The variation observed when different operators measure the same parts using the same device; it can also be the same operator using different devices.
The Fundamental MSA Question

“Is the variation (spread) of the measurement system too large to study the current level of process variation?”

Total Variability (Observed) = Product/Process Variability + Variation in the measurement process
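The identity above can be checked numerically. A minimal Python sketch using hypothetical standard deviations (note that variances add; standard deviations do not):

```python
import math

# Hypothetical values: variances add, standard deviations do not.
process_sd = 4.0      # true product/process variation
measurement_sd = 3.0  # variation contributed by the measurement system

# Observed (total) variability
total_sd = math.sqrt(process_sd**2 + measurement_sd**2)  # 5.0

# Share of the observed variance consumed by measurement
measurement_share = measurement_sd**2 / total_sd**2      # 0.36
```

Even a measurement system with noticeably less spread than the process (3 vs. 4 here) consumes over a third of the observed variance, which is why the question above must be answered before studying the process itself.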
Bias

[Figure: distribution of measurements; Bias is the gap between the True Value (or Standard) and the Observed Average]

Possible Causes of Bias
Sensor not properly calibrated
Improper use of sensor
Unclear procedures
Human limitations
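Bias can be estimated by measuring a reference standard repeatedly and comparing the average reading to the true value. A minimal sketch (the measurements are hypothetical):

```python
# Hypothetical repeated measurements of a reference standard
# whose true value is known to be 10.00 units.
true_value = 10.00
measurements = [10.12, 10.08, 10.15, 10.05, 10.10]

observed_average = sum(measurements) / len(measurements)
bias = observed_average - true_value  # positive: the sensor reads high on average
```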
Repeatability

Possible Causes of Poor Repeatability
Equipment: gage/instrument needs maintenance; the gage needs to be more rigid
People: environmental conditions (lighting, noise); physical conditions (eyesight)
Reproducibility

[Figure: two distributions; the mean of the measurements of Operator A differs from the mean of the measurements of Operator B]

Possible Causes of Poor Reproducibility
Measurement procedure is not clear
Operator is not properly trained in using and reading the gage
Operational definitions not established
Attribute Measurement Systems Study

Discrete qualitative data
Go/no-go basis, or limited data categories
Compares parts to specific criteria to accept/reject them or place them in a category
Must screen for effectiveness in discerning good parts from bad
At least two appraisers and two trials each
If available, have a Quality Master rate the parts first
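One practical detail of such a study is presenting the parts in random order so appraisers cannot rate from memory. A minimal sketch, assuming 20 parts, two appraisers, and two trials (the counts follow the guidelines above; the seed is arbitrary):

```python
import random

random.seed(7)  # fixed only so the example is reproducible
parts = list(range(1, 21))          # 20 inspection units
appraisers, trials = ["A", "B"], [1, 2]

# A fresh random presentation order per appraiser per trial, so neither
# memory of the prior trial nor part sequence influences the ratings.
run_orders = {(a, t): random.sample(parts, k=len(parts))
              for a in appraisers for t in trials}
```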
Attribute MSA Study

Attribute MSA Example

Part  Master  Appraiser A        Appraiser B
              Trial 1  Trial 2   Trial 1  Trial 2
 1    G       G        G         G        G
 2    G       G        G         G        G
 3    G       NG       G         G        G
 4    NG      NG       NG        NG       NG
 5    G       G        G         G        G
 6    G       G        G         NG       G
 7    NG      NG       NG        G        NG
 8    NG      NG       NG        G        G
 9    G       G        G         G        G
10    G       G        G         G        G
11    G       G        G         G        G
12    G       G        G         G        G
13    G       G        NG        G        G
14    G       G        G         G        G
15    NG      G        G         G        G
16    G       G        G         G        G
17    G       G        G         G        G
18    G       G        G         G        G
19    G       G        G         G        G
20    NG      G        G         G        G

G = Good, NG = Not Good
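The table can be scored directly. A short Python sketch that transcribes the data above and computes the agreement percentages (within each appraiser, each appraiser against the Master rating, and all ratings against the Master):

```python
# Data transcribed from the table above: (Master, A1, A2, B1, B2) per part.
parts = [
    ("G","G","G","G","G"), ("G","G","G","G","G"), ("G","NG","G","G","G"),
    ("NG","NG","NG","NG","NG"), ("G","G","G","G","G"), ("G","G","G","NG","G"),
    ("NG","NG","NG","G","NG"), ("NG","NG","NG","G","G"), ("G","G","G","G","G"),
    ("G","G","G","G","G"), ("G","G","G","G","G"), ("G","G","G","G","G"),
    ("G","G","NG","G","G"), ("G","G","G","G","G"), ("NG","G","G","G","G"),
    ("G","G","G","G","G"), ("G","G","G","G","G"), ("G","G","G","G","G"),
    ("G","G","G","G","G"), ("NG","G","G","G","G"),
]

n = len(parts)
within_a  = sum(a1 == a2 for _, a1, a2, _, _ in parts) / n       # 0.90
within_b  = sum(b1 == b2 for _, _, _, b1, b2 in parts) / n       # 0.90
a_vs_std  = sum(a1 == a2 == m for m, a1, a2, _, _ in parts) / n  # 0.80
b_vs_std  = sum(b1 == b2 == m for m, _, _, b1, b2 in parts) / n  # 0.75
all_vs_std = sum(a1 == a2 == b1 == b2 == m
                 for m, a1, a2, b1, b2 in parts) / n             # 0.65
```

Appraiser A is self-consistent on 90% of parts but matches the Master on both trials for only 16 of 20; requiring both appraisers to agree with the Master drops effectiveness to 65%.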
Challenges of Continuous Process MSA

An MSA study is an experiment
It requires two or more trials to calculate Repeatability
It needs a way to present the inspection units to the appraiser multiple times
This is not possible within the continuous process itself
Case Example: Visual Inspection of Glass

[Diagram: layout of the glass line, showing the catwalk, the glass, the inspector, the cutter, and the packers]
Case Example: Challenges to Overcome
Bias to the standard could be evaluated on-line.
Repeatability and Reproducibility (R&R) could not be evaluated on-line.
A method had to be devised to allow the inspectors to view the same pieces of glass repeatedly.
The solution was an off-line conveyor which simulated the on-line condition as closely as possible.
Case Example: Attribute MSA Method Employed

Twenty pieces of glass, including both good and bad samples, were selected from the process.
A team of people well versed in the quality standard classified each piece of glass as either “pass” or “fail.”
All regular inspectors independently evaluated each piece twice (in random order).
The inspectors used a log sheet to record the data.
Minitab® was used to analyze the data.
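Minitab's attribute agreement analysis reports kappa statistics alongside percent agreement. As a rough illustration of what kappa measures, here is Cohen's kappa for a single rater against the standard, computed from hypothetical pass/fail ratings (not the study's actual data):

```python
def cohens_kappa(rater, standard):
    """Agreement beyond chance between two categorical rating lists."""
    n = len(rater)
    observed = sum(r == s for r, s in zip(rater, standard)) / n
    # Chance agreement implied by each side's category proportions
    chance = sum((rater.count(c) / n) * (standard.count(c) / n)
                 for c in set(rater) | set(standard))
    return (observed - chance) / (1 - chance)

# Hypothetical ratings: 8 of 10 items agree, but much of that
# agreement is expected by chance alone.
standard = ["pass"] * 6 + ["fail"] * 4
rater = ["pass", "pass", "pass", "pass", "pass",
         "fail", "fail", "fail", "fail", "pass"]
kappa = cohens_kappa(rater, standard)  # ~0.58, well below the raw 80% agreement
```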
Case Example: Attribute MSA Study Data

Excerpt of the full data for 20 inspectors.
Case Example: Attribute MSA Study Results

Assessment Agreement
Date of study: 9/13 and 9/19/07
Reported by: Bill Sunderland and Bob Panatieri
Name of product: Solar
Misc: Mainline Marking

[Charts: percent agreement (with 95.0% CI) by appraiser (Larry, Clint, Ronnie, John, Jeannie, Steve, Earl, Tim, Molly, Tony); left panel: Within Appraisers; right panel: Appraiser vs Standard; y-axis: Percent, 40 to 100]
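The confidence intervals in the charts come from Minitab, which uses an exact binomial interval. As a rough stdlib-only approximation, a Wilson score interval gives a similar picture; the 16-of-20 figure below is hypothetical:

```python
import math

def wilson_ci(successes, trials, z=1.96):
    """Approximate 95% confidence interval for a proportion (Wilson score)."""
    p = successes / trials
    denom = 1 + z**2 / trials
    center = (p + z**2 / (2 * trials)) / denom
    half = z * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2)) / denom
    return center - half, center + half

# Hypothetical: an inspector who matched the standard on 16 of 20 pieces.
lo, hi = wilson_ci(16, 20)  # roughly (0.58, 0.92)
```

With only 20 parts the interval spans more than 30 percentage points, which is why the plotted CIs are so wide.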
Case Example: Attribute MSA Study Results (continued)
Case Example: Attribute MSA Study Conclusions

What could have caused the poor agreement?
What was done to improve consistency?