
Application of Simulation Software to Coordinate Measurement Uncertainty Evaluations

Jon M. Baldwin, Kim D. Summerhays, Daniel A. Campbell and Richard P. Henke
MetroSage, LLC, 26896 Shake Ridge Road, Volcano, CA 95689 USA
Email: [email protected]

Abstract: Uncertainty evaluations for coordinate measuring machine (CMM) metrology are problematic due to the number, ranges, interactions and generally unknown sensitivity coefficients of the parameters that can influence the measurement result. The situation is particularly difficult when a task-specific uncertainty is required, and it poses problems for both auditors and metrology practitioners. Auditors often lack satisfactory tools for a comprehensive assessment of a client's claims of traceability. Measurement professionals, similarly, have difficulty demonstrating compliance with measurement traceability requirements and, in addition, can find themselves at a real economic disadvantage if reliable measurement uncertainties are not known. In this paper, the historical perspective of, the motivations for, and the necessity of task-specific uncertainty evaluations are briefly discussed. This is followed by a presentation of the requirements and desirable features of a credible method for task-specific CMM uncertainty evaluation. Next, the major design features of a practical software application for evaluating uncertainties of CMM measurements are described. The paper concludes with several application examples and case studies which demonstrate that, in the arena of task-specific CMM uncertainty evaluation, simulation methods exhibit notable strength and versatility.

1. Introduction
One of the significant developments in coordinate metrology over the last dozen or so years has been a growing interest in evaluating the uncertainty of results produced by three-dimensional (3D) measuring systems, particularly coordinate measuring machines (CMMs). Several motivations underlie this developing concern. Principal drivers of this activity include the steadily increasing employment of CMMs as tools for product and process assessment, a generally increased concern for product quality, greater globalization of trade, increased competition in the manufacturing environment and increasingly tighter tolerances for manufactured goods. Two reflections of this interest have been a growing body of research on the topic and the recent appearance of national and international standards dealing with, or requiring the evaluation of, measurement uncertainty.

In this paper, the motivations and available methods for evaluating task-specific uncertainties (i.e., an uncertainty applicable to a specific geometric dimensioning and tolerancing (GD&T) parameter of a designated part feature, under particular conditions of manufacture and measurement) in complex systems, such as CMMs, are discussed, pointing out the strengths and weaknesses of each. Then, simulation methods as applied to this problem are presented, with an emphasis on the necessary and desirable features of a software application for CMM uncertainty evaluation. Next, an implementation of software that embodies these features is described. Finally, a small set of application examples and case studies is presented that demonstrates the validity and utility of simulation methods for task-specific uncertainty evaluation of coordinate measurement systems.

2. Task-Specific Uncertainty: Historical Perspective and Motivation

There has been an interest on the part of researchers, going back at least to the early 1990s [1], in methods to derive task-specific measurement uncertainty evaluations from more general CMM performance parameters, and this effort has since been carried forward by workers in several countries. [2-6]

Generic CMM performance indices have been available for some time, the most prominent being those issued by the International Organization for Standardization (ISO) [7] and by the American Society of Mechanical Engineers (ASME). [8] While generic CMM performance tests such as these are valuable in comparative assessments of coordinate measurement systems, they are incapable of evaluating task-specific measurement uncertainty.

Meanwhile, task-specific uncertainty evaluation has become firmly established as a necessary component process in demonstrating measurement traceability to national and international standards. ISO 17025 [9] emphasizes the importance of uncertainty evaluation and, most importantly, states that traceability is achieved by means of an unbroken chain of calibrations or comparisons which include the measurement uncertainty of each step. Similarly, ANSI/NCSL Z540 [10] states, "Calibration certificates and/or reports shall…state the traceability to…standards of measurement and shall provide the measurement results and associated uncertainty of measurement…" ASME B89.7.5 [11] sets out in explicit detail the requirements for traceability of dimensional measurements.

The economic importance of uncertainty evaluation has been further emphasized in recent standards, notably ISO 14253-1 [12] and ASME B89.7.3.1 [13], which provide guidance for the formulation of decision rules to govern the acceptance or rejection of articles of commerce. These standards allow for possible economic penalties for greater measurement uncertainty.

3. CMMs and Methods for Evaluating Task-Specific Uncertainty

The major factor that inspires the widespread application of CMMs in industrial dimensional metrology is their extreme versatility; no other dimensional measuring device can determine such a large variety of parameters on as large a range of workpiece types and sizes. Unfortunately, this same versatility leads to difficulty when it is necessary to state measurement uncertainty. In addition to the many different measurands that are evaluated in a typical CMM measurement session, one is confronted with almost unlimited sources of variability in the conditions of measurement: workpiece location and orientation, sensor type(s) and configuration(s), environment, sampling strategy and computational considerations, to name just a few. This tremendous variability is the reason that typical CMM calibrations and performance tests cannot directly produce task-specific measurement uncertainties.

This complexity can be further appreciated by referring to Fig. 1, which shows the CMM measurement traceability chain in its entirety. Traceability is established in a multi-step process, going all the way back in an unbroken sequence to national or international standards. Each step contributes an uncertainty that must be considered in developing the final, task-specific uncertainty and traceability statements. Some of these steps are relatively straightforward, and some information may be available to the CMM user with little required effort. For example, the uncertainties of the steps up through the artifact calibration are likely assessed by national or international measurement institutions and/or qualified calibration sources, and should be captured on the artifact calibration certificate. Others, notably the CMM calibration and workpiece measurement, are complicated and are of necessity left in the hands of the CMM user, who might not be well equipped for their evaluation.

The available methods for CMM uncertainty evaluation have been summarized in a draft ISO technical report. [14] They are:

Figure 1. Sources of uncertainty in the CMM traceability chain. [Figure: the chain runs from the atomic clock through an iodine-stabilized laser, metrology laser and artifact calibrator to the CMM calibration and finally the workpiece measurement and uncertainty statement, which establish measurement traceability; the workpiece measurement step accrues contributions from the point sampling strategy, CMM and probing errors, thermal errors and part form error.]


1. Sensitivity Analysis, which involves listing each uncertainty source, its magnitude, its effect on the measurement result and its correlation with other uncertainty sources, then combining them in a manner that accounts for the sensitivity factor of each source and the interactions between sources. This is the approach described in the ISO Guide to the Expression of Uncertainty in Measurement (GUM) [15] and is particularly useful if a mathematical model of the measuring process is available, because direct computation of the sensitivity coefficients is then possible.

2. Expert Judgment, which may be the only available method if a mathematical model or measurement data are not available. Its limitations in producing a defendable uncertainty statement are evident.

3. Substitution, wherein repeated measurement of a calibrated master workpiece yields a range of errors and thus the uncertainty. This is a powerful method of capturing the relevant error sources and their interactions. Its major disadvantages are expense (the need for multiple master parts) and a reduction of the range of utility of the CMM.

4. Computer Simulation, where virtual models of the CMM, the workpiece and the measurement process are created, and estimates of the individual error sources are provided. These data are then applied in repeated virtual measurements. The distributions of the task-specific results yield estimates of both bias and measurement variability, and hence uncertainty. Simulation methods are discussed in a new supplement to the GUM. [16] A minimal illustration of this approach follows this list.

5. Measurement History, which is useful if large numbers of measurements over time are available. This method can place an upper bound on measurement uncertainty. It fails to detect measurement bias.
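As a concrete illustration of method 4, the sketch below runs a toy Monte Carlo uncertainty evaluation in Python. A nominally 25 mm diameter circle is "measured" repeatedly at eight probing points, each point perturbed by an assumed random probing error, and the spread of the least-squares fitted diameters estimates the task-specific uncertainty. The part, the error magnitude and the single-error-source model are hypothetical stand-ins for the much richer models discussed later in this paper.

```python
# Toy Monte Carlo evaluation of a task-specific uncertainty (the
# diameter of a circle). Illustrative only: one random error source
# and a simple least-squares circle fit stand in for full CMM,
# probe, thermal and form-error models.
import math
import random

NOMINAL_DIAMETER = 25.0   # mm (hypothetical part)
PROBE_SIGMA = 0.005       # mm, assumed random probing error (1 sigma)
N_POINTS = 8              # probing points per simulated inspection
N_TRIALS = 1000           # simulated inspections

def fit_circle(points):
    """Least-squares circle fit (Kasa method); returns (xc, yc, r).
    Solves x^2 + y^2 = a*x + b*y + c via its 3x3 normal equations."""
    sxx = sxy = syy = sx = sy = sxz = syz = sz = 0.0
    n = float(len(points))
    for x, y in points:
        z = x * x + y * y
        sxx += x * x; sxy += x * y; syy += y * y
        sx += x; sy += y
        sxz += x * z; syz += y * z; sz += z
    m = [[sxx, sxy, sx, sxz],
         [sxy, syy, sy, syz],
         [sx,  sy,  n,  sz]]
    for i in range(3):                      # forward elimination
        for j in range(i + 1, 3):
            f = m[j][i] / m[i][i]
            for k in range(4):
                m[j][k] -= f * m[i][k]
    c = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                     # back substitution
        c[i] = (m[i][3] - sum(m[i][k] * c[k] for k in range(i + 1, 3))) / m[i][i]
    xc, yc = c[0] / 2.0, c[1] / 2.0
    return xc, yc, math.sqrt(c[2] + xc * xc + yc * yc)

diameters = []
for _ in range(N_TRIALS):
    pts = []
    for i in range(N_POINTS):
        theta = 2.0 * math.pi * i / N_POINTS
        r = NOMINAL_DIAMETER / 2.0 + random.gauss(0.0, PROBE_SIGMA)
        pts.append((r * math.cos(theta), r * math.sin(theta)))
    diameters.append(2.0 * fit_circle(pts)[2])

mean_d = sum(diameters) / N_TRIALS
bias = mean_d - NOMINAL_DIAMETER
u = math.sqrt(sum((d - mean_d) ** 2 for d in diameters) / (N_TRIALS - 1))
print(f"bias = {bias * 1000:+.2f} um, u(diameter) = {u * 1000:.2f} um")
```

In a real evaluation the per-point perturbation would come from models of the CMM geometry, probe, environment and workpiece form rather than a single Gaussian, but the structure (perturb, re-fit, accumulate the distribution of the measurand) is the same.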

Regardless of the method chosen to evaluate CMM uncertainty, there are a few requirements for a credible method. A minimum set of requirements is:

1. The chosen method must be comprehensive, i.e., all the major influence variables must be considered.

2. All necessary GD&T parameters must be supported.

3. The evaluations of those parameters must conform to the definitions established by the appropriate national and international standards.

4. It must produce accurate and reliable results.

Several other qualities are highly desirable. They include:

1. The method should be versatile, supporting a useful variety of CMM and probing system error models, and workpiece and CMM thermal models.

2. It should demonstrate fidelity by allowing realistic construction of measurement scenarios and metrology hardware configurations, and correct choice of geometric fitting algorithms.

3. It should be interoperable, accepting data from legacy sources, e.g., existing workpiece designs and inspection programs, and providing a defined interface for communicating uncertainty information with other applications.

4. It should be flexible, offering a spectrum of tradeoffs between cost of system characterization and quality of the resulting uncertainty evaluations.

Table 1 presents a comparison of the five techniques as they apply to evaluation of task-specific uncertainty of CMM measurements, rating them according to seven important properties.

Table 1. Uncertainty method scorecard for 3D metrology. Each method is rated on seven desirable attributes of an uncertainty evaluation method ("?" indicates questionable or case-dependent performance).

| Evaluation Method    | Tractable | Comprehensive | Detects Measurement Bias | Detects Measurement Variability | Versatile | Predictive | Economical |
|----------------------|-----------|---------------|--------------------------|---------------------------------|-----------|------------|------------|
| Sensitivity Analysis | ?         | ?             | Strong                   | Strong                          | Weak      | Strong     | Weak       |
| Expert Judgment      | Strong    | ?             | ?                        | ?                               | Weak      | Strong     | ?          |
| Substitution         | Strong    | Strong        | Strong                   | Strong                          | Weak      | Weak       | Weak       |
| Computer Simulation  | Strong    | ?             | Strong                   | Strong                          | Strong    | Strong     | ?          |
| Measurement History  | Strong    | Strong        | Weak                     | ?                               | Weak      | Weak       | Weak       |


Sensitivity analysis is rated questionable as regards tractability and comprehensiveness, due to the need for explicit information on the standard deviation and sensitivity factor for every uncertainty source and on the correlation between every pair of uncertainty sources. In some cases sensitivity coefficient calculation is impossible, since the measuring process cannot always be described analytically. Sensitivity analysis is rated poor with regard to cost, due to the labor-intensive nature of the process, and poor from a versatility perspective, since much of the analysis is unique to each individual application. Its strength is that, properly conducted, sensitivity analysis does a thorough job of detecting both bias and variability.

The strength of expert judgment is its tractability; it can be applied to any situation where the "expert" is confident. Conversely, the comprehensiveness of expert judgment is difficult to evaluate and open to question. Cost can vary widely. The versatility of the method is not great; a separate consultation may be required for each and every application. The ability to detect bias and variability is a strong function of the quality of the expert; coverage of these issues is difficult to document and defend.

Substitution performs well except in the areas of cost and versatility, which are weaknesses of the method. Since calibration of the artifact is required and, in typical CMM applications, a wide variety of parts must be measured, this method is generally uneconomical. Similarly, the requirement of an artifact reasonably similar, if not identical, to the workpiece means that the method lacks versatility.

Measurement history is by definition tractable and comprehensive. There are ongoing costs associated with maintaining and preserving the measurement database. Its scope is limited to a particular workpiece. Measurement bias goes undetected. The method detects measurement variability but cannot distinguish it from production process variability.

Computer simulation is easily applied to a wide range of problems; generally, all the information required to set up a simulation is available from the workpiece and measurement process designs. It can be variable in comprehensiveness; the result is only as complete as the model. This is primarily a concern to be dealt with during initial selection of the simulation software. Versatility and tractability of simulation methods go hand in hand; both depend on the same information sources. Finally, simulation methods easily capture both bias and variability of the measurement process. Simulation shares the strengths of sensitivity analysis but often allows a more complete assessment of interactions between error sources.

4. Influence Quantities on CMM Measurements

Earlier, it was mentioned that one of the principal sources of difficulty in evaluating measurement variability is the number and interactions of variables that can affect a CMM measurement. It is now necessary to visit this topic more explicitly and in its full complexity. CMM measurement influence quantities can reasonably be categorized as illustrated in Fig. 2. Within this categorization there are further levels of complexity, as detailed in Table 2. It is important to note that this table may not be comprehensive, nor are all the influence variables listed likely to be important in every instance. The significant point is that in any CMM measurement the sources of variation will be many and their interactions will frequently be complex and beyond the reach of analytical treatment. It is likely that in most cases a few of these errors will predominate.

Due to wide variations in measurement systems, environments and measurement objectives, and to the prevalence of interactions between many of these influence variables, it is not practical to offer extensive generalizations concerning their relative importance. Thermal effects are commonly significant, as can be uncompensated geometric errors in the CMM. Dynamic geometric effects become more significant as measuring speed is increased. Simple examples of the interaction between workpiece form errors and sampling strategies are well known. [17, 18]

5. Levels of CMM Uncertainty
It is also necessary to deal with the fact that our knowledge of an uncertainty source is often incomplete. This can be illustrated by one example: CMM rigid body geometric errors, frequently one of the most important error sources. For each axis, six functions of axis position are needed: position error along the axis, straightness in the two orthogonal directions, roll, pitch, and yaw. Three scalar parameters, the out-of-squareness values for each axis pair, are also required, giving a total of 21 items in all for a three-axis Cartesian CMM. Introduction of a rotary axis would require a similar set of six functions plus two additional squareness parameters. The discussion and examples cited in this paper address the most common case of three-axis Cartesian CMMs, but are extendable in principle to other CMM geometries.
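To fix ideas, the sketch below shows one way the 21-item description could be organized in code. It is a hypothetical illustration only: real rigid-body models differ in axis stacking, sign conventions and the angular (Abbe offset) terms, which are deliberately omitted here.

```python
# Hypothetical container for the 21-parameter rigid-body model of a
# three-axis Cartesian CMM: six error functions per axis (18 functions)
# plus three out-of-squareness scalars. point_error() shows only
# representative first-order terms; a complete model would add the
# angular contributions with the lever arms of the actual axis stack.
from dataclasses import dataclass, field
from typing import Callable

ErrorFn = Callable[[float], float]   # an error as a function of axis position

def zero(_: float) -> float:
    return 0.0

@dataclass
class AxisErrors:
    linear: ErrorFn = zero       # positioning error along the axis
    straight_1: ErrorFn = zero   # straightness, first orthogonal direction
    straight_2: ErrorFn = zero   # straightness, second orthogonal direction
    roll: ErrorFn = zero         # angular errors (radians)
    pitch: ErrorFn = zero
    yaw: ErrorFn = zero

@dataclass
class RigidBodyModel:
    x: AxisErrors = field(default_factory=AxisErrors)
    y: AxisErrors = field(default_factory=AxisErrors)
    z: AxisErrors = field(default_factory=AxisErrors)
    xy_square: float = 0.0       # out-of-squareness scalars (radians)
    xz_square: float = 0.0
    yz_square: float = 0.0

    def point_error(self, px: float, py: float, pz: float):
        """Representative first-order error of a point at nominal machine
        coordinates (px, py, pz); angular terms omitted for brevity."""
        ex = (self.x.linear(px) + self.y.straight_1(py) + self.z.straight_1(pz)
              - self.xy_square * py - self.xz_square * pz)
        ey = (self.x.straight_1(px) + self.y.linear(py) + self.z.straight_2(pz)
              - self.yz_square * pz)
        ez = self.x.straight_2(px) + self.y.straight_2(py) + self.z.linear(pz)
        return ex, ey, ez
```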

Figure 2. General categories of CMM measurement influence quantities. [Figure: six categories surrounding the measurement — CMM errors, environmental factors, probe system errors, fitting algorithms, feature form errors and sampling patterns.]


While all these parameters can be determined, the calibration process is time consuming, on the order of a week or more, and requires calibrated artifact(s). Often, particularly in production metrology facilities, much less is known. For example, the commonly used B89 test suite characterizes CMM performance by six scalar parameters: a repeatability parameter, three linear displacement accuracy values, volumetric performance and offset volumetric performance. The only calibrated artifact required is a length standard. Obviously, the B89 results provide far too little information to characterize fully the rigid body errors of the CMM, but the test can be performed in about one to one and a half days.

There is a continuum of tradeoff choices available, where greater effort and expense will produce higher quality data and smaller estimates for the uncertainties. It is reasonable to expect results based on a full parametric characterization to lie toward the higher cost, higher quality end of this range, and an uncertainty evaluation based on a simpler CMM performance test to produce higher uncertainty values at lower cost.

It should be noted that performance test suites other than the B89 tests are available and are sometimes cited by CMM users and vendors; the ISO 10360-2 tests, for example, are commonly cited by CMM vendors. Although the system described here will produce uncertainty evaluations based on a variety of such tests, including all of the above-mentioned options, the examples and discussion in this paper focus on the B89 tests. While either of these test suites offers substantial time savings compared to a full parametric characterization, it was observed that the B89 tests generally result in uncertainty estimates that correspond more closely to those derived from a full knowledge of the rigid body errors. Generally, the observed order of the uncertainty estimates is full parametric specification < B89 < ISO 10360. The reasons observed for this are:

1. The B89 tests provide six parameters to characterize CMM performance, while the ISO 10360 tests, depending on how they are performed, provide one, two or three parameters. Thus the CMM model is significantly more completely determined by the B89 tests.

2. The B89 volumetric performance test explicitly specifies the locations and orientations of the test artifact in the CMM working volume, while the ISO 10360 protocol leaves this matter to the user's judgment.

Table 2. Potential CMM measurement influence variables.

| Influence Category | Influence Factor                         | Typical Source(s)                                       |
|--------------------|------------------------------------------|---------------------------------------------------------|
| CMM Geometry       | Rigid Body Errors                        | CMM Design/Construction, Maintenance                    |
|                    | Quasi-static Errors                      | Workpiece Loading                                       |
|                    | Dynamic Errors                           | CMM Design, Operating Parameters                        |
|                    | Scale Resolution                         | CMM Design                                              |
| Sensor System      | Probe Type                               | Availability, Operator/Programmer Judgment              |
|                    | Stylus Configuration                     | Operator/Programmer Judgment                            |
|                    | Calibration Strategy                     | Control Software, Operator/Programmer Judgment          |
|                    | Stylus Bending                           | Probe Selection                                         |
|                    | Approach Velocity                        | Control Hardware                                        |
|                    | Probe Repeatability                      | Probe Selection, Adjustment                             |
|                    | Lobing                                   | Probe Design, Selection                                 |
|                    | Indexable Head Repeatability             | Design, Maintenance                                     |
|                    | Scanning Force and Speed                 | Control Hardware, Operator/Programmer Judgment          |
|                    | Filtering                                | Hardware, Software Design                               |
| Environment        | Thermal Effects                          | CMM, Workpiece                                          |
|                    | External Vibration                       | Facility Design                                         |
|                    | Humidity                                 | Facility Design, Weather                                |
|                    | Atmospheric Pressure                     | Facility Design, Weather                                |
|                    | Power, Other Utility Variations          | Facility Design                                         |
|                    | Lighting, Ventilation System             | Facility Design                                         |
| Workpiece Factors  | Systematic Form Error                    | Manufacturing Method                                    |
|                    | Distortion by Fixturing                  | Operator Practice                                       |
| Sampling Strategy  | Numbers and Locations of Sampling Points | Operator/Programmer Judgment                            |
| Data Analysis      | Fitting Algorithm Choice                 | Availability in Software, Operator/Programmer Judgment  |


Whatever the basis for the CMM performance evaluation, the result is an estimate of the uncertainty of a single point measurement, which is an insufficient metric for demonstrating traceability or assessing conformance to a specification. Ordinarily the value will vary throughout the CMM working volume. At each point in the working volume, there will be a distribution of measured values. Each time a point is measured, some value will be drawn from the distribution of values for that region of the workspace. This value must be combined with the other error sources to give the overall point uncertainty, and those point uncertainties must be propagated through the measurement process to give a complete uncertainty statement.

6. Description of the Method
At the heart of our implementation is a method developed at the National Institute of Standards and Technology (NIST) called Simulation by Constraints (SBC). [3] The choice of method was based largely on the desire to give the user flexibility in cost/benefit tradeoffs, as mentioned earlier. SBC provides this by allowing simulations to be set up and executed in the face of incomplete information. This can be seen by reference to Fig. 3, which, for the purpose of illustration, focuses on just one aspect of CMM uncertainty evaluation: the effect of rigid body mechanical errors. The method begins with the recognition that the information available to define the uncertainty source may be incomplete; in this case, a CMM performance test that does not completely define the CMM geometry. For example, there usually will be many sets of 21 rigid body parameters that would result in the same discovered set of six B89 parameters, the Bounding Measurement Set (BMS). The SBC method begins with the generation of an adequate number (typically hundreds) of rigid body parameter sets that result in B89 numbers near the BMS values. Each of these sets of 21 parameters (3 scalars and 18 functions) can be thought of as a virtual CMM. For each virtual CMM, the error of each individual point measured on the workpiece is computed. These points (with their errors) are submitted to the CMM data processing algorithms to obtain the corresponding substitute geometries for all the measurement features of concern. The substitute geometries are used to evaluate all the GD&T parameters of interest, and the bias and range of the distribution of the results for each parameter provide its measurement uncertainty. The extension of the SBC concept to other error sources is straightforward.
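The following runnable toy mirrors the SBC flow of Fig. 3 on a drastically reduced model. Three axis scale errors stand in for the 21 rigid body parameters, a single length error observed on a calibrated diagonal bar stands in for the six B89 numbers, and naive rejection sampling stands in for the actual generation algorithm; all numerical values are hypothetical.

```python
# Toy Simulation by Constraints: many machine states are consistent
# with one observed performance-test result (a one-to-many mapping);
# the task-specific uncertainty is read from the distribution of task
# errors over the accepted "virtual CMMs."
import math
import random

BAR_LENGTH = 300.0            # mm, calibrated bar on the body diagonal
OBSERVED_BAR_ERROR = 0.0045   # mm, the performance-test datum ("BMS")
TOLERANCE = 0.0005            # mm, acceptance band around the observation
TASK_LENGTH = 200.0           # mm, task: a length measured along x only

def random_cmm_state():
    """One candidate machine state: per-axis scale errors (mm/mm)."""
    return [random.uniform(-20e-6, 20e-6) for _ in range(3)]

def bar_error(scales):
    """Length error this state would show on the diagonal bar;
    each axis sees BAR_LENGTH/sqrt(3) of travel."""
    leg = BAR_LENGTH / math.sqrt(3.0)
    dx, dy, dz = (leg * (1.0 + s) for s in scales)
    return math.sqrt(dx * dx + dy * dy + dz * dz) - BAR_LENGTH

def task_error(scales):
    """Error of the task measurement, which exercises only the x axis."""
    return TASK_LENGTH * scales[0]

accepted = []
while len(accepted) < 300:    # "typically hundreds" of virtual CMMs
    state = random_cmm_state()
    if abs(bar_error(state) - OBSERVED_BAR_ERROR) < TOLERANCE:
        accepted.append(task_error(state))

mean = sum(accepted) / len(accepted)
u = math.sqrt(sum((e - mean) ** 2 for e in accepted) / (len(accepted) - 1))
print(f"task bias ~ {mean * 1000:+.1f} um, task std dev ~ {u * 1000:.1f} um")
```

In the full system the accepted states are complete virtual CMMs, the task is an arbitrary GD&T parameter evaluated through the normal fitting algorithms, and the bias and spread of each parameter's distribution express its measurement uncertainty.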

7. System Architecture
The software architecture that was developed for task-specific uncertainty evaluation is shown in Fig. 4. One of the objectives of this architecture is to leverage, as much as possible, commercial off-the-shelf software capabilities. An implementation of this architecture (PUNDIT/CMM™) has been created. Central to the implementation is the definition of the workpiece, represented in the context of a 3D geometric modeling kernel, which supplies geometric modeling services. The current implementation is based on the ACIS® solid modeler,¹ although another geometry kernel could be substituted. Surrounding this kernel is the dimensioning and tolerancing layer, which provides the essential services of identification, assignment and checking of tolerance features, datum reference frames and tolerances. This functionality is provided by FBTol®.² The services provided by the dimensioning and tolerancing layer fulfill many of the metrology-related functions currently neglected or inadequately provided by all known computer-aided design (CAD) geometry kernels; specifically, the abilities to associate the raw geometry with the workpiece features to be toleranced and measured, to unambiguously assign tolerances to those features, and to create and associate datum reference frames.

Figure 3. Principles of uncertainty evaluation by Simulation by Constraints, using CMM kinematic errors as an illustration. [Figure: measurement data forming the Bounding Measurement Set (e.g., B89.4.1 data) map one-to-many, via data fitting to the kinematic equations, onto a population of possible virtual CMM states; each state maps one-to-one onto point coordinate errors via the kinematic equations, and then onto substitute geometry errors via the fitting algorithm for substitute geometry; the population of possible substitute geometry errors is expressed as measurand uncertainty.]

¹ ACIS is a registered trademark of Spatial Technologies, Inc.
² FBTol is a registered trademark of Honeywell Federal Manufacturing & Technologies.


Finally, the outermost layer, the interface and special services layer, provides CMM metrology-specific functions, such as the simulation engine, the geometric fitting and tolerance evaluation algorithms, the models of the various components of the CMM metrology process, and the user interface which allows parameterization of the measurement process.

The operation of this architecture in a typical simulation run might be as follows. As mentioned above, the outer software (interface and services) layer provides most user interface functions; for conciseness, this involvement is not always specifically identified. A workpiece model is imported by the interface layer and passed to the geometry kernel. The dimensioning and tolerancing layer operates on the model, generally using a combination of tolerancing "wizards" and user interaction, to identify tolerance features and to supply datum information and tolerances. This is an essential first step, after which most of the remaining operations may be performed in any convenient sequence. The interface and services layer creates the CMM and probing system models, and the environment model. The dimensioning and tolerancing layer provides services for creating the measurement plan, i.e., how the features are to be probed, fitting algorithm selections, etc. The interface and services layer is used to supply the manufacturing information, i.e., the shape and amplitude of the form errors for each feature. Finally, the interface and services layer provides the simulation engine, drives the simulation run, and presents the results.
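For orientation, the layering just described can be summarized as interfaces. The Python sketch below is a hypothetical rendering of Fig. 4; none of the class or method names come from the actual PUNDIT/CMM code.

```python
# Hypothetical interface sketch of the three architectural layers.
from abc import ABC, abstractmethod

class GeometryKernel(ABC):
    """Innermost layer: 3D geometric modeling services (ACIS in the
    described implementation, but substitutable)."""
    @abstractmethod
    def import_model(self, cad_file: str): ...

class DimensioningAndTolerancing(ABC):
    """Middle layer (FBTol in the described implementation): tolerance
    features, datum reference frames, tolerance assignment and checking,
    and measurement-plan services."""
    @abstractmethod
    def identify_tolerance_features(self, model): ...
    @abstractmethod
    def assign_tolerances(self, features, tolerances): ...

class InterfaceAndServices(ABC):
    """Outermost layer: CMM, probe and environment models, fitting and
    tolerance evaluation algorithms, the simulation engine and the UI."""
    @abstractmethod
    def build_measurement_scenario(self, model): ...
    @abstractmethod
    def run_simulation(self, scenario, n_runs: int): ...
```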

8. User Interface Design
The user interface is crucial to the usability of the uncertainty evaluation system, in that a careful and logical presentation of the measurement process greatly facilitates straightforward and natural-seeming development of measurement scenarios. At the highest level, the user interface is organized into seven "tabbed" activities, each representing a distinct aspect of the measurement process. These tabs are presented at the bottom of the display screen (see Fig. 5) and, with the exception of the Workpiece tab, may be visited in any desired order.

Examples of each of the tabbed pages are presented in Fig. 6.

From left to right and top to bottom, these are:

a. The Workpiece Tab. Here, the as-designed workpiece geometry and the associated tolerance requirements are defined. Generally, the workpiece model will be created in, and imported from, an independent CAD system, although a rudimentary facility suitable for modeling workpieces of simple geometry is provided. A graphical view of the model occupies the right-hand pane of the tab. Once the model has been imported or created, tolerancing is applied. This may be done automatically using built-in software wizards, interactively by the operator, or with a combination of these methods. If a legacy inspection program is available in the Dimensional Measuring Interface Standard (DMIS) [19] format, the feature, datum and tolerance information can be extracted and automatically imported. In any case, the tolerance information, including tolerance features, datum reference frames, material condition modifiers and applied tolerances, is displayed in lists in the left-hand pane. Also in this tab, the tolerance scheme applied to the workpiece can be verified. A suite of tools is available that can automatically determine whether any aspect of the part design is over- or under-constrained and whether tolerance definition is complete. This capability, although not essential for uncertainty evaluation, has value from the very beginning of the workpiece design and tends to encourage concurrent development of the design and the measurement.

b. The Manufacturing Information Tab. The previous tab was concerned with the part as designed; this one deals with the part as manufactured. Specifically, it accounts for the fact that no manufacturing process creates features of ideal shape. Many of the shape errors are systematic and characteristic of the manufacturing methods and parameters. [20] It is well established that the interaction of shape (form) errors and the sampling pattern used in the CMM measurement can be a significant source of uncertainty. [21] PUNDIT/CMM has available several ways of applying form errors to each feature; one of them is shown in Fig. 6. The available methods are:

1. User Query, where someone sufficiently knowledgeable about the manufacturing process can apply combinations of specific functional shapes, e.g., lobing, taper, bellmouth, twist, etc., and random error.

2. Dense Data, where, if one or more samples of the actual production have been carefully and completely measured, the discovered form errors can be applied in the uncertainty evaluation.

3. Manufacturing Process, where a library of form errors can be assembled and reused as needed.

It is also worth noting the pair of radio buttons near the top center of the screen. These allow perfect workpiece form to be temporarily applied, and are useful in doing "what if?" types of analysis, in this case determining how much form error contributes to the total uncertainty. Similar capabilities will be seen for other influence quantities.

Figure 4. Uncertainty evaluation system architecture. [Figure: a 3D geometric modeler at the core, surrounded by a layer of features, tolerancing and datum reference frames, and an outer layer containing the simulation engine, fitting algorithms, CMM model, probe model, manufacturing errors, sampling patterns, environmental factors and results.]


c. The CMM Tab. Here, the CMM model is defined. The CMM design (e.g., bridge, cantilever, etc.) and axis stacking are designated, along with the working volume and home location. A CMM performance model is chosen. Currently available are a perfect machine, the B89 model, a full parametric model, the ISO 10360 model and an extended version of the latter, currently under consideration for ISO adoption. A database of CMM performance parameters is provided and is initially populated with selected manufacturer-published information. The database can be updated to include new machines and user-determined performance parameters for specific CMMs. There is also a dialog for entering measuring velocities, accelerations, etc. These are required when the transient thermal model is invoked, as described in a following section.

d. The Probe Tab. PUNDIT/CMM currently accommodates contact probes. There are models for switching and piezo probes, as well as a perfect probe option. Allowed probe configurations are fixed single tip, fixed multitip, and articulated single tip. The probe performance test is also chosen here, the options being the ASME B89.4.1, ISO 10360, and VDI/VDE [22] tests, as well as an extended version of the ISO tests.

e. The Environment Tab. Thermal effects are almost always the predominant environmental source of error in CMM measurements. Two basic thermal models are supported: a static model, where temperature is constant throughout the measurement, and one which allows the workpiece temperature to change in the course of the measurement. Within each of these models, a selection is available that allows for several levels of temperature compensation: none, compensation for CMM temperature only, compensation for CMM and workpiece temperature where both are assumed to be at the same temperature, and full compensation where part and CMM may be at different temperatures. In recognition of the fact that some highly accurate CMMs employ laser interferometry to determine CMM position, environmental effects on interferometric scales can be modeled as well.

Figure 5. Highest level user interface.

Figure 6. User interface illustrations: (a) Workpiece tab; (b) Manufacturing Information tab; (c) CMM tab; (d) Probe tab; (e) Environment tab; (f) Measurement Plan tab; (g) Results tab.


f. The Measurement Plan Tab. In this tab, the numbers and distributions of measurement points, the order of measurement, the position of the workpiece in the CMM measurement volume, the fitting algorithm and the probe selection are specified. Point placement on a feature can be specified in a variety of ways: points can be placed manually, a variety of regular patterns can be applied by automated routines, and optimized patterns [21] designed to give the best results in the face of feature form errors can be imported, as can point patterns from existing DMIS measurement programs. Edge offsets can be specified for the automated pattern routines, and points falling into voids on the model are automatically rejected. If needed, DMIS code for measurement of each feature can be produced here.

g. The Results Tab. Here, the uncertainty evaluation analysis is conducted and the results displayed. Along the left side of the window is a tree that lists the tolerances that have been applied and the features to which they belong. To the right is a pane which displays a histogram of the errors for whatever feature/tolerance combination is chosen on the left. Along the bottom are controls for selecting the number of simulations to be run and for displaying the progress of the run. For a part of moderate complexity, a series of a few hundred simulations can be run in 60 seconds or so. All of the feature/tolerance pairs are analyzed in each simulation run, so all the results are immediately viewable by selecting the appropriate combination from the tree. A text output of the analysis and "screen grabs" of the histograms are also available.

9. Applications and Case Studies
Presented here are some simple but useful application case studies produced with the uncertainty evaluation system.

9.1 Validation of the Method
Simulation software for evaluating CMM task-specific measurement uncertainty is complex, and comprehensive testing for validation is similarly so. Validation methodologies have been the subject of recent research [23] and have begun to be codified. [24] Considered here are some examples of these methods as applied to PUNDIT/CMM.

9.1.1 Physical Measurements
The effectiveness of uncertainty evaluation software can be gauged by showing that, for calibrated parts, the observed measurement errors are reasonably bounded by the computed uncertainties. A study of this sort, using PUNDIT/CMM, was conducted by metrologists at the National Institute of Standards and Technology and at the Oak Ridge Metrology Center, using various artifacts and CMMs. [23] A typical example from that study is presented here. A calibrated 300-mm diameter disk was positioned in a variety of locations and orientations in the measuring volume of a high accuracy CMM and, using a fixed sampling pattern, measured for diameter and circularity, thus providing a set of known measurement errors. The B89.4.1 performance test suite and ISO probe tests were also conducted on this same CMM and used to define the CMM performance to PUNDIT/CMM. Some of the results are presented in Fig. 7. The errors at all positions and locations are bounded by the uncertainties predicted by the software. It is not surprising that the uncertainties provide somewhat loose bounds, considering that the CMM performance was specified by only the six B89 parameters. Better information on the CMM, e.g., a full parametric specification of the rigid body errors, would be expected to yield a tighter bounding, as can be illustrated using virtual calibrated artifacts. An example is presented in Figs. 8 and 9. Figure 8 shows the part and the set of five representative GD&T parameters used for comparison. Figure 9 compares the full parametric specification (FPS) results to the corresponding values deduced from only the B89 performance test data. The calculations here were based on deactivating all error sources other than the CMM itself. The simulation results based on B89 performance testing bound those based on the full parametric specification by the same order as they bound the actual measurement results in Fig. 7.


Figure 7. Comparison of simulated uncertainty and measurement error. [Figure: two panels, Diameter and Circularity, plotting uncertainty or error (µm) against artifact position (arbitrary, 1 through 5); the measured errors lie within the simulated uncertainties at every position.]


9.1.2 Reference Values and Component Testing

A reference value is a special-case result that has a known value, which can be tested for in simulation. PUNDIT/CMM incorporates two features that allow component testing for reference values within the integrated software environment. The first is the fact that a model of the geometrically perfect workpiece of interest is available. This means that, in effect, the software can be tested using a nearly infinite set of virtual calibrated parts. The second is the ability to temporarily deactivate entire classes of influence quantity. This allows a focus on particular contributors to uncertainty, where it is relatively easy to develop test cases and to verify that computed results are compatible with expectations. Described here are just two examples, one nearly trivial and the other somewhat more complicated, both involving the use of reference values.

In the first instance, a 500-mm step gage part was measured by simulation under conditions where all error sources other than thermal ones were turned off. The step gage was stipulated to have a coefficient of thermal expansion of precisely 10 ppm/ºC and a temperature of exactly 21 ºC. The simulations showed systematic length errors of 1 µm at 100 mm, 2 µm at 200 mm, etc., just as anticipated.
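This reference value is simply the linear thermal expansion relation evaluated at the stipulated conditions, taking the standard 20 ºC reference temperature of dimensional metrology:

$$\Delta L = L\,\alpha\,\Delta T = (100\ \text{mm}) \times (10 \times 10^{-6}\ {}^{\circ}\text{C}^{-1}) \times (1\ {}^{\circ}\text{C}) = 1\ \mu\text{m},$$

and likewise 2 µm at 200 mm.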

As a second example of a focus on specific error sources, consider a 10 mm inside diameter cylinder sampled at three equiangular points at each of three levels along its length (the points at each level eclipse those of the other levels when the cylinder is viewed end-on). The cylinder surface was specified to have a 3-lobe (sinusoidal) form error (i.e., cylindricity) of peak-to-peak amplitude 10 µm. The CMM, probe and thermal conditions are set to "perfect." PUNDIT/CMM then predicts a cylindricity bias (mean error in measurement) of the (negative) full amplitude (-10 µm) and no variability around this. To see why this is so, note that in simulation, the phasing of the form error relative to the sampling pattern is randomized on each cycle.

Figure 8. Workpiece and tolerances used in case study discussed in Section 9.1.1.

Figure 9. Relative estimated uncertainties for the FPS and B89 CMM models. [Figure: bar chart of uncertainty (µm, 0 to 5) for the five toleranced parameters of Fig. 8 (concentricity, profile, angularity, position and diameter; callouts include ∅ 0.06 D, 0.08 A D B, ∠ 0.04 A, ∅ 25.00 ± 0.02 and ∅ 0.06 A D B), comparing the full parametric specification and B89-based models.]

Figure 10. Distribution of diameter errors for a cylinder with three lobes, sampled at three equiangular positions at each of three levels. [Figure: histogram of the probability of diameter error for the 10.000 mm hole, spanning roughly ±8 µm; mean error = 0.000474 mm, std dev = 0.006763 mm; the distribution is markedly non-Gaussian.]


But because the form error here is entirely systematic, and the number of sample points at each level is equal to the number of lobes, the cylindricity obtained in the simulated measurements is always zero regardless of the phasing. Thus on each cycle, the bias [= (measured cylindricity) - (true cylindricity)] is the full -10 µm. Whatever the phasing of the lobing relative to the points, the measured cylinder center location should remain fixed, and PUNDIT/CMM predicts zero bias and zero variability in measured position. Finally, the measured cylinder radius can be expected to range above and below its nominal value by half of the cylindricity, depending upon the phasing of the lobing relative to the sampling; the measured diameter thus ranges above and below its nominal value by the full cylindricity (10 µm). Furthermore, the distribution of the diameters can be expected to be quite non-Gaussian, since the sinusoidal lobing form error yields a notably higher sampling rate at its extrema than at intermediate values. This is just what PUNDIT/CMM shows in its error histogram (Fig. 10). Several of the examples that follow may be considered as instances of validation by component testing, as well as demonstrations of the importance of specific error sources.
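The reasoning above is easy to check outside any particular package. The toy Python sketch below (an illustration, not PUNDIT/CMM, with a simple radius average standing in for a least-squares cylinder fit) reproduces the constant -10 µm cylindricity bias with zero variability and the non-Gaussian diameter spread.

```python
# Toy reproduction of the 3-lobe / 3-point component test. Form error:
# 3-lobe sinusoid, 10 um peak-to-peak (radial amplitude 5 um), with
# the phase randomized on each simulated cycle.
import math
import random

LOBES = 3
AMP = 0.005                  # mm, radial amplitude (0.010 mm peak-to-peak)
NOMINAL_R = 5.0              # mm (10 mm diameter hole)
ANGLES = [2.0 * math.pi * k / 3.0 for k in range(3)]   # 3 equiangular points

cyl_biases, diam_errors = [], []
for _ in range(10000):
    phase = random.uniform(0.0, 2.0 * math.pi)
    radii = [NOMINAL_R + AMP * math.sin(LOBES * a + phase) for a in ANGLES]
    # sin(3(a + 2*pi/3) + phase) = sin(3a + phase): all sampled radii
    # are equal, so the sampled form error is always exactly zero...
    measured_cyl = max(radii) - min(radii)              # always 0.0
    cyl_biases.append(measured_cyl - 2.0 * AMP)         # always -10 um
    # ...while the fitted radius rides up and down with the phasing,
    # giving an arcsine-shaped (extreme-heavy) diameter distribution.
    diam_errors.append(2.0 * (sum(radii) / 3.0 - NOMINAL_R))

print(f"cylindricity bias: always {cyl_biases[0] * 1000:.1f} um")
mean = sum(diam_errors) / len(diam_errors)
sd = math.sqrt(sum((e - mean) ** 2 for e in diam_errors) / (len(diam_errors) - 1))
print(f"diameter error: mean {mean * 1000:+.2f} um, std dev {sd * 1000:.2f} um")
```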

9.2 Relative Importance of Error Sources
This example illustrates the relative contributions of several error sources to overall uncertainty. The effects of CMM geometry errors, probe errors and thermal nonidealities were considered. The CMM linear accuracies were 3.0 µm, 2.1 µm and 2.5 µm for the x, y and z axes, respectively. The volumetric performance was 7.2 µm. Offset volumetric performance was 7.1 µm/m and the repeatability was 1 µm. The piezoelectric probe was modeled with a random error (σ) of 5 µm. The CMM scales were considered to be temperature insensitive; the workpiece was aluminum, temperature compensated, with an expansion coefficient of 22 ± 2 ppm/ºC and a temperature of 25 ± 3 ºC. There was no form error added either to the measured cylindrical feature or to the datum features, and the datum features were assumed to be perfectly measured. These last points are important since, generally, errors in the datum features will propagate into the uncertainty of the position and orientation of the feature under consideration.

The computed uncertainty is shown separately in Fig. 11 for size, location and form, and for every possible combination of error sources. The notation "101" signifies, for example, that CMM and thermal errors were considered in that particular experiment, but not probe error. In each case, 300 simulated inspections were performed. While the interrelationships of the error sources were not treated explicitly, their relative independence can be seen from the fact that the uncertainty values from treatment of two error sources at once are approximately equal to the root-sum-of-squares of the uncertainties from the error sources treated singly.
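In symbols, with the uncertainties subscripted by the factor combination codes of Fig. 11 (a notation adopted here for brevity), the independence check reads, for example,

$$u_{110} \approx \sqrt{u_{100}^{2} + u_{010}^{2}},$$

where $u_{100}$, $u_{010}$ and $u_{110}$ are the uncertainties computed with only the CMM errors, only the probe errors, and both together, respectively.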

9.3 Form Error/Sampling Pattern Interactions
This example illustrates the interaction between feature form errors and sampling strategy. Again, the measurement of a cylindrical hole has been simulated, but it is now assumed that the part is measured with a perfect CMM and probe and that there are no thermal effects. A three-lobe form error of 0.4 µm amplitude was assumed in one experiment, and a combination of three-lobe error and random error, each of 0.2 µm maximum amplitude, was used in another. The sampling patterns all used even numbers of points arranged at two levels near the ends of the cylinder and evenly spaced at each level. The "eclipsed" patterns (labeled "E" in Fig. 12) have the points at each level placed at the same angular positions, while in the "staggered" patterns (labeled "S") the point positions on one level are rotated by half the angular spacing relative to the other level. Notice the oscillatory behavior, with maxima when the number of points at each level is an integer multiple of the lobing frequency. The staggered pattern helps to damp the effect, but even then it persists to fairly large numbers of sampled points and can be seen even in the "mixed" cases where the random error is as large as the systematic error.

9.4 Effect of Errors in Datum Measurement
It is generally understood that errors in the measurement of datum surfaces will be reflected in measurements made relative to those datums. CMM users may appreciate less well the magnitude of the effect. This example provides a simple illustration. The workpiece is a rectangular block; the feature of interest is a cylindrical hole centrally located in one face of the block.

Figure 11. Contributions of several influence variables to overall uncertainty. [Figure: bar chart of uncertainty for size, position and cylindricity under each factor combination (CMM-Probe-Thermal), 000 through 111; vertical scale 0 to 0.035.]


The hole is toleranced positionally relative to three planar datum surfaces, each of which is considered to have been manufactured with 100 µm of random form error. All else about the measurement is considered perfect. Figure 13 shows the positional uncertainty of the hole as a function of the sampling density of the datum surfaces. Suppose, for example, that the hole is assigned a positional tolerance of 0.5 mm. Remembering that in a real measurement there will be other sources of error, it might be considered wise to keep the error from this particular source to 10 % or less of the tolerance. This requires a minimum of about 250 probing points per datum surface.
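The budget arithmetic behind that recommendation is simply

$$u_{\text{datum}} \le 0.10 \times 0.5\ \text{mm} = 50\ \mu\text{m},$$

and Fig. 13 shows the positional uncertainty falling to about 50 µm at roughly 250 points per datum surface.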

10. Conclusions
The design of a comprehensive system for evaluating the task-specific uncertainty of measurements made with CMMs has been described, and examples of its application have been shown. With replacement of some of the CMM-specific models, it could be adapted to other 3D metrology systems, for example, articulated-arm CMMs and laser theodolites. The software has been shown to be robust and versatile. It can be of use both to auditors and to measurement professionals when demonstrated traceability of CMM results is required. Measurement practitioners will also find the system of value in identifying and reducing the sources of uncertainty in their measurements, resulting in economic benefit. The software can also be beneficial in a variety of subsidiary functions, including:

a. Verifying that tolerance applications are complete, consistent and unambiguous;
b. Making the right choice when purchasing a CMM;
c. Finding and fixing the "weak link" in a CMM system;
d. Choosing the best CMM for a specific job; and
e. Training users in proper CMM measuring procedures.

11. References
[1] E. Trapet, F. Wäldele, and U. Wiegand, "Coordinate Measuring Machines in the Calibration Chain," CNAM (Conservatoire National des Arts et Métiers), Paris, 1994.
[2] E. Trapet and F. Wäldele, "The virtual CMM concept," Advanced Mathematical Tools in Metrology II, P. Ciarlini, M.G. Cox, F. Pavese and D. Richter, eds., World Scientific, Singapore, vol. 40, pp. 238–247, 1995.
[3] S.D. Phillips, B.R. Borchardt, et al., "The Calculation of CMM Measurement Uncertainty via the Method of Simulation by Constraints," Proceedings of the 12th Annual Meeting of the American Society for Precision Engineering, Norfolk, VA, pp. 443–446, October 1997.
[4] A. Weckenmann and M. Knauer, "The influence of measurement strategy on the uncertainty of CMM-measurements," Annals of the CIRP (College International pour la Recherche en Productique), vol. 47, no. 1, pp. 451–454, 1998.
[5] A. Balsamo, M. Di Ciommo, R. Mugno, B.I. Rebaglia, E. Ricci, and R. Grella, "Evaluation of CMM Uncertainty through Monte Carlo simulations," Annals of the CIRP, vol. 48, no. 1, pp. 425–428, 1999.
[6] B. Van Dorp, H. Haitjema, et al., "Virtual CMM using Monte Carlo methods based on frequency content of the error signal," Proceedings of the SPIE, vol. 4401, pp. 158–167, 2001.
[7] "Geometrical product specifications (GPS) – Acceptance and reverification tests for coordinate measuring machines (CMM)," ISO 10360 Parts 1–5, International Organization for Standardization, Geneva, 2000.
[8] "Methods for Performance Evaluation of Coordinate Measuring Machines," ASME B89.4.1, American Society of Mechanical Engineers, New York, 1997.

Figure 12. Interaction of sampling strategy with feature form. [Figure: uncertainty (mm, 0 to 0.0008) vs. number of sample points (0 to 30) for the 3-Lobe and Mixed form-error cases with eclipsed (E) and staggered (S) patterns.]

Figure 13. Effect of datum form error on the probe point density required to achieve a specified positional uncertainty for a dependent feature. [Figure: positional uncertainty (mm, 0 to 0.3) vs. number of sample points per datum surface (0 to 1200).]


[9] "General Requirements for the Competence of Testing and Calibration Laboratories," ISO 17025, International Organization for Standardization, Geneva, 1999.
[10] "American National Standard for Calibration – Calibration Laboratories and Measuring and Test Equipment – General Requirements," ANSI/NCSL Z540-1, National Conference of Standards Laboratories, Boulder, CO, 1994.
[11] "Metrological Traceability of Dimensional Measurements to the SI Unit of Length," ASME B89.7.5, American Society of Mechanical Engineers, New York, 2006.
[12] "Geometrical Product Specifications (GPS) – Inspection by Measurement of Workpieces and Measuring Equipment – Part 1: Decision Rules for Proving Conformance or Non-Conformance with Specifications," ISO 14253-1, First Edition, Geneva, 1998.
[13] "Guidelines for Decision Rules: Considering Measurement Uncertainty in Determining Conformance to Specifications," ASME B89.7.3.1-2001, American Society of Mechanical Engineers, New York, 2002.
[14] "Techniques of Determining the Uncertainty of Measurement in Coordinate Metrology," ISO Technical Report 15530-1 (draft), July 16, 1998.
[15] "Guide to the Expression of Uncertainty in Measurement," ISO Guide 98, Geneva, 1995.
[16] "Guide to the Expression of Uncertainty in Measurement, Supplement 1: Propagation of distributions using a Monte Carlo method," ISO Guide 98 Supplement 1 (final draft), Geneva, 2006.
[17] S.D. Phillips, "Performance Evaluations," Coordinate Measuring Machines and Systems, J. Bosch, ed., Marcel Dekker, 1995.
[18] R. Edgeworth and R. Wilhelm, "Characterization of workpiece form uncertainty for uniform sampling patterns," North American Manufacturing Research Conference, New York, May 2005.
[19] "Dimensional Measuring Interface Standard," ANSI, Revision 5.0, CAM-I 105.0-2004, Bedford, TX, 2004.
[20] R.P. Henke, K.D. Summerhays, J.M. Baldwin, R.M. Cassou, and C.W. Brown, "Methods for evaluation of systematic geometric deviations in machined parts and their relation to process variables," Precision Engineering, vol. 23, no. 4, pp. 273–292, 1999.
[21] K.D. Summerhays, R.P. Henke, J.M. Baldwin, R.M. Cassou, and C.W. Brown, "Optimizing discrete point sample patterns and measurement data analysis on internal cylindrical surfaces with systematic form deviations," Precision Engineering, vol. 26, no. 1, pp. 105–121, 2002.
[22] "Accuracy of coordinate measuring machines: characteristic parameters and their checking," VDI/VDE 2617, Part 2.1, Düsseldorf, Germany, 1986.
[23] S.D. Phillips, B. Borchardt, A.J. Abackerli, et al., "The Validation of CMM Task Specific Measurement Uncertainty Software," Proceedings of the American Society for Precision Engineering Summer Topical Meeting on Coordinate Measuring Machines, Charlotte, NC, June 2003.
[24] "Geometrical Product Specifications (GPS) – Coordinate Measuring Machines (CMM) – Techniques for determining the uncertainty of measurement – Part 4: Estimating task-specific uncertainty using simulation," ISO 15530-4 (draft), Geneva, October 2004.