FRONTIERS OF QUALITY

Design for Six Sigma

You need more than standard Six Sigma approaches to optimize your product or service development

by Douglas P. Mader

Many organizations believe design for Six Sigma (DFSS) is a design process when really it is not. DFSS is an enhancement to an existing new product development (NPD) process that provides more structure and a better way to manage the deliverables, resources and trade-offs.

DFSS is the means by which we employ strategies, tactics and tools to enhance an existing design process to achieve entitlement performance. It integrates qualitative and quantitative tools and key performance measures that allow progressive organizations to manage an NPD process more effectively to optimize several competing key drivers, such as cost, quality and time to market.

A typical NPD process might include several high-level development phases, such as the seven-step systems engineering model: needs assessment, concept design, preliminary design, detail design, process design and construction, manufacturing and end of life. Most organizations have instituted management reviews called checkpoints or tollgates throughout these design phases to allow the organization to assess risks, monitor progress and ensure transitions from each phase to the succeeding phase are warranted.

    Focus on risk management in the early phases

Because quantitative measures of performance may not exist until the physical product is prototyped or the service is piloted in the later design phases, all that exists early in the NPD process is the risk of future problems. This is why progressive organizations focus on risk management in the early phases of an NPD process rather than on the application of traditional quantitative quality tools.

DFSS opportunities should be categorized by the size of the effort. Macro opportunities involve the design and development of an entirely new product or service or the major redesign of an existing one. Micro opportunities are smaller in scope and likely pertain to the execution of a subtask within a macro opportunity.

To address macro opportunities, most organizations develop their own NPD process based on the seven-step systems engineering model. Modern implementations of DFSS consist of integrating DFSS deliverables into the checkpoint criteria for the macro level NPD process, training the management team on how to resource and guide the effort, and using a structured approach to the application of DFSS principles at the micro level.

At the macro level, there is no consistent standard for integrating DFSS into an existing NPD process because NPD processes are as widely varied as the products and services they generate. My company, SigmaPro, works with its clients to map their NPD processes and develop a custom approach that uses best practices from the organization and incorporates DFSS deliverables into the checkpoint criteria.

Management training is addressed before technical training because many of the issues pertaining to performance of the NPD process are not technical in nature. Once DFSS criteria have been integrated into the NPD process, the management team has been fully trained and the appropriate design projects or applications have been selected, we launch technical training at the micro level. Our technical training is centered around a four-step methodology that provides a general approach to the application of DFSS at the project level and subtasks within a design project.

The ICOV approach

The DFSS approach we use at the micro level consists of four major steps, known as ICOV (see Figure 1):

1. Identify.
2. Characterize.
3. Optimize.
4. Validate.

Identify. This stage consists of two major steps. The first is to develop a clear understanding of customer requirements for the micro level design activities, in which the definition of the customer can include internal and external customers and other stakeholders. We also take into consideration the business's financial goals, such as development cost and schedule. The needs and wants are collectively known as CTXs, or critical to X characteristics, where X stands for whatever the particular need or want is. These CTXs are then translated into architecture requirements, specifications, performance criteria or other objective measures for the activity.

The next step is to identify what resources are needed to achieve the requirements identified in step one. Consider technology, manpower, supplier, process and business constraints. Then create an effective plan to achieve the CTXs. Common deliverables for this stage include a feasibility study, definition of the customer, needs analysis, financial or cost analysis, system operational requirements, functional requirements and advance product planning.

Characterize. In this stage we develop, test and validate design concepts. Once we have identified the CTXs and determined the required level of performance, the designer or team may have several competing solutions or concepts. Pugh concept selection methods and enhanced quality function deployment (EQFD) matrices are often coupled with failure mode and effects analyses (FMEA) and cause and effect matrices to help organize the information pertaining to concept selection. Several other common tools employed in this stage include architecture block diagrams, functional flow diagrams, specification trees, functional hierarchy diagrams and process mapping.
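The concept selection step lends itself to a simple tabulation. Here is a minimal sketch in Python of a Pugh-style selection matrix; the concepts, criteria and ratings are hypothetical, and a real study would derive the criteria from the CTXs and score each concept against a datum in a team session.

    # Pugh-style concept selection: each concept is rated against a
    # datum per criterion (+1 better, 0 same, -1 worse). All names and
    # ratings below are illustrative placeholders.
    DATUM = "current design"

    ratings = {
        "Concept A": {"cost": +1, "quality": 0, "time to market": -1},
        "Concept B": {"cost": 0, "quality": +1, "time to market": +1},
        "Concept C": {"cost": -1, "quality": +1, "time to market": 0},
    }

    for concept, scores in ratings.items():
        plus = sum(1 for s in scores.values() if s > 0)
        minus = sum(1 for s in scores.values() if s < 0)
        print(f"{concept} vs. {DATUM}: +{plus} / -{minus} "
              f"(net {sum(scores.values()):+d})")

In practice the matrix is iterated: weak concepts are dropped or hybridized, and the ratings are revisited as the team learns more.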




If we have several competing concepts, we need to determine the measurement systems that will allow us to test the performance of the system for important design attributes. For product design attributes, we use statistical measurement capability methodologies, such as gage repeatability and reproducibility, and risk analysis. A common mistake DFSS practitioners make, however, is to assume we are only referring to hardware tests in this important step. In fact, subjective testing methods such as choice modeling, focus groups and customer interviews are just as important as hardware testing.

For services, we need to find measurement systems, such as customer satisfaction, recommend rates, repurchase rates and perceptions of value, that allow us to measure performance subjectively. Tools such as surveys and process mapping coupled with FMEAs are extremely useful. Once we have determined the measurement systems that will give us the information we need regarding the performance of the design, we implement the concept testing process and evaluate the risk with regard to the CTXs. We can then make a decision to proceed or to revisit the design concepts.
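The gage repeatability and reproducibility analysis mentioned above can be sketched numerically. The Python fragment below is a simplified, illustrative gage R&R calculation: it assumes a crossed study in which every operator measures every part the same number of times, ignores the operator-by-part interaction and uses simulated data. A production study would follow a full ANOVA-based method.

    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical crossed study: 5 parts x 3 operators x 2 trials
    parts, ops, trials = 5, 3, 2
    part_effect = rng.normal(10.0, 0.50, size=(parts, 1, 1))  # part-to-part
    op_effect = rng.normal(0.0, 0.05, size=(1, ops, 1))       # reproducibility
    noise = rng.normal(0.0, 0.08, size=(parts, ops, trials))  # repeatability
    x = part_effect + op_effect + noise

    # Repeatability (EV): pooled within-cell variance over repeat trials
    ev_var = x.var(axis=2, ddof=1).mean()

    # Reproducibility (AV): variance of operator averages, less the
    # share of repeatability that leaks into those averages
    op_means = x.mean(axis=(0, 2))
    av_var = max(op_means.var(ddof=1) - ev_var / (parts * trials), 0.0)

    grr_var = ev_var + av_var
    # Part-to-part estimated from part averages (approximate; it still
    # carries a little measurement variation)
    total_var = grr_var + x.mean(axis=(1, 2)).var(ddof=1)
    print(f"%GRR of total variation: {100 * (grr_var / total_var) ** 0.5:.1f}%")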

Optimize. In this stage we make sure we have optimized the design with regard to customer and business requirements. For each CTX, the designer or team should identify the key product or process output variables (KPOVs) that relate to the desired performance. Once the KPOVs are identified, the next step is to find the key process or product input variables (KPIVs) that we can control in order to optimize the KPOVs.



FIGURE 1  Identify, Characterize, Optimize and Validate Strategy

Identify: Translate high level requirements (CTXs) into objective measures; identify resources, gaps and risks.

Characterize: Formulate concept design; validate measurement systems; assess concept designs; evaluate risk to CTXs for all concepts.

Optimize: For each CTX, identify key product/process output variables (KPOVs); develop transfer functions between KPOVs and key input variables (KPIVs); establish realistic performance criteria; assess risk to CTXs via KPOVs/KPIVs.

Validate: Test and validate solution/design; establish controls and action plans; assess overall risk to CTXs.


To quantify the relationship between KPOVs and their associated KPIVs, we develop transfer function models. These models provide information on how we can change the KPIVs to optimize the performance of the KPOVs and, therefore, the design.

For the construction of transfer function models, we have two alternatives. If we have a physics based equation, we can use calculus based methods to predict the mean, variance and capability for a KPOV based on the means, variances and capabilities of the KPIVs. Unfortunately, this method is extremely difficult even for simple problems and is rarely used.
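To make the calculus based route concrete, here is a minimal first-order (Taylor series) propagation sketch in Python. The transfer function, Ohm's law V = I * R, and the KPIV moments are illustrative choices, and the KPIVs are assumed independent.

    import math

    # Hypothetical KPIV moments
    mu_i, sigma_i = 2.0, 0.05    # current (A)
    mu_r, sigma_r = 10.0, 0.20   # resistance (ohms)

    # Partial derivatives of V = I * R evaluated at the means
    dv_di = mu_r
    dv_dr = mu_i

    # First-order approximations for the KPOV mean and variance
    mu_v = mu_i * mu_r
    var_v = (dv_di * sigma_i) ** 2 + (dv_dr * sigma_r) ** 2

    print(f"Predicted KPOV: mean = {mu_v:.2f} V, "
          f"sigma = {math.sqrt(var_v):.3f} V")

Even for this two-variable product the bookkeeping is noticeable, which suggests why the analytic method is rarely used on realistic designs.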

The second method is to build empirical prediction equations using regression or other mathematical methods based on simulation or experimental results. Empirical prediction models allow us to predict the mean, variance and capability of a KPOV based on the means, variances and capabilities of the KPIVs.
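As a sketch of the empirical route, the Python fragment below fits a quadratic transfer function to experimental results by least squares. The data are simulated for illustration; in practice the KPIV settings and responses would come from a designed experiment.

    import numpy as np

    rng = np.random.default_rng(1)
    kpiv = np.linspace(0.0, 10.0, 30)   # experimental KPIV settings
    kpov = (5.0 + 1.2 * kpiv - 0.08 * kpiv**2
            + rng.normal(0.0, 0.3, kpiv.size))  # observed responses

    # Fit y = b0 + b1*x + b2*x^2 by ordinary least squares
    X = np.column_stack([np.ones_like(kpiv), kpiv, kpiv**2])
    b, *_ = np.linalg.lstsq(X, kpov, rcond=None)
    print(f"transfer function: y = {b[0]:.2f} + {b[1]:.2f}x + {b[2]:.3f}x^2")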

Monte Carlo simulation is typically used for hardware scenarios, and discrete event simulation is used for service scenarios. Engineers often have simulation engines that are adept at modeling physical systems, and these tools have been standard practice for some time. Only recently, with the development of integrated process mapping and simulation tools, have similar methods become available to service designers.
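As an illustration of the Monte Carlo option, the following Python sketch pushes two hypothetical KPIV distributions through a transfer function and summarizes the predicted KPOV, including a capability estimate against assumed specification limits.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 100_000
    x1 = rng.normal(2.0, 0.05, n)    # KPIV 1 (illustrative)
    x2 = rng.normal(10.0, 0.20, n)   # KPIV 2 (illustrative)

    kpov = x1 * x2                   # transfer function (assumed known)

    mean, sigma = kpov.mean(), kpov.std(ddof=1)
    usl, lsl = 22.0, 18.0            # assumed spec limits
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)
    print(f"KPOV mean = {mean:.2f}, sigma = {sigma:.3f}, Cpk = {cpk:.2f}")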

After we develop transfer functions and optimize the system, we need to establish realistic performance criteria for the product or process. In other words, we establish the criteria that will be used to assess whether the process or product fulfills the customer's requirements. Then we estimate our risks to the desired performance of the CTXs using knowledge of the KPIVs, transfer functions and the predicted performance of the KPOVs.
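One common way to express that risk is as a predicted fallout rate against the performance criteria. The snippet below assumes the predicted KPOV is approximately normal; the moments and specification limits are illustrative.

    import math

    mean, sigma = 20.1, 0.55   # predicted KPOV performance (illustrative)
    usl, lsl = 22.0, 18.0      # performance criteria (illustrative)

    def norm_cdf(z: float) -> float:
        # Standard normal CDF via the error function
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    p_out = (norm_cdf((lsl - mean) / sigma)
             + 1.0 - norm_cdf((usl - mean) / sigma))
    print(f"Predicted fallout: {1e6 * p_out:.0f} PPM")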

A key question to ask is how well the system fulfills the customer wants and needs. If we can demonstrate the optimized design fulfills customer and business requirements, we will also want to assess whether the design possesses adequate quality levels at an appropriate cost. Common tools employed in this stage include EQFD, FMEA, cause and effect matrices, statistical decision making, statistical tolerancing, risk analysis, designed experiments, simulation and optimization, and probabilistic design.

Validate. In this stage we test and validate the optimized design. We confirm the performance, capability and reliability of the product, process or service. The key is to make sure we have good test systems to validate the design. Some of the tests may be subjective in nature. Upon validation of the system through objective and subjective testing, the designer or team should establish control and action plans. This involves the extension of risk management from the conceptual design into the production or operations environment.

Statistical process control, error proofing, reliability, maintenance validations, sampling, test plans, test coverage, process FMEAs and measurement capability are all tools needed to complete this stage.
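As a small example of the statistical process control piece, the Python sketch below computes individuals and moving range (I-mR) control limits for a KPOV monitored in production. The measurements are made up, and 2.66 and 3.267 are the standard I-mR chart factors for moving ranges of size two.

    import numpy as np

    # Illustrative production measurements of a KPOV
    x = np.array([20.1, 19.8, 20.3, 20.0, 19.9,
                  20.4, 20.2, 19.7, 20.0, 20.1])
    mr = np.abs(np.diff(x))          # moving ranges of adjacent points

    x_bar, mr_bar = x.mean(), mr.mean()
    ucl_x = x_bar + 2.66 * mr_bar    # 2.66 = 3 / d2, with d2 = 1.128 for n = 2
    lcl_x = x_bar - 2.66 * mr_bar
    ucl_mr = 3.267 * mr_bar          # D4 for n = 2

    print(f"I chart:  CL = {x_bar:.2f}, UCL = {ucl_x:.2f}, LCL = {lcl_x:.2f}")
    print(f"mR chart: CL = {mr_bar:.2f}, UCL = {ucl_mr:.2f}")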

The design team should then make an overall assessment of the risks and potential impact on the CTXs.

Old tools, new strategy

The majority of the tools used in the ICOV methodology have been employed for a number of years. Thus, DFSS is a strategy for the deployment of tools that are not necessarily new. The power of DFSS is in the organization of the tools into a coherent strategy that aligns with the NPD process, not in the individual tools themselves. This structured application of the tools allows for a much higher rate of success when compared to current approaches. One of the past difficulties in NPD improvement is that people have focused on the application of discrete tools and not on how those tools align with and support the objectives of the overall NPD process.

DFSS is not limited to the design and development of products and processes. The majority of the tools, concepts and methods in the DFSS approach can be applied to service industries and processes and to industrial products and processes.


DOUGLAS P. MADER is a speaker, seminar leader and certified master instructor for Six Sigma and design for Six Sigma (DFSS). He is the founder and president of SigmaPro Inc., a consulting firm in Fort Collins, CO, that specializes in integrated deployment of Six Sigma, DFSS and lean systems. He has held leadership positions in Six Sigma with the Six Sigma Academy, Seagate, Hewlett-Packard and Motorola. Mader earned a doctorate in mechanical engineering from Colorado State University and is a Senior Member of ASQ and the Institute of Industrial Engineers.


If you would like to comment on this article, please post your remarks on the Quality Progress Discussion Board on www.asqnet.org, or e-mail them to [email protected].
