IEE 581 Lectures
Home and Consumer Finance Group
Rev. 1.0
This document and all information and expression contained herein are the property of ASU Department of Industrial Engineering, and may not, in whole or in part, be used, duplicated, or disclosed for any purpose without prior written permission of ASU Department of Industrial Engineering. All rights reserved.
ASU Department of Industrial Engineering 2012
Define Phase
Paul Sandell ([email protected])
Geetha Rajavelu ([email protected])
Objectives
- What is Six Sigma?
- Pre-Define: project ideation and prioritization
- Identify the elements of the Define phase
- Discuss tollgate elements completed in Define
- Discuss some Define tools
What is Six Sigma?
[Six Sigma umbrella: DMAIC, DMADV, Lean]
Six Sigma is a problem-solving and process-improvement methodology that helps improve our products, processes, services, and people by reducing variation and eliminating defects and waste!
In completing process improvement projects, Six Sigma uses three approaches:
DMAIC (Define-Measure-Analyze-Improve-Control) When we have an existing process that is not meeting customer requirements
DMADV (Define-Measure-Analyze-Design-Verify) When we are designing a new process, or completely re-designing an existing process
Lean Principles to reduce waste and accelerate the velocity of a process
DMAIC Tollgate Checklist
(Status for each item: Not Started / In Progress / Complete)

Define - Start Date: _________________
Project Charter: Problem Statement; Goal Statement; In/Out of Scope; Team & Time Commitments; Timeline / Milestones; Estimated Financial Benefits
Risks, Constraints & Compliance Issues Identified
SIPOC
High Level Process Map

Measure - Start Date: _________________ Tollgate Review Date: _________________
Y & Defect Defined
Performance Spec for Y
Data Collection Plan
Measurement System Validated
Collect Data for Y
Process Capability for Y
Improvement Goal for Y
Detailed Process Map

Analyze - Start Date: _________________ Tollgate Review Date: _________________
Brainstorm All Possible Xs
Prioritize List of Xs
ID Gaps
Refine Benefits Estimate

Improve - Start Date: _________________
Develop and Select Solutions
Perform Pilot of Solutions
Confirm Improvements
Confirm Prioritized Xs
Map New Process
Develop Implementation Plan
ID Controls
Implement Improvements

Control - Start Date: _________________ Tollgate Review Date: _________________
Process Documentation / Controls in Place
Measure Performance
Confirm Sustainable Solution
Transfer Ownership
Evaluate Translation Opportunities
ID Other Improvement Opportunities
Project Documentation Complete
Financial Benefits Verified and Approved
Leveragability
Life Before Define
One of the most critical aspects of a successful Six Sigma deployment is to select the right project(s).
Effective project ideation, prioritization and selection leads to Six Sigma project results
Many companies fail before they even start Define
1. Generate Ideas -> 2. Prioritize -> 3. Launch
The high-level process... details to follow!
Project Selection Roadmap
Generate Ideas -> Prioritize -> Launch
Ideation methods that feed potential project ideas into projects:
- CTQ flow-down from the Strategic Plan
- Financial Analysis
- Performance Metrics
- Things that keep you up at night (Organic)
Project Ideation Methods
CTQ Flow-Down Process
What: A process in which strategic goals of the organization are used, and a statistical relationship is determined, to describe how the strategic goal is improved by completing the project.
How and Who: A trained MBB or BB partners with the key business leaders (Champions) and process owners to establish the linkage from strategy to project ideas.
How Often: Should be completed at least annually, and updated as business strategies change.
Issues: This process can take from weeks to months to adequately complete!
This is our most essential voice-of-the-customer linkage to the business!
Strategy 1 Strategy 2
Business Need 1 Business Need 2 Business Need 3
Project Idea 1 Project Idea 2 Project Idea 3 Project Idea 4 Project Idea 5
Financial Analysis
What: A process that reviews key financial indicators for the business to identify project opportunities.
How and Who: A financial leader partners with the key business leaders (Champions) and process owners to establish the linkage from strategy to project ideas. An MBB or BB can be used to help facilitate the process.
How Often: Completed at least annually, and as frequently as quarterly.
Issues: Potential introduction of variation.
This is a voice-of-the-business process to generate project ideas.
Performance to Plan
What: A process that reviews metrics of existing performance to the business plan, and develops project ideas based on performance gaps to the plan.
How and Who: Process owners and key business leaders review the gaps, primarily during operational reviews; actions (projects) are typically an output of the process.
How Often: Quarterly.
Issues: Potential introduction of variation.
Another voice-of-the-business process to generate project ideas.
Organic Project Path
What: A process that uses structured brainstorming to bubble up project ideas at all business levels.
How and Who: Process owners (with MBB or BB assistance as necessary) facilitate their work teams through the process.
How Often: Quarterly, until the process becomes a natural part of the culture.
Issues: Can be great for team building; don't let the process become a complete whining session.
A creativity process, based on business pain points.
Six Sigma Project Checklist
- Focused on a key driver of customer satisfaction
- Narrow scope
- Available metrics, or measurements that can be developed quickly
- Control of the process owned by the Champion
- Recurring events
- Linked to corporate or business unit objectives
- Financial benefits are traceable
- Solution unknown
How Do We Prioritize?
Ensure projects are linked to the company's businesses. With ideas generated through multiple methods, we are ready to score the ideas against one another... so what is the process?
Ideas Become Projects
Prioritization of Projects
Now that we have a list of projects (the project pipeline), how do we decide which ones to do first?
Prioritization: a system is needed to gauge the relative customer, business, team member, and time impact of each project idea.
Best-practice organizations develop filters or criteria to complete this assessment, with a numerical importance value attached to each criterion.
Panning for the gold nuggets in our business!
Proposed criteria will be used to prioritize identified projects. Each criterion is scored 1 / 3 / 9 (worse to better) and carries a weight; the criteria span customer, financial, employee, and process categories:
- Potential Impact on Employee Satisfaction (weight 20%): relative impact to employee satisfaction. 1 = no change; 3 = improve < 20%; 9 = improve > 20%.
- Potential Impact on Customer Metrics (weight 30%): relative benefits and impact to key business drivers, compared to customer requirements. 1 = no change; 3 = improve < 20%; 9 = improve > 20%.
- Potential Impact on Experience (weight 20%): relative impact to customer experience. 1 = no change; 3 = improve < 20%; 9 = improve > 20%.
- Savings or Revenue (weight 20%): approximate savings or revenue obtained. 1 = < $50k; 3 = > $50k and < $200k; 9 = > $200k.
- Time to Implement (weight 10%): expected time required to fully implement the project. 1 = 9+ months; 3 = 3-9 months; 9 = 0-3 months.
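The scoring mechanics above can be sketched in a few lines. This is an illustration, not part of the course material: the weights follow the slide's list, but the pairing of weights to criteria, the project names, and their 1/3/9 scores are assumptions made for the example.

```python
# Weighted project prioritization, as described above.
# Weights follow the slide; the weight-to-criterion pairing, the
# project names, and the 1/3/9 scores are illustrative assumptions.

WEIGHTS = {
    "customer_metrics": 0.30,
    "customer_experience": 0.20,
    "employee_satisfaction": 0.20,
    "savings_or_revenue": 0.20,
    "time_to_implement": 0.10,
}

def priority_score(scores):
    """Weighted sum of the 1/3/9 criterion scores."""
    return sum(WEIGHTS[criterion] * s for criterion, s in scores.items())

projects = {
    "Reduce loan rework": {
        "customer_metrics": 9, "customer_experience": 3,
        "employee_satisfaction": 3, "savings_or_revenue": 9,
        "time_to_implement": 3,
    },
    "Streamline onboarding": {
        "customer_metrics": 3, "customer_experience": 9,
        "employee_satisfaction": 9, "savings_or_revenue": 1,
        "time_to_implement": 9,
    },
}

# Highest score first: the top of the list leads the project pipeline.
ranked = sorted(projects, key=lambda name: priority_score(projects[name]),
                reverse=True)
```

The same arithmetic works for any number of ideas; only the weights need to be agreed on up front by the Champions.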
Project Launch
Identification of Project Champion
Champion drafts charter
Champion selects belt; belt attends pre-training
Team members are identified
Pre-Launch Checklist
Output is Project Kickoff!
Black Belt vs Green Belt Project
Black Belt Project
Full time BB resource
Project scope is broad
Typically more complex
Large business impact
3-6 months to complete
Green Belt Project
Part time GB resource
Project scope is narrow
Typically less complex
Localized business impact
3 months to complete
The Champion supports the determination of a BB vs GB project!
Review of the Process
Let's take a few minutes to review our process:
1. Business strategy and operation plans established
2. Ideas generated based on key processes identified
3. Prioritization completed
4. Business drafts charters and determines project methodology
5. Projects launched
Our projects are established, and we are on our way to Define!
What Happens In Define?
In the Define phase of DMAIC there are three
key elements that we seek to understand
Clarify the business problem (i.e. opportunity)
Identify and validate customer requirements
Begin documentation of the process
Define for Design Approach (DMADV)
Objectives
Define a vision and project strategy to achieve the vision
Review your Design Charter
Opportunity, goal, scope
Validated by leadership
Identify initial design requirements based on Voice of the Customer
Document the process where design will focus
Define: DMAIC Vs DMADV
Define in DMAIC
Define the business problem/goals and customer
deliverables
Documents the existing process and existing customer needs
Define in DMADV
Define the business problem/goals and customer
deliverables
Determines unknown customer needs and links to future design
Charter Elements Buttoned Down
We have (definitely!) completed a draft charter before the belt attends class, and we want to refine and clarify it as appropriate:
Problem statement Goal statement Scope Team & time commitments Project plan (timeline) Estimated benefits
Six Sigma Project Charter (Rev. 2.2)

PROJECT OVERVIEW
Problem Statement:
Project Goal:
Identify Business Strategic Linkages & Secondary Business Impacts:
Who is the Customer & What is the Customer Impact?:
Baseline / Goal / Defect Definition / Opportunity Definition
Project Title:  Project ID No.:  Project Type:  Process:  Location:
Project Leader:  Champion:  Process Owner:  MBB / Coach:
Project Start Date:  Project End Date:  Idea Document Completed?:

PROJECT MILESTONES
Planned Completion Dates, by project phase (Start / End / Revised):
Define:  Measure:  Analyze:  Improve:  Control:

BENEFITS SUMMARY ($000)
Benefits Category (One-time Impact / Annual):
Direct (Hard) Type A: Productivity; Cost Avoidance; Materials; Revenue Growth; Other
Indirect (Soft) Type B: One Wells Fargo Customer Impact; "You know me"; Other
TOTAL BENEFITS:  Overall Benefits:

PROJECT DETAILS
In / Out of Scope
Team Members & Resources (for example: $ per resource requirements)
Project Metrics (can include secondary metrics)
Other Key Project Measurements:
Sign-offs (Open / Close):
Coaching MBB: _____________________ __________________________
Champion: ____________________ __________________________
Business Leader: ____________________ __________________________
Finance Rep.: ____________________
Let's Look At A Real Green Belt Project
Problem statement: The current lack of procedures for completing emergency installs to production systems creates a risk of user and customer impacts, down-time, and inadequate communication.
Project objective / goal: Design and implement an emergency installs process that will allow for quick and accurate resolution of high-severity production issues and create objective tracking of results.
Takeaway
What is your assessment of this Problem and Goal? Why?
Project Plan
The roadmap on the journey... we want all belts to complete a project plan!
Six Sigma Project Plan
(Columns: Scheduled Start / Scheduled Finish / Actual Start / Actual Finish / Estimated Duration / Tool or Task)

Define: Day 1 - Day 35
- Task: Champion identifies business opportunity linked to business strategy (ongoing)
- Tool: Champion drafts rough-cut charter (Day 1 - Day 2)
- Task: Champion selects belt candidate (Day 2)
- Task: Belt is assigned to training wave (Day 2)
- Task: Belt assigned MBB coach (Day 3)
- Tool: Champion/belt review and modify charter (Day 4 - Day 9)
- Task: Problem statement definition complete (Day 4)
- Task: Project goal definition complete (Day 4)
- Task: Project defect definition complete (Day 4)
- Task: Project scope complete (Day 4)
- Task: Key output metric and customer benefits complete (Day 4)
- Task: Project benefits estimated (Day 4 - Day 9)
- Task: Review benefits estimate with finance (Day 5)
- Task: Finalize benefits and obtain finance signoff (Day 9)
- Task: Champion and belt determine project resources (Day 5 - Day 9)
- Task: Final project signoff with Champion and MBB coach (Day 10)
- Task: Meeting schedule determined with MBB coach (Day 10)
- Tool: Project plan complete (Day 10 - Day 15)
- Task: Kickoff meeting held with team and customer (Day 11)
- Task: Roles clarified (Day 11)
- Tool: Issue/Action/Risk log initiated (Day 11)
- Task: Customer requirements obtained (Day 12 - Day 15)
- Tool: SIPOC completed (Day 15)
- Tool: Survey completed (Day 15 - Day 35)
- Tool: High level "as is" process map complete (Day 16)

Measure: Day 11 - Day 44
Project Benefits
A critical element in Define... helps clarify business value! There are two types of project benefits:
- Customer satisfaction (includes internal customers: team member satisfaction)
- Financial
A Six Sigma project should have at least one, if not both, of these benefits!
Both benefit types are the result of improved PROCESSES.
Project Benefits
Customer Satisfaction: a measurable result of the belt project would be higher levels of satisfaction.
Financial:
- Productivity (cost or growth)
- Cost Avoidance
- Materials (physical or vendor costs)
Best-practice organizations measure annualized benefits.
Risks, Constraints & Compliance Issues
Risk Impact Probability Risk Score Mitigation Actions
Six Sigma Project: Risk, Constraints, Compliance Issues
Why do we discuss this as part of Define?
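The risk log above leaves "Risk Score" undefined; Impact × Probability is a common convention and is assumed here. A minimal sketch; the risk entries, scales, and ratings are invented for illustration.

```python
# Risk log sketch. "Risk Score" is not defined on the slide;
# Impact x Probability (here on 1-5 scales) is a common convention
# and is an assumption. The entries themselves are invented.

def risk_score(impact, probability):
    return impact * probability

risks = [
    # (risk, impact 1-5, probability 1-5, mitigation action)
    ("Key SME unavailable during Measure", 4, 3, "Cross-train a backup analyst"),
    ("Data system migration mid-project", 5, 2, "Snapshot baseline data early"),
]

# Sort the log so the highest-scoring risks get mitigation attention first.
log = sorted(
    ((r, i, p, risk_score(i, p), m) for r, i, p, m in risks),
    key=lambda row: row[3], reverse=True,
)
```

Scoring risks in Define makes the constraints and compliance exposure visible before the team commits to a timeline.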
Document the Process
Two critical tools that belts use to document the process (and which leaders should understand) are:
- SIPOC
- Process Map (high level)
Let's take a look at both!
A Process Is Defined As...
...A series of tasks or activities whereby one thing (the
input) is changed or used to create something else (the
output)
SIPOC Defined
Suppliers - Inputs - Process - Outputs - Customers
The SIPOC is a tool that documents a process from suppliers to customers. Once completed, it is used to:
- Identify and balance competing customer requirements
- Aid in identification of data collection needs
- See the process connected to the customer
- Avoid getting stuck in detail
- Provide a simplified view of the entire process, visible at a glance
- Help provide scoping direction on projects
Steps to Diagram a SIPOC
1. Identify the Process to be diagrammed and name it
Write that in the Process Name
Complete other information at top of form
2. Define the Outputs and Inputs (boundaries):
Start at the END: Customer(s) and key Output(s)
Supplier(s) and key Input(s)
3. Clarify the Requirements (optional, but recommended)
What are key features/characteristics of the Output for each Customer?
4. Establish ~2-5 high-level Process Steps
Brainstorm major process activities on sticky notes
Group or organize activities into similar categories or
major steps in the process (Suggestion: use Affinity method)
Place major steps in most appropriate order
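The four steps above can be captured in a plain data structure before the form is drawn. A minimal sketch, using the cookie-baking entries from the SIPOC example slide; the field names are illustrative.

```python
# A SIPOC captured as a plain dictionary, following the four steps above.
# Entries mirror the cookie-baking SIPOC example; field names are
# illustrative, not a prescribed format.

sipoc = {
    "process_name": "Cookie Baking",
    "suppliers": ["Food store", "Retail store", "Appliance store", "Family"],
    "inputs": {  # input -> requirement (step 3: clarify requirements)
        "Recipe/Book": "Available",
        "Ingredients": "Available/quality",
        "Utensils": "Available",
        "Oven": "Available/Working",
    },
    "process_steps": [  # step 4: high-level steps, in order
        "Obtain ingredients",
        "Bake cookie dough",
        "Timer dings",
    ],
    "outputs": {  # output -> requirement
        "Baked cookies": "Soft/chewy, warm",
        "Messy kitchen": "Clean",
    },
    "customers": ["Kids", "Spouse"],
}

# Step 4 suggests roughly 2-5 high-level steps:
step_count_ok = 2 <= len(sipoc["process_steps"]) <= 5
```

Starting from the outputs and customers, then working back to suppliers, mirrors the "start at the END" guidance in step 2.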
SIPOC Form
SIPOC Example: Cookie Baking
(Header fields: Project Title, Project Champion, Process Owner, Project Belt, Core Process, Project Number)

SUPPLIERS (providers of the required resources): Food store, Retail store, Appliance store, Family, Amazon, Betty Crocker, Martha Stewart, Rachael Ray

INPUTS (resources required by the process), with requirements:
- Cook: Knowledgeable
- Recipe/Book: Available
- Ingredients: Available/quality
- Utensils: Available
- Oven: Available/Working

PROCESS (top-level description of activity): Obtain Ingredients -> Bake Cookie Dough -> Timer Dings

OUTPUTS (deliverables from the process), with requirements:
- Baked Cookies: Soft/chewy, Warm
- Messy Kitchen: Clean

CUSTOMERS (anyone who receives a deliverable from the process): Kids, Spouse
What is a Process Map?
- A graphical representation of a process
- Identifies Key Process Input Variables (KPIVs, also called your little x's)
- Identifies Key Process Output Variables (KPOVs, also called your little y's)
- The first process map should be "as is"
- Ensure the process is walked. A business process can be walked by representing information transfer and modification points. The team should not assume they know the process well enough... walk it.
- The result should be a process map that identifies KPIVs and KPOVs; critical KPOVs should be linked to customer CTQs.
- The process map can include other information the team feels is appropriate (i.e., data collection points).
- Take advantage of the tribal knowledge held by those who work the process.
High Level Process Map
The high-level process map builds upon our SIPOC by seeking to show the primary sequence of events in the process.
[Figure: a high-level "go to ASU" process]
Detailed Process Mapping
A graphic representation of a process that details decision points, and lists and classifies KPIVs (little x's) and KPOVs (little y's).

Example: a customer order process. Each step lists its inputs, classified as C (controllable), N (noise), or SOP (governed by a standard operating procedure), and its output:
1. Rep answers phone (Phone - SOP; CSR - N) -> Answered phone
2. Rep greets customer (Greeting - SOP; CSR - N; Customer - N) -> Customer greeted
3. Rep determines product need (Product Information - C; CSR - N; Customer - N) -> Product need obtained
4. Customer identifies need date -> Date obtained
5. Rep obtains customer info and amount -> Customer info obtained
6. Rep obtains internal information -> Internal info obtained
7. Rep determines terms -> Terms completed
8. Rep verifies information -> Info validated
9. Rep completes request worksheet -> Completed worksheet
10. Rep inputs order entry info -> Order entered
11. Rep prints order confirmation -> Confirmation printed
12. Rep determines ship date -> Ship date
13. Rep reviews order -> Order reviewed
14. Rep faxes confirmation to customer -> Confirmation faxed
15. Rep verifies manufacturing receipt -> Receipt verified

Inputs to the later steps (attached to individual steps in the original figure) include System - SOP, Order - C, Worksheet - C, Term info - N, Printer - SOP, Fax machine - SOP, and Receipt - SOP, with CSR - N and Customer - N appearing throughout.
Define Completed
With these elements completed, the Define phase is essentially complete... why do we say essentially?
Project Charter
Problem Statement
Goal Statement
In/Out of Scope
Team & Time Commitments
Timeline / Milestone
Estimate Financial Benefits
Risks, Constraints & Compliance Issues Identified
SIPOC
High Level Process Map
Define
"What we think..."
Purpose: Properly define the project. The project purpose, scope, objective, and customer CTQs are stated, and the process or product to be improved is identified.
Key Outputs:
- Customer / Business CTQs
- Project Charter
- SIPOC
- Process Map
To Measure
[Figure: Corporate Vision & Objectives are set and flow down through Departments A-E, with cycle times of 2, 4, 3, 3, and 2 days across the chain.]

Project Prioritization (Cause and Effect Matrix): the CTQs Customer Impact, Financial Impact, and Employee Impact carry weights of 10, 6, and 8. Example: the Home Mortgage Defects project scores 9, 3, and 9 against these CTQs, for a CTQ total of 9×10 + 3×6 + 9×8 = 180.
- Business Groups establish objectives supporting Corporate Objectives
- Belt candidate selected
- Projects selected & prioritized
- Champion (or Champion & Belt) completes the Project Charter; Customer & Business CTQs become part of the project
- Customer requirements & CTQs determined
- High-level Process Flow Diagram to begin understanding the process
- Complete SIPOC, which helps to scope the project & identify measurement points as well as customers, process inputs & outputs
- Detailed Process Map with inputs and outputs identified
Summary
Reviewed and discussed the elements in the Define Phase
Demonstrated appropriate applications of the Six Sigma tools in the Define Phase
Appendix: Completed Charter
Six Sigma Define
2012
ASU Department of Industrial Engineering 2004
IEE 581 Six Sigma Methodology
DMAIC Measure Phase
Dr. Harry Shah, President and Master Black Belt, Business Excellence Consulting, [email protected]
DMAIC - Process Improvement Roadmap
1.0 Define Opportunities: What is important?
2.0 Measure Performance: How are we doing?
3.0 Analyze Opportunity: What is wrong?
4.0 Improve Performance: What needs to be done?
5.0 Control Performance: How do we guarantee performance?
Measure Performance

Inputs:
- Team Charter: business case, goal statement, project scope, project plan, team roles and responsibilities
- Prepared Team
- Critical Customer Requirements
- Process Maps
- Quick Win Opportunities

Key Deliverables:
- Input, Process, and Output Indicators
- Operational Definitions
- Data Collection Formats and Sampling Plans
- Measurement System Capability
- Baseline Performance Metrics / Process Capability: DPMO, PLT, PCE, Yield/Scrap, others
- Productive Team Atmosphere

2.0 Measure Performance:
- Determine What to Measure
- Manage Measurement
- Evaluate Measurement System
- Determine Process Performance
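Among the baseline metrics listed is DPMO (defects per million opportunities): defects divided by total opportunities, scaled to one million. A short sketch; the sample counts are invented for illustration.

```python
# DPMO, one of the baseline performance metrics listed above:
# defects per million opportunities. The counts below are invented.

def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities."""
    return defects * 1_000_000 / (units * opportunities_per_unit)

# e.g. 15 defects found across 500 loan applications, each application
# offering 10 opportunities for a defect:
baseline = dpmo(defects=15, units=500, opportunities_per_unit=10)  # 3000.0
```

The DPMO baseline computed in Measure becomes the reference point the Improve phase is judged against.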
Determine What to Measure
Determine What to Measure
SIPOC Diagram
Common Elements to All Processes
Supplier Input Process Output Customer
Case Study - Coffee Example
A fast-food restaurant conducted an annual customer survey. There was an overwhelming response from customers, and a good percentage of the responses were favorable: the customers liked the service and food. Other customers, however, complained that the coffee served by the restaurant was not consistent in taste; as a result, some customers stopped patronizing the restaurant. The owner's son is enrolled in the Six Sigma Methodology course at ASU and decided to tackle the problem. The process consisted of coffee brewing.
SIPOC Diagram
Case Study - Coffee Example
Supplier: Coffee Mfg, Filter Mfg, Water Supplier
Input: Coffee, Filter, Water
Process: Coffee Brewing
Output: Brewed Coffee
Customer: Patron
Determine What to Measure
SIPOC Diagram
Common Elements to All Processes
Supplier Input Process Output Customer
Input Indicators Process Indicators Output Indicators
Determine What to Measure
Input Indicators: measures that evaluate the degree to which the inputs to a process, provided by suppliers, are consistent with what the process needs to efficiently and effectively convert into customer-satisfying outputs.
Determine What to Measure
Process Indicators: measures that evaluate the effectiveness, efficiency, and quality of the steps and activities used to convert inputs into customer-satisfying outputs.
Determine What to Measure
Output Indicators: measures that evaluate the effectiveness of the output.
Case Study - Coffee Example
Input Indicators
Coffee Manufacturer
Filter Manufacturer
Type of Water (Tap vs Bottle)
Process Indicators
Amount of Coffee
Amount of Water
Age of Coffee
Output Indicators
Coffee Temp.
Coffee Color
Coffee Flavor
Customer Satisfaction Index (Taste)
Determine What to Measure
Input Indicators and Process Indicators are the X's; Output Indicators are the Y's:
Y = f(X)
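Y = f(X) says the output indicators (Y's) are driven by the input and process indicators (X's). A toy sketch for the coffee example; the functional form is invented purely to show the idea, not a real brewing model.

```python
# Toy illustration of Y = f(X): an output indicator (Y) driven by
# input/process indicators (X). The functional form is invented.

def coffee_strength(grams_coffee, ml_water):
    """A made-up Y: brew 'strength' as the coffee-to-water ratio."""
    return grams_coffee / ml_water

# Changing an X (amount of coffee) moves the Y (strength):
weak = coffee_strength(grams_coffee=50, ml_water=1000)
strong = coffee_strength(grams_coffee=70, ml_water=1000)
```

In a real project the form of f is unknown; Measure collects the data that Analyze later uses to estimate it.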
Selecting and Prioritizing Input, Process and Output Indicators: Tools
- Functional Process Map
- Brainstorming (Cause & Effect Diagram)
- Failure Modes and Effects Analysis (FMEA)
- Cause and Effect Matrix
Functional Process Map - Coffee Example
Coffee Maker lane: Empty coffee pot -> Put coffee filter -> Put coffee in filter -> Fill water jug -> Pour water in coffee maker -> Turn coffee maker on -> Coffee ready
Sales Associate lane: Receive customer order -> Fill coffee in cup -> Serve customer -> Get payment
Cause & Effect Diagram - Coffee Example
Effect: variation in coffee taste. Branches: People, Machine, Material, Method.
Causes identified on the diagram: Amount of Coffee, Age of Coffee, Caffeine Content, Amount of Water, Water Type, Coffee Mfg, Training, Age of Brewed Coffee, Heater.
Failure Modes and Effects Analysis
Identify potential failure modes, determine their effect on the operation of the product, and identify actions to mitigate the failures.
- Utilizes a cross-functional team
- Improves product/process reliability & quality
- Emphasizes problem prevention
- Increases customer satisfaction
www.npd-solutions.com/fmea.html
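An FMEA ranks failure modes by Risk Priority Number, conventionally RPN = Severity × Occurrence × Detection, each rated 1-10 (this matches the columns of the FMEA form). A sketch with invented coffee-process entries:

```python
# FMEA Risk Priority Number: RPN = Severity x Occurrence x Detection,
# each conventionally rated 1-10 (matching the FMEA form's columns).
# The failure modes below are invented coffee-process examples.

def rpn(severity, occurrence, detection):
    return severity * occurrence * detection

failure_modes = [
    # (failure mode, severity, occurrence, detection)
    ("Stale coffee used", 7, 4, 3),
    ("Wrong water amount", 5, 6, 2),
    ("Heater temperature drifts", 8, 3, 7),
]

# Address the highest-RPN failure modes first:
ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)
```

After mitigation actions are taken, the form's "Results" columns re-score severity, occurrence, and detection to compute the resulting RPN.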
Failure Modes & Effects Analysis form
Header: Process/Product; FMEA Date (original / revised); FMEA Team; Black Belt; Page __ of __
Process columns: Item | Process Steps | Potential Failure Mode | Potential Effects of Failure | Severity | Potential Cause(s) of Failure | Occurrence | Current Controls | Detection | Risk Priority Number
Actions columns: Recommended Action | Responsibility and Target Completion Date
Results columns: Action Taken | Severity | Occurrence | Detection | Risk Priority Number
Total Risk Priority / Resulting Risk Priority
Cause and Effect Matrix
Helps to prioritize key input and process indicators (X's) by evaluating the strength of their relationship to output indicators (Y's).
- Useful when no data exists
- Effective in a team-consensus environment
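The matrix arithmetic can be sketched directly. The numbers below reproduce the Home Mortgage Defects row shown on the Define summary slide earlier in the deck (Y weights 10, 6, 8; scores 9, 3, 9; total 180):

```python
# Cause-and-Effect Matrix scoring: each X is scored against each Y,
# the scores are multiplied by the Y importance weights, and summed.
# Numbers reproduce the Home Mortgage Defects row from the Define
# summary slide (weights 10, 6, 8; scores 9, 3, 9).

y_weights = [10, 6, 8]  # importance of each output indicator (Y)

def x_priority(scores, weights=y_weights):
    """Sum of (relationship score x Y weight) across the Ys."""
    return sum(s * w for s, w in zip(scores, weights))

home_mortgage_defects = [9, 3, 9]
total = x_priority(home_mortgage_defects)  # 9*10 + 3*6 + 9*8 = 180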
Cause and Effect Matrix
Manage Measurement
Manage Measurement
Develop an Operational Definition:
- Provides everybody with the same meaning
- Contains the What, How, and Who
- Adds consistency and reliability to data collection
Example: Operational Definition
One of the key output indicators for the coffee example is the temperature of the coffee.
John (the owner's son) decides to implement a step to measure coffee temperature.
How should John write an Operational Definition to measure temperature?
Example: Operational Definition
A first attempt: "When coffee is ready, measure the temperature of the coffee."
Is this a good Operational Definition?
A better version: "As soon as the coffee is ready, pour a cup of coffee into a plastic cup. Put a thermometer in the coffee for 30 seconds. Read the temperature in °F. Record the date, time, and temperature in a log book."
Example: Operational Definition
XYZ Financials provides car loans to customers. A recent customer survey indicates customers are unhappy about the time the company takes to process their loan applications. The CEO asks a Black Belt to determine the average cycle time to process a loan application.
All loan applications are received by fax. The approval/rejection letter is sent to the customer via fax.
The Black Belt decides to collect data over one month. Once an application has been processed, a bank employee will determine the cycle time.
How should Black Belt write an Operational Definition to measure cycle time of loan application?
Example: Operational Definition
A first attempt: "Measure cycle time for all loan applications processed over one month."
Is this a good Operational Definition?
A better version: "Collect data from all applications received by fax between Sep 1, 2005 and Sep 30, 2005. The response time will be determined from the date and time the fax was received (as shown on the faxed application) to the time the approval or rejection letter was faxed to the applicant (as shown on the fax log)."
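The improved operational definition is concrete enough to compute. A minimal sketch: cycle time runs from the fax-received timestamp to the letter-faxed timestamp; the timestamps themselves are invented for illustration.

```python
# Cycle time per the operational definition above: fax-received
# timestamp to letter-faxed timestamp. Timestamps are invented.

from datetime import datetime

FMT = "%Y-%m-%d %H:%M"

def cycle_time_hours(received, letter_faxed):
    start = datetime.strptime(received, FMT)
    end = datetime.strptime(letter_faxed, FMT)
    return (end - start).total_seconds() / 3600

# Application faxed in Sep 1 at 09:15; decision faxed Sep 3 at 14:45:
ct = cycle_time_hours("2005-09-01 09:15", "2005-09-03 14:45")  # 53.5 hours
```

Because both timestamps come from fax logs, two different employees computing the cycle time for the same application will get the same number, which is exactly what the operational definition is for.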
Manage Measurement
Develop a Measurement Plan:
- Sample size, frequency, etc.
- Type of data: Continuous/Variable or Discrete/Attribute (Ordinal, Nominal)
- Data collection log sheets
Treat data collection as a process!
Collect Data
Visually Examine Data
Sample Data Measurement Plan Form
Performance Measure
Operational Definition
Data Source and Location
Sample Size
Who Will Collect the
Data
When Will the Data Be
Collected
How Will the Data Be
Collected
Other Data that Should Be
Collected at the Same Time
How will the data be used? How will the data be displayed?
Examples: Identification of Largest Contributors; Identifying if Data is Normally Distributed; Identifying Sigma Level and Variation; Root Cause Analysis; Correlation Analysis
Examples: Pareto Chart; Histogram; Control Chart; Scatter Diagrams
Collect Data
First: Evaluate the measurement system
Then: Follow the plan and note any deviations from it. Be consistent and avoid bias. Observe data collection. Collect data on a pilot scale (optional).
The data collected will only be as good as the collection system itself. In order to assure timely and accurate data, the collection method should be simple to use and understand.
Data can be collected manually or automatically. Automatic data collection assures accurate and timely data and removes the burden of collection from the process operator, but it can be very expensive to set up, usually involving computer programming and/or hardware.
Obtaining the Measurements
Histogram; Box plot; Trend chart; Probability plot; Scatter plot; etc.
Visually Examine Data
Evaluate Measurement System
Measurement Systems Analysis (MSA)
A process to evaluate the factors that affect the quality of measurements:
- Measuring/metrology tool or gauge
- Operator
- Procedure or method
- Environment
Must be performed before collecting data
Why should Measurement Systems be evaluated?
MSA for Continuous Data
σ²_Total = σ²_Process + σ²_Measurement

[Figure: observed process measurements (3.17, 3.80, 2.93, 3.39, 3.53) include both process and measurement variation; the variances add, e.g. σ²_Process = 3 plus σ²_Measurement = 0.4 gives σ²_Total = 3.4]
Measurement System Properties for Continuous Data:
- Discrimination
- Accuracy (Bias)
- Stability
- Linearity
- Gauge Capability (GR&R)
MSA for Continuous Data
Property: Discrimination
Capability of the measurement system to detect and faithfully indicate even small changes of the measured characteristic
[Figure: two measurement scales, 1 to 5, contrasting good discrimination (fine gradations) with poor discrimination (coarse gradations)]
Discrimination contd.
A general Rule of Thumb: A measurement tool will have adequate discrimination if the measurement unit is at most one-tenth of the six-sigma spread of the total process variation:

Measurement Unit ≤ (6 × σ_Total) / 10
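This rule of thumb is simple to check in code. A minimal pure-Python sketch (the helper name and example values are made up for illustration):

```python
# Rule-of-thumb check for gauge discrimination: the measurement unit
# should be at most one-tenth of the 6-sigma total process spread.
def adequate_discrimination(measurement_unit, sigma_total):
    """Return True if the gauge resolution satisfies the rule of thumb."""
    return measurement_unit <= 6 * sigma_total / 10

# A gauge reading to 0.01 units on a process with sigma_total = 0.05:
# the limit is 6 * 0.05 / 10 = 0.03, so 0.01 is adequate.
print(adequate_discrimination(0.01, 0.05))  # True
print(adequate_discrimination(0.05, 0.05))  # False
```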
Property: Accuracy or Bias
Bias is the difference between the observed average and the reference value
Accurate Not Accurate
Obs Avg = 101.63; Ref Value = 100; Bias = 1.63
Accuracy or Bias contd.
The distribution of the measurements should be constant over time (both the average and the standard deviation)
No drifts, sudden shifts, cycles, etc.
Evaluated with control charts of standard/golden unit(s) measurements: Xbar/R, Xbar/S, X/MR, etc.
Property: Stability
[Figure: a stable gauge shows the same measurement distribution at Time 1 and Time 2; a not-stable gauge shows the distribution shifting between them]
Stability contd
Stability contd 3 Reference Units on 1 Metrology Tool
Stability -- Example

[Trend chart: Poly Thickness (Angstroms), roughly 4230 to 4250, vs. Reading No. (0 to 50), with readings taken on 6/9/xx, 6/22/xx, and 7/11/xx]

Trend chart for polysilicon thickness measurements in a Chemical Vapor Deposition system.
On 6/22, something apparently happened to the process.
The change on 6/22 was traced to a faulty measurement tool.
Property: Linearity

Linearity is the difference in the bias values through the expected operating range of the gauge

[Figure: bias plotted across the range of operation (Low to High), contrasting good linearity with not-good linearity]
Bias and Linearity Example
(File: gauge study.mtw)
Property: Gauge Capability (GR&R)
Gauge Capability is made up of two sources of variation or components Repeatability & Reproducibility
σ²_Measurement = σ²_Repeatability + σ²_Reproducibility

σ²_Total = σ²_Process + σ²_Repeatability + σ²_Reproducibility
Repeatability
The inherent variability of the measurement system.
The variation that results when repeated measurements are made of the same parameter under conditions as nearly identical as possible:
- same operator
- same set-up procedure
- same test unit
- same environmental conditions
- during a short interval of time
Repeatability
[Figure: two measurement distributions centered on their mean relative to the true value, contrasting good repeatability (narrow 6σ spread) with poor repeatability (wide 6σ spread)]
σ²_Measurement = σ²_Repeatability + σ²_Reproducibility
Reproducibility: The variation that results when different conditions are used to make the measurement:
- different operators
- different set-up procedures, maintenance procedures, etc.
- different parts
- different environmental conditions
- during a longer period of time
Reproducibility
[Figure: measurement distributions for Operators 1, 2, and 3 relative to the true value, contrasting good reproducibility (operators aligned) with poor reproducibility (operators shifted apart)]
Gauge Capability Metrics
%R&R = 100 × (σ_Measurement / σ_Total)

%P/T = 100 × (6 × σ_Measurement) / (USL − LSL)
Requirements for Gauge Capability Metrics
Guidelines for %R&R and %P/T:
- Under 10%: Acceptable
- 10% - 30%: May be Acceptable
- Over 30%: Not Acceptable

To find %R&R and %P/T we must estimate σ_Measurement and σ_Total
Example: ANOVA Method (File: gauge study.mtw)
3 Operators, the same 10 Parts, 2 Readings/Part; Operators and Parts are crossed; USL = 2 and LSL = 1

Gage R&R
                                 %Contribution
Source             VarComp       (of VarComp)
Total Gage R&R     0.0012892        11.44     <- Error
  Repeatability    0.0004033         3.58
  Reproducibility  0.0008858         7.86
    Operator       0.0002584         2.29
    Operator*Part  0.0006274         5.57
Part-To-Part       0.0099772        88.56     <- Process
Total Variation    0.0112664       100.00
ANOVA Method contd - Minitab Output
                                Study Var   %Study Var   %Tolerance
Source             StdDev (SD)   (6 * SD)        (%SV)   (SV/Toler)
Total Gage R&R        0.035905   0.215430        33.83        21.54
  Repeatability       0.020083   0.120499        18.92        12.05
  Reproducibility     0.029763   0.178578        28.04        17.86
    Operator          0.016076   0.096454        15.15         9.65
    Operator*Part     0.025048   0.150289        23.60        15.03
Part-To-Part          0.099886   0.599316        94.10        59.93
Total Variation       0.106143   0.636859       100.00        63.69
Number of Distinct Categories = 3
σ²_Measurement = 0.00129;  σ_Measurement = 0.036;  σ_Total = 0.106
%R&R = 33.83% and %P/T = 21.54%
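As a check on the Minitab output above, the two gauge capability metrics can be recomputed from the reported variance components. A minimal pure-Python sketch:

```python
import math

# Variance components from the ANOVA Gage R&R output above,
# with tolerance USL - LSL = 2 - 1 = 1.
var_measurement = 0.0012892   # Total Gage R&R
var_total = 0.0112664         # Total Variation
usl, lsl = 2.0, 1.0

sd_measurement = math.sqrt(var_measurement)
sd_total = math.sqrt(var_total)

pct_rr = 100 * sd_measurement / sd_total              # %R&R
pct_pt = 100 * 6 * sd_measurement / (usl - lsl)       # %P/T

print(round(pct_rr, 2))  # 33.83
print(round(pct_pt, 2))  # 21.54
```

Both values land in the "Not Acceptable" / "May be Acceptable" ranges, matching the Minitab %Study Var and %Tolerance columns.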
Gauge Capability Nested, Mixed and Other Models
Crossed: Factor B is crossed with Factor A if the levels of B are the same for each level of A.
Example: In an MSA study, 3 operators measure the same 10 parts 3 times. Operator is Factor A, Part is Factor B; B is crossed with A.
Operator 1
Part 1 Part 2 Part 10
Rpt 1 Rpt 2 Rpt 3 Rpt 1 Rpt 2 Rpt 3
Operator 3
Part 1 Part 2 Part 10
Rpt 1 Rpt 2 Rpt 3 Rpt 1 Rpt 2 Rpt 3
Operator 2
Part 1 Part 2 Part 10
Rpt 1 Rpt 2 Rpt 3 Rpt 1 Rpt 2 Rpt 3
Gauge Capability Nested, Mixed and Other Models
Nested: Factor B is nested within Factor A if the levels of B are different for each level of A.
Example: In an MSA study, 3 operators measure 10 different parts 3 times. Operator is Factor A, Part is Factor B; B is nested within (under) A.
Operator 1
Part 1 Part 2 Part 10
Rpt 1 Rpt 2 Rpt 3 Rpt 1 Rpt 2 Rpt 3 Rpt 1 Rpt 2 Rpt 3
Operator 2
Part 11 Part 12 Part 20
Rpt 1 Rpt 2 Rpt 3
Operator 3
Part 21 Part 22 Part 30
Rpt 1 Rpt 2 Rpt 3 Rpt 1 Rpt 2 Rpt 3
Additional Model Examples
Six operators were randomly chosen for an MSA study. Each operator had four different instruments to measure with, and the instruments used by one operator were different than the instruments used by another operator. There were nine different parts measured by each instrument. Each part was measured three times.
What if the operators had used the same four instruments?
References
Montgomery, D. C. and Runger, G. C., "Gauge Capability and Designed Experiments, Part I: Basic Methods," Quality Engineering (1993-94), 6(1), pp. 115-135.
Montgomery, D. C. and Runger, G. C., "Gauge Capability and Designed Experiments, Part II: Experimental Design Models and Variance Component Estimation," Quality Engineering (1993-94), 6(2), pp. 289-305.
MSA for Attribute Data
Binomial results: Good/Bad, Conforming/Nonconforming, Red/Not Red, etc.
Use a minimum of 10 known good items and 10 defective items
Use 2-3 Operators or Appraisers
Have each Appraiser inspect or evaluate each unit 2-3 times
Analyze as an Attribute Agreement Analysis
Example
(File: ATTR-GAGE STUDY.mtw)
MSA for Attribute Data - Example: Within Appraisers

Assessment Agreement
Appraiser  # Inspected  # Matched  Percent  95% CI
Fred       20           20         100.00   (86.09, 100.00)
Lee        20           18          90.00   (68.30, 98.77)
# Matched: Appraiser agrees with him/herself across trials.

Fleiss' Kappa Statistics
Appraiser  Response  Kappa   SE Kappa  Z        P(vs > 0)
Fred       G         1.0000  0.223607  4.47214  0.0000
Fred       NG        1.0000  0.223607  4.47214  0.0000
Lee        G         0.6875  0.223607  3.07459  0.0011
Lee        NG        0.6875  0.223607  3.07459  0.0011

Each Appraiser vs Standard

Assessment Agreement
Appraiser  # Inspected  # Matched  Percent  95% CI
Fred       20           20         100.00   (86.09, 100.00)
Lee        20           17          85.00   (62.11, 96.79)
# Matched: Appraiser's assessment across trials agrees with the known standard.

Assessment Disagreement
Appraiser  # NG / G  Percent  # G / NG  Percent  # Mixed  Percent
Fred       0         0.00     0         0.00     0         0.00
Lee        1         5.56     0         0.00     2        10.00
# NG / G: Assessments across trials = NG / standard = G.
# G / NG: Assessments across trials = G / standard = NG.
# Mixed: Assessments across trials are not identical.

Fleiss' Kappa Statistics
Appraiser  Response  Kappa    SE Kappa  Z        P(vs > 0)
Fred       G         1.00000  0.158114  6.32456  0.0000
Fred       NG        1.00000  0.158114  6.32456  0.0000
Lee        G         0.60784  0.158114  3.84434  0.0001
Lee        NG        0.60784  0.158114  3.84434  0.0001

H0: κ = 0;  Ha: κ > 0
MSA for Attribute Data - Example: Between Appraisers

Assessment Agreement
# Inspected  # Matched  Percent  95% CI
20           17         85.00    (62.11, 96.79)
# Matched: All appraisers' assessments agree with each other.

Fleiss' Kappa Statistics
Response  Kappa     SE Kappa   Z        P(vs > 0)
G         0.673203  0.0912871  7.37457  0.0000
NG        0.673203  0.0912871  7.37457  0.0000

All Appraisers vs Standard

Assessment Agreement
# Inspected  # Matched  Percent  95% CI
20           17         85.00    (62.11, 96.79)
# Matched: All appraisers' assessments agree with the known standard.

Fleiss' Kappa Statistics
Response  Kappa     SE Kappa  Z        P(vs > 0)
G         0.803922  0.111803  7.19049  0.0000
NG        0.803922  0.111803  7.19049  0.0000

H0: κ = 0;  Ha: κ > 0
MSA for Attribute Data - Example

[Figure: Assessment Agreement plots (Percent matched with 95% CI, roughly 65-100%) for appraisers Fred and Lee: Within Appraisers and Appraiser vs Standard]
Determine Process Performance
Determine Process Performance
- Document baseline performance
- Provide direction to the project
- Compare before performance to after
Determine Process Performance
Process Capability Indices:
- For continuous data: Cp, Cpk, Cpm
- For discrete data: Defects Per Million Opportunities (DPMO)
Process Lead Time (PLT)
Process Cycle Efficiency (PCE)
Yield/Scrap
Others
Steps for Conducting a Process Capability Study
1. Verify that the process is stable
2. Determine whether the data distribution is normal
3. Calculate appropriate indices
4. Make recommendations for improvement
Cpk = min{Cpu, Cpl}, where

Cpu = (USL − Xbar) / (3S)   and   Cpl = (Xbar − LSL) / (3S)
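These point estimates are easy to compute directly. A minimal pure-Python sketch, where the data values and spec limits are invented for illustration:

```python
import statistics

def cp_cpk(data, lsl, usl):
    """Point estimates of Cp and Cpk from a sample."""
    xbar = statistics.mean(data)
    s = statistics.stdev(data)        # sample standard deviation
    cp = (usl - lsl) / (6 * s)
    cpu = (usl - xbar) / (3 * s)
    cpl = (xbar - lsl) / (3 * s)
    return cp, min(cpu, cpl)

# Hypothetical sample centered at 10.0 with s = 0.2, specs 9.0 to 11.0
data = [10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 10.0, 9.7]
cp, cpk = cp_cpk(data, lsl=9.0, usl=11.0)
print(round(cp, 2), round(cpk, 2))  # 1.67 1.67 (process is centered)
```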
[Figure: Four processes, all with Cpk = 1.5 but different Cp (6.0, 3.0, 2.0, and 1.5 for processes A-D), shown against LSL, T, and USL]
Cpk alone is not sufficient to indicate the capability of a process
Cpm Alternative to Cpk
Cpm is considerably more sensitive to deviations from target than Cpk
A hotel provides room service meals to its guests. It is hotel policy that the meal is delivered at the time scheduled by the guest.
The hotel Six Sigma team has found from the Voice of the Customer that a breakfast delivered too early will inconvenience the guest as much as a late delivery.
Research indicates that guests require that breakfast be delivered within 10 minutes of the scheduled delivery time.
Example: Hotel Breakfast Delivery
(File: HotelMeals.mtw)
Example: Hotel Breakfast Delivery
Process Capability of Delivery Time Deviation

Process Data
  LSL              -10.00000
  Target             0.00000
  USL               10.00000
  Sample Mean        6.00357
  Sample N         725
  StDev (Within)     7.20201
  StDev (Overall)    7.16405

Potential (Within) Capability     Overall Capability
  Cp    0.46                        Pp    0.47
  CPL   0.74                        PPL   0.74
  CPU   0.18                        PPU   0.19
  Cpk   0.18                        Ppk   0.19
  CCpk  0.46                        Cpm   0.36

Observed Performance       Exp. Within Performance    Exp. Overall Performance
  PPM < LSL  13793.10        PPM < LSL  13138.34        PPM < LSL  12745.81
  PPM > USL 268965.52        PPM > USL 289479.68        PPM > USL 288475.05
  PPM Total 282758.62        PPM Total 302618.02        PPM Total 301220.86

[Histogram of delivery time deviation (about -12 to 24 minutes) with LSL, Target, USL and Within/Overall fitted curves]
Defects per Million Opportunities
D = total number of defects counted in the sample (there must be at least 5 defects and 5 non-defects to calculate DPMO)
N = number of units of product/service
O = number of opportunities for a defect to occur per unit of product/service
M = million

DPMO = (D × 1,000,000) / (N × O)
Defects per Million Opportunities vs Process Sigma

Sigma   DPMO        Sigma   DPMO
2.00    308,770     4.00    6,210
2.25    226,716     4.25    2,980
2.50    158,687     4.50    1,350
2.75    105,660     4.75      577
3.00     66,811     5.00      233
3.25     40,060     5.25       88
3.50     22,750     5.50       32
3.75     12,225     5.75       11
                    6.00        3.4
Example: Hotel Breakfast Delivery

D = 205; N = 725; O = 1

DPMO = (205 × 1,000,000) / (725 × 1) ≈ 282,759
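The DPMO calculation above can be sketched in a few lines of Python (the function name is made up for illustration):

```python
# DPMO = (defects * 1,000,000) / (units * opportunities)
def dpmo(defects, units, opportunities):
    return defects * 1_000_000 / (units * opportunities)

# Hotel breakfast example: 205 late/early deliveries out of 725,
# with one defect opportunity per delivery.
print(round(dpmo(205, 725, 1)))  # 282759
```

Comparing 282,759 against the DPMO table above places this process between 2 and 2.25 sigma.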
Process Lead Time (PLT)
Little's Law
Customer Orders
Order Take → Order Entry → Credit Check → Schedule Orders
Exit Rate = 20 units/day
WIP = 100
PLT= 100/20 = 5 days
Process Lead Time (PLT) = Work In Process (WIP) / Exit Rate (ER)
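Little's Law as stated above is a one-line calculation; a minimal sketch using the order-processing numbers from the example:

```python
# Little's Law: PLT = WIP / Exit Rate
def process_lead_time(wip, exit_rate):
    """Lead time in the same time units as the exit rate (here, days)."""
    return wip / exit_rate

# 100 orders in process, exit rate of 20 orders/day -> 5 days
print(process_lead_time(100, 20))  # 5.0
```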
Definitions
Process Lead Time (PLT) The time taken from the entry of work into a process until the work exits the process (which may consist of many activities).
Work-In-Process (WIP) The amount of work that has entered the process but has not been completed. It can be paper, parts, product, information, emails, etc.
Exit Rate (Average Completion Rate or Throughput) The average output of a process over a given period of time (usually a day) (units/time).
Value is Defined by the Customer
Customer Value-Added (CVA): An activity adds value for the customer only if:
- The customer recognizes the value
- It changes the service/product toward something the customer expects
- It is done right the first time
Process Cycle Efficiency (PCE)
Process Cycle Efficiency (PCE) = Customer Value-Added Time / Process Lead Time
Customer Orders
Order Take → Order Entry → Credit Check → Schedule Orders
Exit Rate = 20 units/day
WIP = 100
CVA=0.4 hrs CVA=0.4 hrs CVA=0.3 hrs CVA=0.4 hrs
PCE = 1.5 hrs/5 days = 1.5 hrs/40 hrs = 3.75%
Assuming 1 day = 8 hrs
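The PCE calculation for this example can be sketched directly (the function name is made up for illustration):

```python
# PCE = customer value-added time / process lead time (same units)
def pce(cva_time_hrs, lead_time_hrs):
    return cva_time_hrs / lead_time_hrs

cva = 0.4 + 0.4 + 0.3 + 0.4    # CVA hours across the four process steps
lead_time = 5 * 8              # 5-day PLT at 8 hrs/day = 40 hrs
print(f"{pce(cva, lead_time):.2%}")  # 3.75%
```

A PCE this low says almost all of the lead time is non-value-added waiting, which is typical before Lean improvement.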
Measure Performance
Inputs
- Team Charter: business case, goal statement, project scope, project plan, team roles and responsibilities
- Prepared Team
- Critical Customer Requirements
- Process Maps
- Quick Win Opportunities

2.0 Measure Performance
- Determine What to Measure
- Manage Measurement
- Evaluate Measurement System
- Determine Process Performance

Key Deliverables
- Input, Process, and Output Indicators
- Operational Definitions
- Data Collection Formats and Sampling Plans
- Measurement System Capability
- Baseline Performance Metrics: Process Capability, DPMO, PLT, PCE, Yield/Scrap, Others
- Productive Team Atmosphere
DMAIC - Process Improvement Roadmap
What is important?              → 1.0 Define Opportunities
How are we doing?               → 2.0 Measure Performance
What is wrong?                  → 3.0 Analyze Opportunity
What needs to be done?          → 4.0 Improve Performance
How do we guarantee performance? → 5.0 Control Performance
IEE 581 Six-Sigma Methodology DMAIC The Analyze Phase
Fall 2012 Class 6
Cheryl L. Jennings, PhD, MBB
1
More on Process Capability Analysis
Previous Measure lecture, measures of process performance included:
Cp, Cpk, Cpm for continuous data
DPMO vs PPM for discrete data
Typically used as a goodness measure of a process performance
In the Measure phase to baseline performance
During the Analyze phase to identify suspect equipment, suppliers, etc., and provide direction to the project
In the Improve phase to compare before performance to after
In the Control phase to monitor ongoing performance
Underlying assumptions are normality, and that the process is in statistical control
2
Relationship Between Cp and Cpk
* From Montgomery, D. C. (2009), Introduction to Statistical Quality Control, 6th edition, Wiley, New York
Motorola Definition of Six Sigma Quality
Cp and Cpk The Usual Equations
4
Aren't these just Point Estimates?

Cp = (USL − LSL) / (6s)

Cpk = min{Cpu, Cpl}, where Cpu = (USL − Xbar) / (3s) and Cpl = (Xbar − LSL) / (3s)
s s
Confidence Interval for Cpk
Key Points
The Cpk metric is routinely used.
Recall that the Cpk we calculate is based on sample statistics; the calculated Cpk is therefore an estimate of the TRUE Cpk.
Rarely (if ever) is the confidence interval on a Cpk considered.
Black Belts should consider CIs.
5
For a 100(1 − α)% confidence interval:

Ĉpk × [1 − z(α/2) × √( 1/(9n·Ĉpk²) + 1/(2n − 2) )]  ≤  Cpk  ≤  Ĉpk × [1 + z(α/2) × √( 1/(9n·Ĉpk²) + 1/(2n − 2) )]

For a 95% confidence interval, z(α/2) = 1.96.
Example
Based on a sample size of n = 13 and an estimated Cpk = 1.11, a 95% confidence interval for Cpk is:
1.11 × [1 − 1.96 × √( 1/(9(13)(1.11²)) + 1/(2(13) − 2) )]  ≤  Cpk  ≤  1.11 × [1 + 1.96 × √( 1/(9(13)(1.11²)) + 1/(2(13) − 2) )]

0.63 ≤ Cpk ≤ 1.59
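The interval is straightforward to compute; a minimal pure-Python sketch of the approximate confidence interval formula used above:

```python
import math

def cpk_ci(cpk_hat, n, z=1.96):
    """Approximate two-sided CI for Cpk (z = 1.96 gives 95% confidence)."""
    rel = z * math.sqrt(1 / (9 * n * cpk_hat**2) + 1 / (2 * n - 2))
    return cpk_hat * (1 - rel), cpk_hat * (1 + rel)

# n = 13 samples, estimated Cpk = 1.11
lo, hi = cpk_ci(1.11, 13)
print(round(lo, 2), round(hi, 2))  # 0.63 1.59
```

Note how wide the interval is for such a small sample: the point estimate 1.11 looks marginally capable, but the data are consistent with anything from a clearly incapable 0.63 to a comfortable 1.59.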
6
What if Data is not Normally Distributed?
7
Example
n = 200
Values range from 1001.68 to 2891.49
Histogram shows clearly that data are skewed right and not normal
With LSL = 900 and USL = 2700
Assuming normal data, the usual Cpk estimate would be 0.46
However, a capability analysis based on the appropriate non-normal distribution gives Cpk = 1.15
8
Yet Another Process Performance Measure
9
Are these indices really useful?
* From Montgomery, D. C. (2009), Introduction to Statistical Quality Control 6th edition, Wiley, New York
Key Points
The Cpk metric is routinely used
Rarely (if ever) is the confidence interval on a Cpk considered
Black Belts should consider using Confidence Intervals
10
The DMAIC Process
* From Montgomery, D. C. (2009), Introduction to Statistical Quality Control 6th edition, Wiley, New York
Analyze Opportunity
12
3.0 Analyze Opportunity
- Identify and Validate Root Causes: Basic Tools, Advanced Tools

Inputs
- Input, Process, and Output Indicators
- Operational Definitions
- Data Collection Formats and Sampling Plans
- Measurement System Capability
- Baseline Performance Metrics: Process Capability, Cost of Poor Quality (COPQ), Time, Yield, Other
- Productive Team Atmosphere

Outputs
- Data Analyses
- Validated Root Causes
- Potential Solutions
* From Montgomery, D. C. (2009), Introduction to Statistical Quality Control 6th edition, Wiley, New York
The Primary DMAIC Six Sigma Tools
Three Ways to Obtain Data for Analysis
1. A retrospective study using historical data
A lot of data
But generated under what conditions?
Data quality issues
2. An observational study
Planned data collection, under known conditions in a production mode
Typically a short period of time, may not see all variation or be able to see changes in key variables
3. A designed experiment
Also planned data collection, with deliberate manipulation of controllable process inputs
The only way to prove a cause-and-effect relationship
Requires commitment of resources
14
Identify Potential Root Causes Basic Analyze Tools
Cause & Effect Diagram*
FMEA*
Cause & Effect Matrix*
Histogram
Scatter Plot
Box Plots
Pareto Diagram
*Discussed in Measure lectures
15
Pareto Diagram
16
The top three complaint categories comprise 80% of the problem. Other teams are working on categories 1 and 2; your team is tasked with cabin-related complaints. Cabin accommodations generated the most complaints related to aircraft cabins, and most of those complaints were about room for carry-on baggage.

In the last year, 65% of airline passenger complaints about aircraft cabin interior baggage accommodations concerned insufficient stowage in overhead bins for carry-on luggage.
[Pareto charts of Number of Defects, drilling down through three levels: Complaints by category (Cost, Sched, Cabin, Bags, Rgs, Tix, Etc.); Cabin-related Complaints (Accom., Food, Bevs, Ent, Sound, Other); Cabin Physical Accommodations (Bag Room, Leg Room, Seat Width, Head Room, Rest Room, Other); and Bag Accommodations/Storage (Ovhd Bin, Under Seat, Garment Rack, Other; Ovhd Bin = 65%)]
Identify Potential Root Causes Advanced Analyze Tools
Statistical Process Control (SPC)
Comparative Methods: Hypothesis tests, Confidence intervals
ANOVA
Source of Variation (SOV) Studies
Regression Analysis
Screening Experiments (Designed Experiment, DOE)
Nonparametric Methods
17
Phase I and Phase II Control Chart Application
Phase I Process Taming
Process is likely out of control; as in Measure, Analyze and Improve phases
Use of control charts is to bring process into state of control, with the identification of out-of-control signals and investigation for root cause
Shewhart control charts are suited to Phase I because
Easy to construct & interpret
Effective at detecting both large, sustained process shifts as well as outliers, measurement errors, data entry errors, etc.
Patterns are often easy to interpret and have physical meaning
Also suited to use of sensitizing or Western Electric rules
Phase II Process Monitoring
Process is relatively stable, causes of larger shifts have been identified and permanently fixed; as in Control phase
18
SPC to Identify Potential Causes
In Phase I, control limits are typically calculated retrospectively
Data is collected, say 20 or 25 subgroups
Trial control limits are calculated
Out-of-control points are investigated for assignable causes and solutions
Control limits are recalculated from points within the trial control limits
New data is collected, compared with the revised trial control limits, and the analysis is repeated until the process is stabilized
In Phase II, control limits are calculated from the stabilized process
19
Shewhart 3-sigma limits
Why do we often use 3 sigma limits?
... Experience indicates that t = 3 seems to be an acceptable economic value. ...
Economic Control of Quality of Manufactured Product, W.A. Shewhart, Commemorative Issue published by ASQ in 1980, p. 277.
Wider control limits decrease the risk of a type I error, the risk of a point falling beyond the control limits indicating an out-of-control condition when no assignable cause exists
For 3-sigma limits, the probability is 0.0027 (27 out of 10,000 plot points), or 0.00135 in each direction
Wider control limits also increase the risk of a type II error, the risk of a point falling between the control limits when the process is really out of control
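The 0.0027 type I error rate quoted above follows directly from the normal distribution; a minimal pure-Python check using the standard normal CDF:

```python
import math

# Standard normal CDF via the error function
def normal_cdf(x):
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

# Probability an in-control normal process plots outside +/- 3 sigma
p_out = 2 * (1 - normal_cdf(3))
print(round(p_out, 4))       # 0.0027  (about 27 in 10,000 points)
print(round(p_out / 2, 5))   # 0.00135 in each direction
```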
20
Comparative Methods
Comparison Type                             Analysis Tests
Single sample (one-to-standard,             Z-test, t-test, χ²-test, Sign/Wilcoxon
  fixed value)
Two samples (one-to-one;                    Z-test, t-test, F-test, Paired t-test, Sign test,
  paired two-sample)                        Wilcoxon Rank Sum (also called the Mann-Whitney test)
Multiple samples                            ANOVA, Kruskal-Wallis (uses ranks), χ²-tests
21
Parametric Inference Methods
We will look at three tests, but fundamentals apply to all tests
The one-sample Z-test
The one-sample t-test
The two-sample t-test (also the pooled t-test)
Assumptions for these three tests are
Random samples
From normal populations
And for two-sample tests, the two populations are independent
Checking for random, independent samples
Best approach is to use a sound sampling plan
Statistical approaches for time-oriented data include runs tests and time series methods
22
Checking Normality
Probability Plot
Boxplot
Goodness-of-fit tests: chi-square, Anderson-Darling
H0: The form of the population distribution for characteristic is Normal.
23
7-Step Hypothesis Testing Procedure
24
1. Parameter of Interest
2. Null Hypothesis
3. Alternative Hypothesis
4. Test Statistic
5. Reject H0 if:
Test statistic approach (fixed significance)
P-value approach
Confidence Interval approach
6. Computations
Includes checking assumptions
7. Conclusions
The One-Sample Z-Test
25
We Could Also Use a P-Value Approach
26
27
An Example of the Z-Test
28
3rd Approach: Confidence
Intervals
The One-Sample t-Test
29
An Example of the t-Test
30
MINITAB 1-Sample t-Test
31
When doing the t-test manually, it is usually necessary
to approximate the P-value
Approximating the P-value with a t-Table
32
Approximating the P-value with MINITAB
33
The Two-Sample t-Test
34
Testing Hypotheses on the Difference in Means of Two Normal
Distributions, Variances Unknown
An Example
35
MINITAB 2-Sample t-Test
36
Other Comparative Tests for Normal Distributions
The Paired t-Test
2 samples, paired data; if analyzed incorrectly as a 2-sample test, the variance estimate may be inflated and give misleading results

χ²-test: the variance of a normal distribution

F-test: the variances of two normal distributions
37
What if the Distribution is Not Normal?
Comparative methods discussed are based on assumption of random sample from a normal distribution
Most of the comparative methods based on the normal distribution are relatively insensitive to moderate departures from normality
Two exceptions are the χ² and F tests for variances
Options for more severe departures from normality are
1. Transform the data to normal, for example using logarithm, square root or a reciprocal, and use a method based on the normal distribution
See Montgomery, DOE, Selecting a Transformation: The Box-Cox Method
2. Utilize a nonparametric or distribution-free approach
38
More Than Two Populations?
For more than two populations or factor levels (i.e., a single-factor experiment), ANOVA can be used to compare means
39
40
Assumptions can be checked by analyzing residuals
Normality
Independence
Equal variance
Sources of Variation Studies
Sources of Variation (or SOV) studies are used to understand and characterize process variability
Often described as a process snapshot: the process is observed in a production mode without adjustment or manipulation
A sampling plan is designed to encompass what are thought to be the major contributors to process variability
Data is collected over a sufficient period of time to capture a high percentage of the historical process variation
Often suited to analysis as a nested design
May be a precursor to a designed experiment (DOE)
41
Solder Paste Example
A process engineer is interested in determining where the majority of the variability is coming from in the raw material being supplied to a screen-printing process. Three lots of solder paste are randomly selected. From each lot, four tubes of solder paste are selected at random. Three boards are printed for each tube of solder paste.
For more on Nested Designs, see Chapter 14 in Montgomery, D. C. (2009), Design and Analysis of Experiments, 7th edition, Wiley, New York.
[Tree diagram: Lots 1-3; Tubes 1-4 nested within each Lot; Boards 1-3 nested within each Tube; a solder paste Volume measurement on each Board (e.g., 28, 23, 23, ..., 27, 25, 24)]
MINITAB Analysis
Examining p-values, conclude there is no significant effect on Volume due to Lot, but the Tubes of solder paste from the same Lot differ significantly.
Knowing that a major source of variability is the Tube-to-Tube variation within a Lot gives direction for solving the problem.
Unfortunately, also note that the Within-Tube (Error, or Board-to-Board) variability is the largest source of variation, suggesting improvement is also needed in the screen-printing process itself.
43
Regression Analysis
Recall that two ways to obtain data for analysis included
A Retrospective study using historical data
An Observational study resulting from planned data collection
Regression can be used for both, with care on Retrospective data
Abuses of Regression include
Selection of variables that are completely unrelated in a causal sense: a strong observed relationship does not imply that a causal relationship exists. Designed experiments are the only way to determine cause-and-effect relationships.
Extrapolation beyond the range of the original data
We will study logistic regression in a later class lecture on Categorical data analysis
44
Design of Experiments
Types of Experiments
Experiment Type                      Screening   Optimization   Comparison   Robust Design
Full Factorial                       Medium      Medium         High         Medium
Fractional Factorial                 High        Low            Medium       Low
Response Surface Methodology (RSM)   Low         High           Medium       High
Plackett-Burman                      High        Low            Low          Low
45
The table above lists four types of experiments and the degree of suitability (High, Medium, or Low) for each experimental objective
Screening and Comparison experiments are suited for use in the DMAIC Analyze phase
46
Analysis and Interpretation of Factorial Experiments

Step 1. View the Data
Step 2. Create the Model
Step 3. Fit the Model
Step 4. Perform Residual Diagnostics
Step 5. Transformation Required? If Yes, return to Step 2; if No, continue
Step 6. Reduce Model? If Yes, return to Step 3; if No, continue
Step 7. Choose Model
Step 8. Interpret Chosen Model
Step 9. Stop Experimentation? If No, run RSM; if Yes, make Confirmation Runs
Tips for Designed Experiments
Plan Experiment (Use Engineering and Statistical Knowledge)
Objective
Selection of Responses & Input Variables (Operating Range, Levels, Interactions etc.)
Blocking
Replication
Don't forget Center Points!
Conduct Experiment
Randomization
Data collection and Comments
Statistical Analysis
Analyze Experiment
Sparsity of Effects
Statistical Model
Residual Diagnostics
Interpret Results
Check that results match engineering intuition
Confidence Interval on Predictions
Confirmation Tests
47
One Tip on How NOT to Design an Experiment
A Designed Experiment is NOT a Retrospective or Observational study
The variables and variable levels are deliberately manipulated in a random manner
A DOE cannot be retro-fitted to data collected retrospectively or through passive observation
48
References
Montgomery, D. C. (2009), Design and Analysis of Experiments, 7th edition, Wiley, New York.
Montgomery, D. C. (2009), Introduction to Statistical Quality Control, 6th edition, Wiley, New York.
Montgomery, D. C. and Runger, G. C. (2011), Applied Statistics and Probability for Engineers, 5th edition, Wiley, New York.
Upcoming
Analyze dataset is posted on Blackboard (both MINITAB and Excel)
Read two case studies posted on Blackboard
Goodman et al, Six Sigma Forum Magazine, November 2007, When Project Termination is the Beginning
Tong et al, Intl Journal AMT, January 2004, A DMAIC approach to printed circuit board quality improvement
How to contact me
E-mail: [email protected]
Cell: 602-463-5134
50
51
IEE 581 Six-Sigma Methodology DMAIC The Analyze Phase
Fall 2012 Class 7
Cheryl L. Jennings, PhD, MBB
1
Fisher 1 in 20
Why do we often use = 0.05 as significance level?
http://psychclassics.asu.edu/Fisher/Methods/chap3.htm, Statistical Methods for Research Workers By Ronald A. Fisher (1925), Chapter III, Distributions
we can find what fraction of the total population has a larger deviation; or, in other words, what is the probability that a value so distributed, chosen at random, shall exceed a given deviation. Tables I. and II. have been constructed to show the deviations corresponding to different values of this probability. The rapidity with which the probability falls off as the deviation increases is well shown in these tables. A deviation exceeding the standard deviation occurs about once in three trials. Twice the standard deviation is exceeded only about once in 22 trials, thrice the standard deviation only once in 370 trials, while Table II. shows that to exceed the standard deviation sixfold would need [p. 47] nearly a thousand million trials. The value for which P =.05, or 1 in 20, is 1.96 or nearly 2 ; it is convenient to take this point as a limit in judging whether a deviation is to be considered significant or not. Deviations exceeding twice the standard deviation are thus formally regarded as significant. Using this criterion, we should be led to follow up a negative result only once in 22 trials, even if the statistics are the only guide available. Small effects would still escape notice if the data were insufficiently numerous to bring them out, but no lowering of the standard of significance would meet this difficulty.
2
How robust is the t-test to the normality assumption?
One assumption for using the t-test for means is that the data is normally distributed
While the test is somewhat robust to this assumption, consider the test statistic calculation

    t0 = (X̄ − μ0) / (S / √n)

Two key things about this statistic
When sampling from the normal distribution, X̄ and S are independent
The denominator S/√n is proportional to the square root of a chi-square random variable, since (n − 1)S² / σ² ~ χ²_df with df = n − 1
Let's look at an example
Consider a cycle time problem, say the time it takes to process a loan from receipt of application to wiring of funds. Cycle times are often exponentially distributed.
Select a random sample of ten loans and test the hypothesis that the mean cycle time is 10 days
To study the impact of the cycle time distribution on the t-test statistic, randomly generate 50 samples of 10 loans each from an exponential distribution with a mean of 10 days
3
4
Recall that for an exponential distribution, μ = σ (the mean equals the standard deviation), so X̄ and S are correlated; clearly the independence assumption is violated
Histograms of the 50 samples show the skewness of cycle time
A histogram of the 50 t statistics is clearly skewed in comparison to a t distribution with 9 degrees of freedom
Using p-values based on the t distribution could lead to erroneous conclusions
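The simulation just described can be sketched in a few lines of Python (a minimal sketch using only the standard library; the random seed is arbitrary):

```python
import math
import random
import statistics

random.seed(42)
mu0 = 10.0      # hypothesized mean cycle time, in days
n, reps = 10, 50

t_stats = []
for _ in range(reps):
    # Exponential cycle times with mean 10 days (rate = 1/10)
    sample = [random.expovariate(1 / mu0) for _ in range(n)]
    xbar = statistics.mean(sample)
    s = statistics.stdev(sample)
    t_stats.append((xbar - mu0) / (s / math.sqrt(n)))

# Under normality these would follow a t distribution with 9 df,
# which is symmetric about 0; skewness here reveals the violation.
m = statistics.mean(t_stats)
skew = sum((t - m) ** 3 for t in t_stats) / (reps * statistics.stdev(t_stats) ** 3)
print(f"mean of t statistics = {m:.3f}, sample skewness = {skew:.3f}")
```

Plotting a histogram of `t_stats` against the t(9) density reproduces the skewness shown on the slide.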
What if the distribution is not normal?
Comparative methods discussed were based on assumption of random sample from a normal distribution
Most of these procedures are relatively insensitive to moderate departures from normality
Options for more severe departures from normality are
1. Transform the data toward normality, for example with a logarithm, square root, or reciprocal, and use a method based on the normal distribution
See Montgomery, DOE, "Selecting a Transformation: The Box-Cox Method"
2. Utilize a nonparametric or distribution-free approach
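Option 1 can be illustrated with a short Python sketch (the data here are synthetic lognormal values standing in for right-skewed cycle times; they are an assumption for illustration, not course data):

```python
import math
import random
import statistics

random.seed(1)
# Hypothetical right-skewed cycle-time data (lognormal by construction)
data = [math.exp(random.gauss(2.0, 0.5)) for _ in range(30)]

def skewness(x):
    """Simple sample skewness: average cubed z-score."""
    m, s = statistics.mean(x), statistics.stdev(x)
    return sum((v - m) ** 3 for v in x) / (len(x) * s ** 3)

# The log transform pulls the long right tail back toward symmetry,
# after which a normal-theory method (e.g., a t-test) is defensible
logged = [math.log(v) for v in data]
print(f"skewness raw = {skewness(data):.2f}, after log = {skewness(logged):.2f}")
```

In practice the Box-Cox procedure chooses the transformation power from the data rather than by guesswork.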
5
Non-Parametric Inference Methods
We will look at two types of tests, tests based on Signs and tests based on Ranks
Distribution-free, or no underlying parametric distribution assumption
However, each test does have other assumptions
Why not always use nonparametric methods?
In general, nonparametric procedures do not use all the information in a sample and are therefore less efficient, requiring larger sample sizes to achieve the same power as the appropriate parametric procedure
6
Comparison                          Type                            Analysis Tests
Single sample / paired two-sample   one-to-standard (fixed value)   Sign; Wilcoxon Signed-Rank
Two samples                         one-to-one                      Wilcoxon Rank Sum (also called the Mann-Whitney test)
Multiple samples                    multiple                        Kruskal-Wallis (uses ranks)
The Sign Test for One Sample
7
* From Montgomery, D. C. and Runger, G. C. (2011), Applied Statistics and Probability
for Engineers 5th edition, Section 9-9 Nonparametric Procedures, Wiley, New York
Sign Test Example
11
Calculating P-value in Minitab
12
P-value = 2 × Pr(R+ ≥ 14) = 2 × [1 − Pr(R+ ≤ 13)] = 2 × (1 − 0.942341) = 0.1153
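The same exact p-value can be computed without Minitab from the binomial distribution; under H0 each sign is positive with probability 1/2. The slide's cumulative probability 0.942341 corresponds to n = 20 observations with r+ = 14 positive signs (an inference from the numbers shown, since the data table itself is on the slide image):

```python
from math import comb

def binom_cdf(k, n, p=0.5):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

n, r_plus = 20, 14                            # signs: 14 of 20 positive
p_value = 2 * (1 - binom_cdf(r_plus - 1, n))  # two-sided: 2 * P(R+ >= 14)
print(f"P(R+ <= 13) = {binom_cdf(13, n):.6f}, p-value = {p_value:.4f}")
# -> P(R+ <= 13) = 0.942341, p-value = 0.1153
```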
13
Or, Minitab 1-Sample Sign Test
14
The Wilcoxon Signed-Rank Test for One Sample
21
* From Montgomery, D. C. and Runger, G. C. (2011), Applied Statistics and Probability
for Engineers 5th edition, Section 9-9 Nonparametric Procedures, Wiley, New York
Wilcoxon Signed-Rank Example
24
(The example data are shown in a table on the slide.)
Minitab 1-Sample Wilcoxon
25
Uses the maximum sum of ranks instead of the minimum
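The signed-rank computation itself is simple enough to sketch by hand in Python. This is a minimal sketch of the rank-sum bookkeeping only (the data are hypothetical, and the resulting min(W+, W−) would still be compared against tabled critical values):

```python
def wilcoxon_signed_rank(data, m0):
    """Return (w_plus, w_minus) rank sums for testing median = m0.
    Zero differences are dropped; tied |d| values get average ranks."""
    d = [x - m0 for x in data if x != m0]
    order = sorted(range(len(d)), key=lambda i: abs(d[i]))
    ranks = [0.0] * len(d)
    i = 0
    while i < len(order):
        j = i
        # walk over a run of tied absolute differences
        while j + 1 < len(order) and abs(d[order[j + 1]]) == abs(d[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1          # average rank for the tied run
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for r, di in zip(ranks, d) if di > 0)
    w_minus = sum(r for r, di in zip(ranks, d) if di < 0)
    return w_plus, w_minus

# Hypothetical sample, for illustration only; test H0: median = 10.0
sample = [9.2, 11.5, 8.7, 12.1, 10.4, 9.9, 13.0, 8.1]
wp, wm = wilcoxon_signed_rank(sample, 10.0)
print(wp, wm)   # min(wp, wm) is the tabled statistic; Minitab reports the max
```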
26
Comparison to the t-Test
27
Median Tests for Paired Samples
Both the sign test and the Wilcoxon signed-rank test can be applied to paired observations.
In the case of the sign test, the null hypothesis is that the median of the differences is equal to zero.
The Wilcoxon signed-rank test addresses the null hypothesis that the mean of the differences is equal to zero (when the differences are symmetric, the mean and median coincide).
The procedures are applied to the observed differences as described previously.
28
29
* From Montgomery, D. C. and Runger, G. C. (2009), Applied Statistics and Probability for Engineers 4th edition, Wiley, New York
30
EXAMPLE 15-3
An automotive engineer is investigating two different types of metering devices for an electronic fuel injection system to determine whether they differ in their fuel mileage performance. The system is installed on 12 different cars and a test is run with each metering device on each car. The observed fuel mileage performance data, corresponding differences, and their signs are shown in Table 15-2. We will use the sign test to determine whether the median fuel mileage performance is the same for both devices using α = 0.05.

1. Parameter of Interest: The parameters of interest are the median fuel mileage performance for the two metering devices.
2. Null Hypothesis: H0: Median1 = Median2, or equivalently, H0: MedianD = 0
3. Alternative Hypothesis: H1: Median1 ≠ Median2, or equivalently, H1: MedianD ≠ 0
4. Test Statistic: We will use Appendix Table VIII for the test, so the test statistic is r = min(r+, r−).
5. Reject H0 if: For α = 0.05, n = 12, two-sided test, Table VIII gives the critical value as r*0.05 = 2. We will reject H0 in favor of H1 if r ≤ 2.
6. Computations: Table 15-2 shows the differences and their signs: r+ = 8 and r− = 4. So r = min(8, 4) = 4.
7. Conclusion: Since r = 4 is not less than or equal to the critical value r*0.05 = 2, we cannot reject the null hypothesis that the two devices provide the same median fuel mileage performance.

* From Montgomery, D. C. and Runger, G. C. (2009), Applied Statistics and Probability for Engineers 4th edition, Wiley, New York
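The exact two-sided p-value for this paired sign test can be checked directly from the sign counts (r+ = 8, r− = 4, n = 12), with no need for the raw Table 15-2 data:

```python
from math import comb

# Sign test on paired differences: under H0, signs are Binomial(12, 0.5)
n = 12
r = 4                                  # r = min(r+, r-) = min(8, 4)
p_value = 2 * sum(comb(n, i) for i in range(r + 1)) / 2**n
print(f"p-value = {p_value:.4f}")      # well above 0.05: fail to reject H0
```

This agrees with the table-based conclusion: with a p-value near 0.39, there is no evidence of a difference between the metering devices.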
Minitab 1-Sample Sign with Paired Data
31
Minitab 1-Sample Wilcoxon with Paired Data
32
The Wilcoxon Rank-Sum Test for Two Samples
33
* From Montgomery, D. C. and Runger, G. C. (2011), Applied Statistics and Probability
for Engineers 5th edition, Section 9-9 Nonparametric Procedures, Wiley, New York
34
Wilcoxon Rank-Sum Example
37
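The rank-sum statistic for two independent samples can also be sketched with the standard library. This is a minimal illustration of the ranking step only, on hypothetical loan cycle-time data (the statistic would then be compared against tabled critical values, or the test run as Mann-Whitney in Minitab):

```python
def rank_sum_w(sample1, sample2):
    """Wilcoxon rank-sum statistic: sum of the ranks of sample1
    in the combined ordering (tied values get average ranks)."""
    combined = sorted((v, g) for g, s in ((1, sample1), (2, sample2)) for v in s)
    w1 = 0.0
    i = 0
    while i < len(combined):
        j = i
        # walk over a run of tied values
        while j + 1 < len(combined) and combined[j + 1][0] == combined[i][0]:
            j += 1
        avg = (i + j) / 2 + 1          # average rank for the tied run
        for k in range(i, j + 1):
            if combined[k][1] == 1:
                w1 += avg
        i = j + 1
    return w1

# Hypothetical cycle times (days) for two loan-processing teams
team_a = [8.2, 9.5, 7.7, 10.1, 8.9]
team_b = [11.3, 10.8, 12.0, 9.9, 13.4]
w = rank_sum_w(team_a, team_b)
print(w)   # small rank sums for team_a suggest shorter cycle times
```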