Six Sigma Overview - ASQ Baltimore 0502

Tunnell Consulting
Discover the Value
• Quality
• Capability
• Capacity
• Professionalism

Eight Essential Tools for Process Improvement
Baltimore ASQ Section
January 20, 2005

Tunnell Consulting, Inc. / 900 East Eighth Avenue / Suite 106 / King of Prussia, PA 19406 / 610.337.0820
II-1
We Have Learned Much About Improvement Tools in the Last 25 Years
• Statistical Process Control
• Total Quality Management
• ISO 9000
• Reengineering
• Six Sigma
“Those who remain ignorant of history are doomed to repeat it”
George Santayana
II-2
Let’s Learn from Our Experiences
• Let’s reflect on our experiences to identify ways we
can make more effective use of improvement tools
in the future
• The DMAIC improvement approach
- Define, Measure, Analyze, Improve, Control
takes improvement models to a new level
- Tools integrated into improvement framework
- Tools linked and sequenced
- Includes Define and Control phases
II-3
DMAIC Process Improvement Model
DEFINE → MEASURE → ANALYZE → IMPROVE → CONTROL
Black Belts and Green Belts are trained to follow the DMAIC process improvement model.
DMAIC adds predictability, discipline and repeatability to improvement projects.
II-4
Elements of DMAIC
• Define – Define the problem
• Measure – Measure the gap, listen to the process
• Analyze – Analyze data to determine root causes
• Improve – Improve the process, implement the solution
• Control – Control the process, standardize, and document – sustain the gains
II-5
DMAIC Project Roadmap
Define Phase
• Define Problem
• Calculate Financial Impact ($$)
• Charter Project
• Assign Champion, BB/GB and Team
• Management Approval
Measurement Phase
• Map Process and Identify Inputs and Outputs
• Cause and Effects Matrix
• Establish Measurement System Capability
• Establish Process Capability Baseline
• Evaluate Control Plan*
Analysis Phase
• Perform FMEA
• Perform Multi-Vari Analysis
• Identify Potential Critical Inputs
Improvement Phase
• Verify Cause-Effect Relationships
• Identify and Test Improvements
Control Phase
• Implement Control Plan*
• Verify Long-Term Capability
• Transfer Project to Process Owner
*The Control Plan is updated in the Analyze and Improve Phases as appropriate
II-6
Eight Key Improvement Tools
• Measure – Process Maps, Cause and Effect Matrix, Gage R&R, Capability Analysis
• Analyze – Failure Modes & Effects Analysis, Multi-Vari Studies
• Improve – Design of Experiments
• Control – Control Plans and SPC
II-7
DMAIC Tool Linkage
(Diagram: the DMAIC tools link to one another around the process and its customers – Process Map, C&E Matrix, MSA, Process Capability, Multi-Vari, FMEA, DOE, Control Plan and SPC.)
II-8
DMAIC Tools
• Knowledge-Based Tools:
  – Process Map
  – Cause and Effect Matrix
  – Failure Modes and Effects Analysis
  – Control Plan
• Data-Based Tools:
  – Measurement System Analysis
  – Process Capability Analysis
  – Multi-Vari Studies
  – Design of Experiments
II-9
Some Key Six Sigma Tools
PROCESS MAP - A schematic of a process showing process inputs, steps and outputs.
CAUSE AND EFFECT MATRIX - A prioritization matrix that enables you to select those process input variables (Xs) that have the greatest effect on the process output variables.
MEASUREMENT SYSTEM ANALYSIS - Study of the measurement system, typically using Gage R&R studies to quantify measurement repeatability and reproducibility.
CAPABILITY STUDY - Analysis of process variation versus process specifications to assess the ability of the process to meet the specs.
FAILURE MODE & EFFECTS ANALYSIS - Analytical approach for identifying process problems by prioritizing failure modes and their causes.
MULTI-VARI STUDY - A study that samples the process as it operates and, by statistical and graphical analysis, identifies the important controlled and uncontrolled (noise) variables.
DESIGN OF EXPERIMENTS - A method of experimentation that identifies, with minimum testing, how key process input variables affect the output of the process.
CONTROL PLAN - A document that summarizes the results of a Six Sigma project and aids the operator in controlling the process.
II-10
Some Key Learnings
• Focus on identifying the critical few variables
• Get team input
• Assess and improve measurement systems
• Assess process capability
• Improve multi-vari studies
• Usefulness of screening experiments
• Value of Control Plans
II-11
Building Models
Identifying the Critical Few Variables
II-12
Manufacturing Process Example
Controlled Variables:
• Temperature
• Pressure
• Flow rate
• Catalyst concentration
Process Inputs:
• Raw materials
• Water
• Energy
Process Outputs:
• Yield
• Waste
• Capacity
• Downtime
• Production rate
Uncontrolled Variables:
• Ambient conditions - temperature and humidity
• Shift
• Team
• Operators
• Machines
• Raw material lot
II-13
Customer Order Process Example
Controlled Variables:
• Customer service representative training
• Inventory level
• Shipment method
• Promise date
Process Inputs:
• Email, fax, phone and postal service
• Completeness of customer orders
• Accuracy of customer orders
Process Outputs:
• Order correctness
• Delivery time
• Package quality
Uncontrolled Variables:
• Customer service representative attitude
• Day of week
• Season of year
• Customer required date
• Shift
• Team
II-14
Finding the Critical Few Variables
(Diagram: process inputs, controlled variables and uncontrolled variables feed into the Measure, Analyze, Improve and Control phases, which narrow them down to the critical few variables.)
Process Control and Optimization Enhanced
II-15
DMAIC Improvement Strategy
• Identify those process variables (Xs) that are affecting the process output (Y):
  • Process inputs
  • Controlled variables
  • Uncontrolled variables
• Model: Y = f(X1, X2, …, Xp)
• There are typically 3-6 critical Xs that have the majority of the effect on Y
• Knowing the critical few Xs enables us to control and optimize the process
II-16
Building Models Helps Us
Find the Critical Few Xs
• Qualitative Models:
• Key variables and direction of effects (positive, negative)
are known
• Quantitative Models:
• Key variables and direction of effects are known
• Magnitudes of effects are known
• Model functional form is known
• Can make predictions
II-17
Example – Types of Models
• Qualitative Model:
• Yield is a function of:
• Temperature (positive effect)
• Pressure (negative effect)
• Quantitative Model:
• Yield = 50 + 10(Temp) – 3(Pressure)
We can calculate (predict) yield given
Temperature and Pressure
II-18
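To make the quantitative model concrete, here is a minimal Python sketch that simply evaluates the slide's example equation, Yield = 50 + 10(Temp) - 3(Pressure), at a few operating settings; the settings themselves are hypothetical and chosen only for illustration.

```python
# Minimal sketch: evaluate the illustrative quantitative model
# Yield = 50 + 10*(Temp) - 3*(Pressure) at a few hypothetical settings.

def predicted_yield(temp: float, pressure: float) -> float:
    """Return the yield predicted by the example model Y = 50 + 10*Temp - 3*Pressure."""
    return 50 + 10 * temp - 3 * pressure

if __name__ == "__main__":
    # Hypothetical operating settings, for illustration only.
    for temp, pressure in [(5, 2), (6, 2), (5, 4)]:
        y = predicted_yield(temp, pressure)
        print(f"Temp={temp}, Pressure={pressure} -> predicted yield {y:.1f}")
```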
Some Useful Models
Process Output (Y) - Process Variables (Xs)
• Weight Gain - Exercise and caloric intake
• Auto Accidents - Speed, weather, alcohol, driver age and experience
• Auto Gasoline Mileage - Auto weight, transmission type, air conditioner
• Human Health - Smoking, exercise, rest, fat intake
• House Operating Costs - Insulation, desired inside temp, outside temp, electricity usage
• House Market Value - Location, size, baths, lot size
II-19
Y=f(X) – An Example
What Variables Affect Auto Accidents?
Y = Auto accidents; Xs = Causes:
X1 = Driver experience
X2 = Drinking
X3 = Distractions like cell phones
X4 = Road conditions
X5 = Weather
X6 = Driver conditions – alertness – related to X3
X7 = Auto conditions – poor brakes
II-20
Auto Accidents
Controlling Critical Variables
• X1 = Driver experience - Younger drivers pay higher car insurance premiums
• X2 = Drinking - Legal limit on blood alcohol level = 0.08
• X3 = Distractions - Some areas have outlawed using cell phones while driving
• X5 = Weather - Lights must be on when windshield wipers are on
• X7 = Auto conditions - Periodic auto inspections required by law
II-21
DMAIC Tools Help Us Identify the Critical Xs
Six Sigma Tool - Critical Xs Identified
• Baseline Data Analysis - Special cause variation sources
• C&E Matrix - High priority process variables
• Measurement System Analysis - Measurement problems
• FMEA - How the process can fail
• Multi-Vari Studies - Key noise and process variables
• DOE - Cause and effect relationships
II-22
Get Team Input
• Knowledge-Based Tools:
  – Process Map
  – Cause and Effect Matrix
  – Failure Modes and Effects Analysis
  – Control Plan
• Knowledge-based tools capture “tribal knowledge” - what the tribe knows about the process
• It is essential that a team be involved in the use of these tools; it is rare that a single person knows the process well enough to use these tools
II-23
Six Sigma Tools
What Can Go Wrong
II-24
Measure Phase
Process Map – What Can Go Wrong?
• Too many steps, too complex
• Team not involved in its creation
• Doesn’t reflect the process currently in use
• Cursory analysis of important variables:
  • Variables missing or poorly defined
  • Differences between controlled and uncontrolled (noise) variables not understood
• Control Plan not updated
II-25
Measure Phase
C&E Matrix – What Can Go Wrong?
• Team not involved in its creation
• Too many variables evaluated – two-phased approach not used (see the scoring sketch after this slide):
  • Phase 1 – Rank the process steps
  • Phase 2 – Rank the process variables for the highest-ranking process steps
• All process outputs are given equal priority
• Actions not taken on top-rated variables when appropriate:
  • Quick fixes, interim control procedures, etc.
II-26
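As a rough illustration of how a C&E Matrix prioritizes inputs, the sketch below weights each input-to-output relationship rating by the output's customer importance and sums across outputs; the highest-scoring inputs are carried forward. The outputs and inputs are borrowed from the customer order example earlier in the deck, but the importance weights and ratings are entirely hypothetical.

```python
# Minimal sketch of C&E Matrix scoring with made-up ratings.
# Each process output gets a customer importance weight (e.g., 1-10);
# each input is rated for its relationship to each output (e.g., 0, 1, 3, 9).
# Score(input) = sum over outputs of (importance * relationship rating).

output_importance = {"Order correctness": 10, "Delivery time": 8, "Package quality": 5}

# Relationship ratings: input -> {output: rating}  (hypothetical values)
ratings = {
    "CSR training":    {"Order correctness": 9, "Delivery time": 3, "Package quality": 1},
    "Inventory level": {"Order correctness": 1, "Delivery time": 9, "Package quality": 0},
    "Shipment method": {"Order correctness": 0, "Delivery time": 9, "Package quality": 3},
    "Promise date":    {"Order correctness": 1, "Delivery time": 3, "Package quality": 0},
}

scores = {
    x: sum(output_importance[y] * r for y, r in rel.items())
    for x, rel in ratings.items()
}

# Rank inputs by weighted score, highest first.
for x, s in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{x:16s} {s}")
```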
Measure Phase
MSA – What Can Go Wrong?
Measurement System Analysis (MSA)
• Unimportant measures studied
• Full range of process variation not studied
• Sample size too small:
• Items, tests, analysts, instruments
• MSA not done
• Indicated actions not taken
• Control Plan not updated
II-27
Measurement Systems Analysis
• Design of measurement systems is required to develop
effective products and processes
• Should be done as part of the product and process
development process
• Gage R&R studies are used to determine the measurement
capabilities for key process variables (Xs) and process
outputs (Ys)
• Indices for measurement repeatability and reproducibility
are an output of the Gage R&R studies.
II-28
Sources of Variation
• Total Variation = Process or Product Variation + Measurement System Variation
• Measurement System Variation includes Accuracy and Precision
• Precision comprises Repeatability (Gage) and Reproducibility (Operators)
(A numeric sketch of this decomposition follows.)
II-29
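A minimal numeric sketch of the decomposition above, assuming the variance components are already known (in practice they would come from a Gage R&R study); the figures used here are hypothetical, and %GR&R is reported on the standard-deviation scale.

```python
import math

# Minimal sketch: combine known (hypothetical) variance components.
# In practice these would come from a Gage R&R study.
var_process         = 4.00   # part-to-part (product) variance
var_repeatability   = 0.25   # gage (within operator/instrument) variance
var_reproducibility = 0.09   # operator-to-operator variance

var_measurement = var_repeatability + var_reproducibility
var_total = var_process + var_measurement

pct_grr = 100 * math.sqrt(var_measurement / var_total)  # %GR&R on the std-dev scale
print(f"Measurement std dev : {math.sqrt(var_measurement):.3f}")
print(f"Total std dev       : {math.sqrt(var_total):.3f}")
print(f"%GR&R               : {pct_grr:.1f}%  (lower is better)")
```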
Measurement Method Ruggedness
• A measurement procedure is said to be “rugged” if it is immune to modest (and inevitable) departures from the conditions specified in the method (Youden 1961)
• Ruggedness (robustness) can be evaluated using two-level fractional factorial designs, including Plackett-Burman designs (see the sketch below)
II-30
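As one concrete example of such a design, the sketch below constructs an 8-run, seven-factor two-level fractional factorial (a saturated 2^(7-4) design with generators D = AB, E = AC, F = BC, G = ABC). This is a generic illustration, not the 20-run design used in the viscosity study that follows.

```python
from itertools import product

# Minimal sketch: build a saturated 8-run 2^(7-4) screening design.
# Factors A, B, C form a full 2^3 factorial; D, E, F, G are generated
# from interaction columns (D=AB, E=AC, F=BC, G=ABC), so 7 factors fit in 8 runs.

runs = []
for a, b, c in product((-1, 1), repeat=3):
    d, e, f, g = a * b, a * c, b * c, a * b * c
    runs.append((a, b, c, d, e, f, g))

header = ("A", "B", "C", "D=AB", "E=AC", "F=BC", "G=ABC")
print("  ".join(f"{h:>5s}" for h in header))
for row in runs:
    print("  ".join(f"{v:5d}" for v in row))
```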
Viscosity Measurement System Improvement
• This study was initiated because of the perceived
large variation in viscosity measurements
produced by an analytical laboratory.
• This was a concern because viscosity was a key
quality characteristic of a high-volume product.
• It was decided to conduct a ruggedness test of
the measurement process to determine which
test method variables, if any, were influencing the
viscosity measurement.
II-31
Viscosity Measurement System Improvement
Variable                     Low Level (-)    High Level (+)
X1 = Sample Preparation      M1               M2
X2 = Moisture Measurement    Volume           Weight
X3 = Mixing Speed (rpm)      800              1600
X4 = Mixing Time (hrs)       0.5              3
X5 = Healing Time (hrs)      1                2
X6 = Spindle                 S1               S2
X7 = Protective Lid          Absent           Present
II-32
Viscosity Measurement System Improvement
Factor levels are coded 1 and 2; runs listed with two test-sequence numbers and two viscosity values were run twice.
Run  Test Sequence  Sample Prep  Moisture Method  Mix Speed  Mix Time  Heal Time  Spindle  Safety Lid  Viscosity (Y)
1    5              1            1                1          2         2          2        1           2220
2    4              2            1                1          1         1          2        2           2460
3    18             1            2                1          1         2          1        2           2904
4    19, 20         2            2                1          2         1          1        1           2364, 2348
5    7              1            1                2          2         1          1        2           3216
6    11             2            1                2          1         2          1        1           3772
7    12             1            2                2          1         1          2        1           2420
8    6, 13          2            2                2          2         2          2        2           2340, 2380
9    9              2            2                2          1         1          1        2           3376
10   3              1            2                2          2         2          1        1           3196
11   2              2            1                2          2         1          2        1           2380
12   1, 16          1            1                2          1         2          2        2           2800, 2700
13   14             2            2                1          1         2          2        1           2320
14   10             1            2                1          2         1          2        2           2080
15   15             2            1                1          2         2          1        2           2548
16   8, 17          1            1                1          1         1          1        1           2796, 2788
17   Exp 2          1            2                1          2         1          1        1           2384
18   Exp 2          1            2                2          2         1          1        1           2976
19   Exp 2          1            2                1          2         1          2        1           2180
20   Exp 2          1            2                2          2         1          2        1           2300
II-33
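For reference, the "(High – Low)" effect of a two-level factor is simply the average response at its high level minus the average response at its low level. The sketch below shows that calculation on a small made-up data set using the same 1/2 level coding as the table above; it is not a re-analysis of the viscosity data.

```python
# Minimal sketch: estimate main effects (high - low) from a two-level design.
# Levels are coded 1 (low) and 2 (high); the data are hypothetical.

runs = [
    # (mix_speed, spindle, viscosity) - illustrative rows only
    (1, 1, 2800), (2, 1, 3300), (1, 2, 2300), (2, 2, 2400),
    (1, 1, 2750), (2, 1, 3350), (1, 2, 2250), (2, 2, 2450),
]

def main_effect(data, col):
    """Average response at level 2 minus average response at level 1."""
    hi = [y for *x, y in data if x[col] == 2]
    lo = [y for *x, y in data if x[col] == 1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

print("Mix speed effect:", main_effect(runs, 0))
print("Spindle effect  :", main_effect(runs, 1))
```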
Viscosity Measurement System Improvement
(Pareto chart of the standardized effects; response is Viscosity, alpha = 0.05, reference line at 2.78. Factors: A = Sample Prep, B = Moisture Measurement, C = Mix Speed, D = Mix Time, E = Heal Time, F = Spindle, G = Lid, H = Dummy. The largest standardized effects, in descending order, are F, C, D, the aliased interaction AD (= 14+36+57), and B.)
II-34
Viscosity Measurement System Improvement
Effects (High – Low) of Variables
(Table comparing the estimated effects of X1 = Sample Prep Method, X2 = Water Measurement, X3 = Mixing Speed, X4 = Mixing Time, X5 = Healing Time, X6 = Spindle, X7 = Protective Lid, X8 = Dummy, and the aliased interaction strings 12+37+56, 13+27+46, 14+36+57, 15+26+47, 16+34+25, 17+23+45, 24+35+67 and 14+57 across three analyses: Runs 1-16 and two analyses of Runs 1-20. Measures of model fit: Residual Std Dev = 39, 77, 84; Adjusted R-Square = 99.2, 96.9, 96.1; Coefficient of Variation % = 1.4, 2.9, 3.1. Consistent with the Pareto chart, the largest effects are associated with Spindle, Mixing Speed, Mixing Time and the 14+36+57 interaction.)
II-35
Viscosity Measurement System Improvement
(Normal probability plot of the standardized effects; response is Viscosity, alpha = 0.05, with significant effects flagged. Factors: A = Sample Prep, B = Moisture Measurement, C = Mix Speed, D = Mix Time, E = Heal Time, F = Spindle, G = Lid, H = Dummy; AD is the aliased interaction 14+36+57.)
II-36
Viscosity Measurement System Improvement
Mix Speed – Spindle Interaction
(Interaction plots of fitted mean viscosity versus Mix Speed for each Spindle: the initial experiment, the prediction, and the confirmatory experiment.)
II-37
Measure Phase
Capability Studies – What Can Go Wrong?
• Customer specifications not correct / realistic
• Inadequate time frame studied
• MSA needed and not done prior to capability study
• Poor quality data
• Sample size too small
• Control Plan not updated
II-38
How Large A Sample Do I Need?
• It depends on the process, objectives, etc.
• General rule:
– Minimum of 30 samples (1 month) for short-term
capability studies
– Minimum of 90 samples (3 months) for long-term capability studies
• Sampling until the total range of process variation
has been observed is a better general rule
• Capability indices are highly variable when
estimated from small samples
II-39
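To see how this uncertainty behaves, here is a sketch using one commonly used normal-approximation confidence interval for Cpk/Ppk, with half-width z·sqrt(1/(9n) + Ppk²/(2(n-1))). This formula is an assumption for illustration, not necessarily the method behind the charts on the next three slides, but it shows the same narrowing of the interval as the sample size grows.

```python
import math

def ppk_confidence_interval(ppk: float, n: int, z: float = 1.96):
    """Approximate 95% CI for a capability index estimated from n observations.

    Uses the common normal-approximation half-width
    z * sqrt(1/(9n) + ppk^2 / (2(n-1))).
    """
    half_width = z * math.sqrt(1.0 / (9 * n) + ppk ** 2 / (2 * (n - 1)))
    return ppk - half_width, ppk + half_width

for n in (30, 60, 120):
    lo, hi = ppk_confidence_interval(1.0, n)
    print(f"n={n:3d}: approximate 95% CI for Ppk=1.0 is ({lo:.2f}, {hi:.2f})")
```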
95% Confidence Interval for Ppk = 1.0
(Chart: lower limit, mean and upper limit of the 95% confidence interval for Ppk = 1.0 at sample sizes of 30, 60 and 120; the interval narrows as the sample size grows.)
II-40
95% Confidence Interval for Ppk = 1.33
(Chart: lower limit, mean and upper limit of the 95% confidence interval for Ppk = 1.33 at sample sizes of 30, 60 and 120.)
II-41
95% Confidence Interval for Ppk = 1.67
(Chart: lower limit, mean and upper limit of the 95% confidence interval for Ppk = 1.67 at sample sizes of 30, 60 and 120.)
II-42
Issues with Capability Indices
• Assumes specifications adequately reflect customer needs
• Attempts to reduce process performance to one number
• Assumes data are normally distributed and in control
• High degree of uncertainty when the sample size is low
• May include structural variation in the capability calculation and inflate the standard deviation
Cp = Voice of the Customer / Voice of the Process = (USL - LSL) / (6 × Std Dev)
(A calculation sketch follows.)
II-43
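A minimal sketch of the calculation behind these indices, using the overall sample standard deviation (i.e., Pp/Ppk rather than a short-term Cp/Cpk estimate); the data and specification limits are hypothetical.

```python
import statistics

# Hypothetical data and specification limits, for illustration only.
data = [9.8, 10.1, 9.9, 10.3, 10.0, 9.7, 10.2, 10.1, 9.9, 10.0]
lsl, usl = 9.0, 11.0

mean = statistics.mean(data)
sd = statistics.stdev(data)          # overall (long-term) sample std dev

pp  = (usl - lsl) / (6 * sd)                      # voice of customer / voice of process
ppk = min(usl - mean, mean - lsl) / (3 * sd)      # accounts for off-center processes

print(f"mean={mean:.2f}, sd={sd:.3f}, Pp={pp:.2f}, Ppk={ppk:.2f}")
```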
Analyze Phase
FMEA – What Can Go Wrong?
Failure Modes and Effects Analysis:
• Team not involved in its creation
• Doesn’t reflect the current state of the process
• Action not taken on identified opportunities
• Poor quality work – improper time allotment
• Control Plan not updated
II-44
Analyze Phase
Multi-Vari Studies – What Can Go Wrong?
• Variation in the output (Y) variable too small:
• Due to study time frame being too small
• Key process variables ignored:
• Controlled and noise
• The fact that shift differences reflect differences due to time of day and operating team is overlooked
• Sample size too small
• Small variation in the X’s
• Correlated X’s
• Sloppy data collection
• Missing data
• Y measurement error not known
• Incomplete analysis of results
• Control Plan not updated
II-45
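As a rough sketch of the kind of breakdown a multi-vari study provides, the code below (entirely hypothetical data) compares between-shift variation with average within-shift variation for one output; a dominant between-shift component would point toward shift-related Xs such as time of day or operating team.

```python
import statistics

# Hypothetical multi-vari data: output measurements grouped by shift.
shifts = {
    "A": [101, 103, 99, 102, 100],
    "B": [108, 107, 109, 106, 110],
    "C": [100, 102, 101, 99, 103],
}

shift_means = {s: statistics.mean(v) for s, v in shifts.items()}
grand_mean = statistics.mean([y for v in shifts.values() for y in v])

# Simple (unweighted) comparison of the two sources of variation.
between_shift_sd = statistics.stdev(shift_means.values())
within_shift_sd = statistics.mean(statistics.stdev(v) for v in shifts.values())

print("Shift means:", {s: round(m, 1) for s, m in shift_means.items()})
print(f"Grand mean: {grand_mean:.1f}")
print(f"Between-shift SD (of shift means): {between_shift_sd:.2f}")
print(f"Average within-shift SD          : {within_shift_sd:.2f}")
```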
Regression Analysis and the
Normal Distribution Assumption
• Regression analysis assumes normality – but normality of what?
• The normality assumption generally applies to the residuals of the model, not to the raw data
• For example, in the equation
  y = b0 + b1x1 + b2x2 + e,
  it is the error term, e, that is assumed to be normally distributed, not y, the raw response.
• This fact is well-known in statistical circles, but not
necessarily in Six Sigma circles, where many BBs have
been taught to test y for normality
II-46
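A minimal sketch of the point above: fit a regression by least squares and check normality of the residuals rather than of the raw y values. The data are simulated, and the Shapiro-Wilk test from scipy is used here purely for illustration.

```python
import numpy as np
from scipy import stats

# Illustrative data only: y generated from two x's plus normal noise.
rng = np.random.default_rng(0)
n = 60
x1 = rng.uniform(0, 10, n)
x2 = rng.uniform(0, 5, n)
y = 5 + 2.0 * x1 - 1.5 * x2 + rng.normal(0, 1.0, n)

# Fit y = b0 + b1*x1 + b2*x2 by least squares.
X = np.column_stack([np.ones(n), x1, x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ coef

# It is the residuals, not the raw y values, that we check for normality.
w_res, p_res = stats.shapiro(residuals)
w_y, p_y = stats.shapiro(y)
print(f"Shapiro-Wilk p-value, residuals: {p_res:.3f}   raw y: {p_y:.3f}")
```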
Improve Phase
Design of Experiments – What Can Go Wrong?
• Started before understanding the process
• DOE proposal plan not circulated for approval
• Sloppy work: planning, execution and analysis
• Poor measurements
• Design too big or too small
• Wrong ranges for the Xs:
  • Be safe, practical, bold but not reckless
• Incomplete analysis of results
• Confirmation tests not done
• Control Plan not updated
II-47
Screening Experiments in Six Sigma
• The Process Map, C&E Matrix, FMEA and Multi-Vari Studies produce a list of candidate Xs
• If the list is already down to 2-5 Xs, proceed to DOE (factorial or response surface designs)
• If not, run a DOE screening experiment first to reduce the list of candidate Xs
• Take action on the results: control the Xs and select operating ranges
II-48
Control Phase
Control Plan – What Can Go Wrong?
• Started too late:
• Should be started immediately after the process map
is created
• Evaluated and updated as appropriate at the end of
each phase of DMAIC
• Control plan not understood by process operators
• Control plan not used
• Control plan out of date
• Sloppy work:
• Missing control actions, reaction plans, etc.
II-49
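For the SPC piece of a control plan, here is a small sketch of individuals-chart control limits, computed as the mean plus or minus 2.66 times the average moving range (2.66 = 3/d2 with d2 = 1.128 for a moving range of two); the data are hypothetical.

```python
# Minimal sketch: control limits for an individuals (I) chart.
data = [74.1, 73.8, 74.4, 74.0, 73.9, 74.3, 74.2, 73.7, 74.1, 74.0]  # hypothetical

moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)
center = sum(data) / len(data)

ucl = center + 2.66 * mr_bar
lcl = center - 2.66 * mr_bar
print(f"Center line: {center:.2f}   UCL: {ucl:.2f}   LCL: {lcl:.2f}")

# Flag any points outside the limits (one simple out-of-control rule).
for i, x in enumerate(data, start=1):
    if x > ucl or x < lcl:
        print(f"Point {i} ({x}) is outside the control limits")
```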
Summary
• We have learned much over the past 25 years
regarding improvement tools and frameworks
• DMAIC takes improvement to a higher level
- Tools integrated into improvement framework
- Tools linked and sequenced
- Includes Define and Control phases
• Better understanding of the tools will result in
attaining better project results more quickly
II-50
Six Sigma Project Failure Modes
Poor Project Selection and Management:
• Projects not tied to financial results
• Poorly defined project scope, metrics and goals
• Many projects lasting more than 6 months
• Wrong people assigned to projects
• Large project teams
• Infrequent team meetings
II-51
Six Sigma Project Failure Modes
Poor Project Support:
• Black Belts have little time to work on projects
• Technical support from MBB not available
• Poor or infrequent management reviews
• Poor support from Finance, IT, HR, Maintenance and the QC Lab
• Focus is on training, not improvement
• Poor communication of initiative and progress
• Lack of appropriate recognition and reward
II-52
References
Statistical Thinking and Six Sigma
Hoerl, R. W. and R. D. Snee (2002) Statistical Thinking: Improving Business Performance, Duxbury Press, Pacific Grove, CA.
Snee, R. D. and R. W. Hoerl (2003) Leading Six Sigma – A Step-by-Step Guide Based on Experience with GE and Other Six Sigma Companies, Financial Times Prentice Hall, New York, NY.
Snee, R. D. and R. W. Hoerl (2005) Six Sigma Beyond the Factory Floor – Deployment Strategies for Financial Services, Health Care and the Rest of the Real Economy, Financial Times Prentice Hall, New York, NY.
II-53
For Further Information, Please Contact:
Ronald D. Snee, PhD
Principal, Process and Organizational Excellence
Tunnell Consulting, Inc.
900 East Eighth Avenue, Suite 106
King of Prussia, PA 19406
(610) 213-5595
[email protected]
or visit our website at: www.tunnellconsulting.com
II-54