Six Sigma Versus DFSS


Looking into the Future of Design for Six Sigma (DFSS)
• Description of past deployments
• Comparison and observations
• Suggestions for the future
© Statistical Design Institute, LLC. All Rights Reserved.
Jesse Peplinski
January 16, 2012
Six Sigma Versus DFSS
[Flowchart: Define the Design Problem → Capture the Voice of the Customer → Identify Critical Requirements → Select Approach, which branches into "Improve Existing Design" via Six Sigma (MAIC: Measure the Requirements → Analyze the Root Causes → Improve the Design → Control the Root Causes) or "Improve or Create New Design" via DFSS (Create Design Concept → Build Math Models → Optimize the Design → Validate the Design).]
• Use Six Sigma (MAIC) as a data-driven method for design improvements
• Use DFSS as a rigorous method for creating a design to satisfy multiple requirements
• You can use a flexible approach to let each design problem dictate which process is followed
What is a “Deployment”?
• A company-specific attempt to inject Six Sigma
and/or DFSS into its culture and daily activities
• Typically a customized mixture of:
– Training classes with tailored content
– Structure for projects and “belt” certification
– Supporting software tools
– Strategic communication by management and leadership
• Scope of implementation can vary widely
– All employees vs. targeted teams
– Local vs. global
Past DFSS Deployments
• Automotive 1: Global deployment; mandatory training for all engineers; projects and certifications. Status: low level of activity.
• Automotive 2: Local deployment; training and tools for selected experts based on role or skills. Status: continued success.
• Defense 1: Emphasis on black belts and projects; DFSS as an afterthought to Six Sigma. Status: low level of activity.
• Defense 2: Leadership evolved a design process intertwined with DFSS tools. Status: continuing activity.
Past DFSS Deployments
• Electronics 1: Global deployment, mandatory training; projects and certifications. Status: low level of activity.
• Electronics 2: Local deployment for product teams; DFSS tools folded into an internal process excellence program. Status: steady continuing activity.
• Healthcare 1: Global deployment with projects and certification; significant backlash and years of inactivity. Status: quiet resurgence through design reviews.
• Healthcare 2: DFSS integrated into development process; emphasis on providing DFSS tools. Status: continued activity.
Observations
• Pendulum swing
– Larger, top-down deployments often end up with lower
levels of long-term practice.
• Backlash against projects and certification
– Long-term health of deployment correlated with selective,
low-key implementation
• Challenge of demonstrating DFSS savings
– Heroes get visibility for fixing mistakes; cost avoidance is
difficult to recognize.
• Tools stand the test of time
– Six Sigma: Gage R&R, SOPs, DOE, process control
– DFSS: QFD, Pugh Matrix, Monte Carlo, Optimization
Suggestions for the Future
• Design for Six Sigma:
– DFSS tools fit naturally within a systems engineering
group. (If you don’t have a systems engineering group,
consider starting one.)
– In addition, DFSS tools should be leveraged by your key
participants in design reviews. (Principals, architects, etc.)
– DFSS success hinges on modeling and simulation
capability. Be prepared for resistance.
• Six Sigma:
– Let DMAIC flow naturally from leadership asking questions
and demanding answers with data
• Let plans for training and employee reward be
driven by the forces above. (Not vice-versa.)
How does DFSS fit within Systems Engineering?
[Diagram: SE/DFSS enablers and tools mapped onto the product development process]
Product development process phases: Exploration, Conceptual Design, Detail Design, Design Verification, Initial Production, Final Production.
SE & DFSS process steps spanning these phases: Identify Critical Requirements, Create Design Concept, Build Models, Optimize the Design (allocate variability, analyze variability, optimize variability), Validate the Design, arriving at the design that best meets all requirements.
Best-practice SE/DFSS enablers & tools: Voice of the Customer, Quality Function Deployment, TRIZ & Design Selection, Failure Modes & Effects Analysis, Physics and First Principles, DOE and Regression, Statistical Allocation and Monte Carlo, Sensitivity Analysis, Cost and Reliability Analysis, Multi-Objective Optimization, FMEA & Fault Tree Analysis, Test Effectiveness Analysis, SE/DFSS Process Scorecards.
• First – use the Tools to support the Process
Modeling and Analysis within DFSS
Require that this be done everywhere, and if it isn't, explain why not!
The approach moves from understanding requirements, specifications, and capabilities, to applying models and analyses, to predicting the Probability of Non-Compliance.
[Diagram: design parameters A, B, C, D, and E feed a product model (equation, simulation, workbook, hardware, etc.) that predicts the response Y; the distribution of Y is compared against the lower limit (LL), target (T), and upper limit (UL), and the portions of the distribution below LL or above UL are noncompliant.]
Non-Compliance refers to any condition that results in Defects or Off-Spec conditions.
The fundamental metric is the Probability of Non-Compliance (PNC).
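To make the PNC calculation concrete, here is a minimal Monte Carlo sketch in Python. The product model (Y = A + 2B), the parameter distributions, and the spec limits LL and UL are illustrative assumptions, not values from the presentation.

```python
# Minimal Monte Carlo sketch of a Probability of Non-Compliance (PNC) estimate.
# The product model, parameter distributions, and spec limits are assumptions
# chosen only to illustrate the calculation.
import numpy as np

rng = np.random.default_rng(seed=1)
n = 100_000

# Hypothetical design parameters (X's) with assumed normal variation
a = rng.normal(loc=10.0, scale=0.2, size=n)
b = rng.normal(loc=5.0, scale=0.1, size=n)

# Hypothetical product model predicting the critical response Y
y = a + 2.0 * b

# Assumed lower and upper spec limits (LL, UL) for Y
ll, ul = 19.2, 20.8

# PNC = fraction of the simulated Y distribution outside the spec limits
pnc = np.mean((y < ll) | (y > ul))
print(f"Estimated PNC: {pnc:.4f}")
```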
Modeling: Easier than It May Appear
[Decision flowchart: starting from the key design parameters (X's) and the critical requirements (Y's), gather design parameter information and identify existing models, then work through the questions below to arrive at a fast, accurate math model and the best design alternative(s).]
• Can equations be developed? If yes, use them as the math model.
• If not, does a simulation of sufficient accuracy exist, and does it compute very quickly? If yes, use the simulation.
• If not, does historical data exist? If yes, perform regression analysis.
• If not, do prototypes exist? If yes, perform a design of experiments.
• Otherwise, create new models.
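Where the flowchart lands on the regression branch, a fitted equation can serve as the fast math model. Below is a minimal sketch in Python; the file historical_builds.csv and the column names x1, x2, x3, and y are hypothetical placeholders for whatever historical records actually exist.

```python
# Minimal sketch of the "historical data exists -> perform regression analysis"
# branch: fit a fast surrogate equation for Y from previously recorded X's.
# The CSV file and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm

data = pd.read_csv("historical_builds.csv")      # hypothetical historical records
X = sm.add_constant(data[["x1", "x2", "x3"]])    # hypothetical design parameters
y = data["y"]                                    # measured critical requirement

# Ordinary least squares fit; review R^2 and p-values before trusting the model
model = sm.OLS(y, X).fit()
print(model.summary())

# The fitted equation can now stand in for slow simulations or hardware tests
y_pred = model.predict(X)
```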
Six Sigma Examples
It starts with hard problems:
• What can we do to improve our process yield?
• How can we reduce operating temperatures and fix our thermal issues?
• How can we increase the throughput of our call center?
• What can we do to increase sales volume?
Our goal is to get solid answers:
• Switching from supplier A to supplier B will improve yields by 8%.
• This power supply redesign will reduce operating temperatures by 11 °C.
• A $50 rebate would increase sales by 15%.
• Adding two more operators will increase throughput by 100 calls per day.
How do we bridge the gap with high levels of confidence based on solid evidence?
Guiding Questions
Answer these questions to bridge the gap:
1. What is our current state?
– Product or process performance in
measurable terms (Y’s)
If we can’t measure it, we
don’t know where we are.
2. What is our desired state?
– How much improvement is needed
in our measurable Y’s?
If we can’t measure it, we can
never know if we get there.
3. How good are our measurement systems?
– If we measure the same thing twice, do we get the same answer?
– If we made a process improvement, could we detect it? (See the sketch at the end of this slide.)
4. What data do we need to collect?
– Responses (Y’s) and Parameters (potential X’s)
– How much data? Time period? Shifts?
– Existing data? Or new data collection effort?
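For question 3, the sketch below shows one quick check of measurement capability short of a full Gage R&R study: compare the gage's repeatability to the size of the improvement you hope to detect. The measurement values and the 0.05 target shift are hypothetical.

```python
# Minimal sketch for guiding question 3: is the measurement system good enough
# to detect the improvement we care about? All values are hypothetical.
import numpy as np

# Each row: repeated measurements of the same part with the same gage
repeats = np.array([
    [10.02, 10.05, 10.01],
    [ 9.98, 10.00,  9.97],
    [10.10, 10.08, 10.11],
])

# Repeatability: pooled within-part standard deviation
within_var = repeats.var(axis=1, ddof=1).mean()
repeatability = np.sqrt(within_var)

target_shift = 0.05   # assumed size of the improvement we want to detect
print(f"Repeatability sigma: {repeatability:.3f}")
print(f"Shift we want to detect: {target_shift}")
# If repeatability is comparable to the target shift, improve the measurement
# system (e.g., run a formal Gage R&R study) before collecting project data.
```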
Guiding Questions (Continued)
5. If the Y is plotted versus the X’s, is there evidence of
correlation (patterns) for some of the X’s? Which ones?
– May begin to indicate the significant drivers for improvement
6. Is there statistical evidence that the Y changes when some
X’s change? Which ones?
– Type of analysis used (t-Test, F-Test, ANOVA, etc.)
– Confidence level (see the sketch below)
7. What changes in the X’s are needed to achieve the desired
state?
Implement Six Sigma as a process for
answering these questions.
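For question 6, the sketch below shows one such analysis: a two-sample t-test checking whether yield differs between two suppliers. The yield data are hypothetical; in practice the choice of test, confidence level, and sample size comes from the project.

```python
# Minimal sketch for guiding question 6: is there statistical evidence that Y
# (yield) changes when an X (supplier) changes? Data values are hypothetical.
import numpy as np
from scipy import stats

yield_supplier_a = np.array([92.1, 93.4, 91.8, 92.7, 93.0, 92.5])
yield_supplier_b = np.array([95.3, 94.8, 96.1, 95.0, 95.7, 94.9])

# Welch's two-sample t-test (does not assume equal variances)
t_stat, p_value = stats.ttest_ind(yield_supplier_a, yield_supplier_b,
                                  equal_var=False)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# At 95% confidence, p < 0.05 is evidence that the supplier change shifts yield;
# the difference in means estimates how large the improvement is.
print(f"Estimated yield change: "
      f"{yield_supplier_b.mean() - yield_supplier_a.mean():.1f} points")
```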
Thank you…
Questions?
Contact: [email protected]