Transcript DFSS

DESIGN FOR SIX SIGMA & ROBUST DESIGN OF PRODUCTS AND PROCESSES FOR QUALITY

Gülser Köksal EM507 METU, Ankara 2009

OUTLINE

• DFSS
  – DMAIC vs. DFSS
  – Different DFSS roadmaps
  – DMADV roadmap
  – How to implement DFSS
• Robust design examples
• A review of design of experiments and orthogonal arrays
• Taguchi's robust design approach
  – Loss functions
  – Signal-to-noise ratios
• Robust design case study: Cookie recipe design

Slides 3-16 are selected from the following presentation:

DFSS – DMAIC

DFSS: To design new products or processes, or to improve the designs of existing ones, in order to satisfy customer requirements.

DMAIC (Six Sigma): To improve existing processes in order to satisfy customer requirements.

Process Management: To achieve business results by managing the processes efficiently.

DMAIC

• Define: Define the problem with its outputs and potential inputs.
• Measure: Analyze the existing process: Is the process measured correctly? If so, what is the capability of the process?
• Analyze: Identify the important factors that cause the variation of the process: Where and when do the defects occur?
• Improve: Optimize the output by optimizing the inputs: To reach a six sigma process, what should be the levels of each factor?
• Control: Which controls should be put in place to keep the process running at six sigma?

Improvement Strategies

[Decision flowchart: compare the customer requirements with the process capability. Is the gap small? If YES, use DMAIC (iterative improvement); if NO, use DFSS (fundamental redesign).]

Fundamental Redesign (DFSS)
• Design a new product / process
• Broad approach
• Blank-sheet-of-paper approach
• High risk
• Longer time span
• Addresses many CTQs
• Goal: quantum leap

Iterative Improvement (DMAIC)
• Fix an existing process
• Narrow focus
• Use the current process model
• Low risk
• Shorter time span
• Addresses few CTQs
• Goal: improvement

When to Go for DFSS

• Changing customer expectations: by the time the current problems are solved, new problems will have emerged
• Technology development: new technologies make it possible to meet all customer requirements at lower cost or to gain a competitive edge
• Next generation: the existing product's remaining lifetime is very short; a successor will be needed soon
• System limits: the performance gap is due to system / business model configurations that cannot be changed, or the available technology does not allow the CTQs to be met
• Process entirely broken: the existing process is unable to meet many CTQs, and too many successive DMAIC projects would be required

Influence on cost, cycle time and quality

Design determines roughly 70-80% of the influence on cost, cycle time and quality, while manufacturing and transaction activities account for the remaining 20-30%.

Different DFSS Methodologies

• Several roadmaps have been proposed
• They are very similar to each other
• The underlying tools are the same

DFSS Methodology: DMADV

• Define the project goals and customer requirements.
• Measure and determine customer needs and specifications; benchmark competitors and industry.
• Analyze the process options to meet the customer needs.
• Design (in detail) the process to meet the customer needs.
• Verify the design performance and its ability to meet customer needs.

DFSS Methodology: DCCDI

• Define the project goals.
• Customer analysis.
• Concept ideas are developed, reviewed and selected.
• Design is performed to meet the customer and business specifications.
• Implementation is completed to develop and commercialize the product/service.

DFSS Methodology: IDOV

• Identify the customer and the specifications (CTQs).
• Design translates the customer CTQs into functional requirements and into solution alternatives.
• Optimize uses advanced statistical tools and modeling to predict and optimize the design and its performance.
• Validate makes sure that the developed design will meet the customer CTQs.

DFSS Methodology: DMADV

• Define the project goals, customer requirements, and opportunities.
• Measure in detail customer needs and priorities, market conditions, and benchmark competitors.
• Analyze the data collected, prioritize CTQs, and determine the relations between CTQs and parts/processes.
• Develop the concept, innovative solutions, and optimal solutions for product and process design.
• Validate the solutions and implement.

DFSS Methodology: DMADV

Tools used across the Define, Measure, Analyze, Develop, and Validate phases: project management, QFD, benchmarking, value analysis, financial analysis, SIPOC, IPDS, FMEA, TRIZ, design scorecards, MSA, basic statistical techniques, DOE, optimization, simulation, robust design, tolerance design, reliability engineering, design for manufacture and assembly.

All methodologies are similar

The DMADV phases described above (Define, Measure, Analyze, Develop, Validate) correspond closely to the IDOV phases (Identify, Design, Optimize, Verify).

How is it implemented?

• 2 weeks of DFSS training
• Six Sigma BB or GB knowledge is required for participation
• 2 project groups and 1 project per group
• Between and after the training weeks, extensive MBB coaching (2-3 days per project-month)

Training tracks: Six Sigma Black Belt (BB Weeks 1-4), Design for Six Sigma (DFSS Weeks 1-2), Six Sigma Green Belt (GB Weeks 1-2).

• Alternatively, a combined Six Sigma / DFSS Black Belt training program, 5 weeks of training in total
• Black Belts work on the design project
• Team members may participate in a common project

Robust Design Problem

To make system outputs insensitive to variation in inputs, process factors, and environmental factors.

[Block diagram: inputs / control factors X1-X6 and uncontrollable noise factors W1-W3 act on the product or process, which produces the outputs Y1-Y3.]

Robust Product Design Example

Making the system output robust to environmental and usage conditions.

[Diagram: controllable inputs (sugar, flour, egg, milk, oil, baking powder) and uncontrollable noise factors (oven type, altitude above sea level) enter the cake-making process; the outputs, taste and texture, are the customer requirements.]

A robust cake mix recipe reduces variability in taste and texture.

Robust Product Design Example

Making the system output robust to component variability by utilizing the second-degree (nonlinear) relationship between the system output and its inputs: What should the pendulum length be to minimize variation in the period?

[Plot: pendulum period as a function of pendulum length.]
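A minimal sketch of this idea, assuming the small-angle pendulum formula T = 2π·sqrt(L/g) and purely hypothetical length tolerances and nominal lengths: the same length variation transmits less period variation where the curve is flatter.

```python
import numpy as np

def period(length_m, g=9.81):
    """Small-angle period of a simple pendulum: T = 2*pi*sqrt(L/g)."""
    return 2 * np.pi * np.sqrt(length_m / g)

rng = np.random.default_rng(0)
length_sd = 0.005  # hypothetical manufacturing variation in length (metres)

for nominal in (0.25, 1.0):  # two hypothetical candidate nominal lengths
    lengths = rng.normal(nominal, length_sd, size=100_000)
    periods = period(lengths)
    print(f"L = {nominal:.2f} m: mean T = {periods.mean():.4f} s, sd T = {periods.std(ddof=1):.5f} s")

# The longer pendulum transmits less of the length variation into the period,
# because the slope dT/dL = pi / sqrt(g*L) decreases as L grows.
```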

Robust Process Design Example

Making the system robust to process variability: What should the amount of steam blown and the amount of water sprayed into the closed system be in order to generate a humidity level of 20%?

[Figure: 10%, 20% and 30% humidity levels for alternative settings S1 and S2.]

Guidelines for Robust Design through Statistical Experimentation

1. Choose control factors and their levels
2. Identify uncontrollable (noise) factors and decide how they will be simulated
3. Select the response variable(s) and determine the performance measures (mean, standard deviation, SNR, etc.)
4. Set up the experimental layout (choose appropriate design array(s))
5. Conduct the experiments and collect the data
6. Analyze the data (effects, ANOVA, regression)
7. Choose the optimal control factor levels and predict the performance measure at these levels
8. Confirm the optimal levels by experimentation

Orthogonal Arrays

Array   No. of   Max. no.    Max. no. of factors at these levels:
        rows     of factors    2      3      4      5
L4        4        3           3      -      -      -
L8        8        7           7      -      -      -
L9        9        4           -      4      -      -
L12      12       11          11      -      -      -
L16      16       15          15      -      -      -
L16'     16        5           -      -      5      -
L18      18        8           1      7      -      -
L25      25        6           -      -      -      6
L27      27       13           -     13      -      -
L32      32       31          31      -      -      -
L32'     32       10           1      -      9      -
L36      36       23          11     12      -      -
L36'     36       16           3     13      -      -
L50      50       12           1      -      -     11
L54      54       26           1     25      -      -
L64      64       63          63      -      -      -
L64'     64       21           -      -     21      -
L81      81       40           -     40      -      -

(A primed name, e.g. L16', denotes the alternative array with the same number of rows but a different column structure.)
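As a side illustration, the sketch below writes out the L4 array and checks the balance property that makes it orthogonal: every pair of columns contains each level combination equally often.

```python
import numpy as np
from itertools import combinations

# The L4 (2^3) orthogonal array: 4 runs, up to 3 two-level factors.
L4 = np.array([
    [1, 1, 1],
    [1, 2, 2],
    [2, 1, 2],
    [2, 2, 1],
])

# Balance check: for every pair of columns, each of the 2 x 2 = 4
# level combinations appears the same number of times.
for i, j in combinations(range(L4.shape[1]), 2):
    combos, counts = np.unique(L4[:, (i, j)], axis=0, return_counts=True)
    assert len(combos) == 4 and np.all(counts == counts[0])
    print(f"columns {i + 1} and {j + 1}: each level pair occurs {counts[0]} time(s)")
```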

Orthogonal Array Construction Example

One factor with 2 levels and 6 factors with 3 levels. (The L18 array fits this case: it provides one 2-level column and seven 3-level columns, so one 3-level column is left unassigned.)

Quality (Consumer) Loss

The quality of a product is measured by estimating "the total loss to the customers due to variation in the product's functions." For ideal quality, the loss is zero; the higher the loss, the lower the quality.

[Figure: quadratic quality loss and average quality loss over the quality characteristic X, with LSL, the target µ = T, and USL marked.]

Quality loss for a unit with characteristic value x:

L(x) = b (x - T)²

Average (expected) quality loss:

L̄ = b ( σ² + (µ - T)² )    (Taguchi, 1989)

where b is the loss coefficient, T is the target value, µ is the process mean, and σ² is the process variance.
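A small numerical sketch of the loss function, using the common convention of setting the loss coefficient from the cost A0 incurred at the specification limit, b = A0 / Δ0²; the target, tolerance and cost figures below are hypothetical.

```python
# Hypothetical example: target T = 10.0 mm, tolerance delta0 = 0.5 mm,
# and a cost of 4 (currency units) when the characteristic hits the limit.
T, delta0, A0 = 10.0, 0.5, 4.0
b = A0 / delta0**2                     # loss coefficient

def quality_loss(x):
    """Quadratic quality loss for a single unit: L(x) = b (x - T)^2."""
    return b * (x - T) ** 2

# Average loss for a process with mean mu and standard deviation sigma:
mu, sigma = 10.1, 0.2                  # hypothetical process performance
average_loss = b * (sigma**2 + (mu - T) ** 2)
print(f"b = {b:.1f}, L(10.3) = {quality_loss(10.3):.2f}, average loss = {average_loss:.2f}")
```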

Smaller-the-Better Response

[Figure: loss function L(y) for a smaller-the-better response.]

Loss function:

L(y) = (A / Δ²) y²

Average loss:

L̄ = (A / Δ²) ( ȳ² + s² )

Signal-to-noise ratio:

SNR = -10 log( (1/n) Σ yᵢ² ) = -10 log( ȳ² + s² )    (sum over i = 1, ..., n)

Examples: gas, energy, etc. consumption; noise; radiation
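A minimal sketch of the smaller-the-better SNR with hypothetical data, computed both from the raw observations and from the mean/standard-deviation form above:

```python
import numpy as np

def snr_smaller_the_better(y):
    """SNR = -10 log10( (1/n) * sum(y_i^2) ); larger SNR = smaller, more consistent response."""
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(y**2))

y = np.array([3.1, 2.8, 3.4, 2.9])     # hypothetical measurements (e.g. noise level)
print(snr_smaller_the_better(y))
# Equivalent mean/std form (population std, ddof=0, matches the 1/n average):
print(-10 * np.log10(y.mean()**2 + y.std(ddof=0)**2))
```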

Larger-the-Better Response

[Figure: loss function L(y) for a larger-the-better response.]

Loss function:

L(y) = A Δ² (1 / y²)

Average loss:

L̄ = A Δ² (1 / ȳ²) ( 1 + 3 s² / ȳ² )

Signal-to-noise ratio:

SNR = -10 log( (1/n) Σ 1/yᵢ² ) = -10 log( (1 / ȳ²)(1 + 3 s² / ȳ²) )    (sum over i = 1, ..., n)

Examples: mechanical power, strength, wear-out resistance
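A corresponding sketch for the larger-the-better SNR, again with hypothetical data:

```python
import numpy as np

def snr_larger_the_better(y):
    """SNR = -10 log10( (1/n) * sum(1/y_i^2) ); larger SNR = larger, more consistent response."""
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(1.0 / y**2))

y = np.array([42.0, 38.5, 45.2, 40.1])  # hypothetical strength measurements
print(snr_larger_the_better(y))
# Approximate mean/std form shown above:
m, s = y.mean(), y.std(ddof=0)
print(-10 * np.log10((1.0 / m**2) * (1 + 3 * s**2 / m**2)))
```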

Nominal-the-Best Response

[Figure: loss function L(y) for a nominal-the-best response, minimized at the target T.]

Loss function:

L(y) = (A / Δ²) (y - T)²

Average loss:

L̄ = (A / Δ²) ( s² + (ȳ - T)² )

Examples: dimension (mm), strength, voltage (V)

Signal-to-noise ratios, corresponding to the two-step optimization:

1. Minimize the variance:   SNR = -10 log s²
2. Bring the mean to the target:   SNR = 10 log( ȳ² / s² )
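A sketch for the nominal-the-best case (hypothetical data); the SNR measures consistency relative to the mean, and a separate adjustment factor is then used to move the mean onto the target:

```python
import numpy as np

def snr_nominal_the_best(y):
    """SNR = 10 log10( ybar^2 / s^2 ): high when variation is small relative to the mean."""
    y = np.asarray(y, dtype=float)
    return 10 * np.log10(y.mean()**2 / y.var(ddof=1))

y = np.array([9.9, 10.2, 10.1, 9.8])    # hypothetical dimension measurements, target T = 10 mm
print(snr_nominal_the_best(y))           # step 1: choose levels that maximize this
# Step 2: use a factor that shifts only the mean to bring ybar onto the target T.
```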

A Robust Design Experiment Layout

Each row i = 1, ..., m of the inner (control factor) array specifies one combination of control factor levels. That row is run under all n noise-factor combinations of the outer array, producing observations y_i1, y_i2, ..., y_in, from which the row's performance measures ȳ_i, s_i² and SNR_i are computed.

[Table: m inner-array rows (control factor combinations) crossed with n outer-array columns (noise factor combinations), with a column of performance measures per row.]
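A sketch of this crossed (inner × outer array) layout with hypothetical responses, computing the row-wise performance measures; the SNR shown is the larger-the-better form defined earlier.

```python
import numpy as np

# Hypothetical data: 4 inner-array runs (control-factor combinations, rows)
# crossed with 4 outer-array noise conditions (columns).
responses = np.array([
    [12.0, 14.5, 11.2, 13.8],
    [22.1, 25.4, 20.8, 23.9],
    [15.3, 13.1, 16.0, 14.2],
    [28.7, 26.5, 29.9, 27.4],
])

def snr_larger_the_better(y):
    return -10 * np.log10(np.mean(1.0 / np.asarray(y, dtype=float)**2))

for i, row in enumerate(responses, start=1):
    print(f"run {i}: ybar = {row.mean():6.2f}, s^2 = {row.var(ddof=1):6.2f}, "
          f"SNR = {snr_larger_the_better(row):6.2f} dB")
```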

Cookie Recipe Robust Design (a larger-the-better robust design problem)

Objective: To find the control factor levels that maximize cookie chewiness under the uncontrollable effects of the noise factors.

Control factors and levels:
• A: Cooking temperature (low, high)
• B: Syrup content (low, high)
• C: Cooking time (short, long)
• D: Cooking pan (solid, mesh)
• E: Shortening type (corn, coconut)

Noise factors and levels:
• Z1: Cookie position (side, middle)
• Z2: Temperature at test (low, high)

(Source: W.J. Kolarik, 1995, Creating Quality, McGraw-Hill)

The experimental design layout, and data collected

Chewiness measurements y1, y2, y3, y4 are collected for each run under the four noise-factor combinations:

Z1: side, side, middle, middle
Z2: low, high, low, high

For each run, the performance measures are computed from the four observations:

ȳ = (y1 + y2 + y3 + y4) / 4

s = sqrt( (1/3) Σ (yᵢ - ȳ)² ), and ln s    (sum over i = 1, ..., 4)

SNR = -10 log( (1/4) Σ 1/yᵢ² )    (larger-the-better)

Response Tables

Taguchi Analysis: y1; y2; y3; y4 versus A; B; C; D; E

The following terms cannot be estimated, and were removed:
A*B, A*C, A*E, B*C, B*D, B*E, C*D, C*E, D*E

Response Table for Signal to Noise Ratios (larger is better)

Level        A       B       C       D       E
1        22.18   12.69   17.40   16.03   16.74
2        15.70   25.19   20.48   21.85   21.13
Delta     6.49   12.50    3.08    5.82    4.39
Rank         2       1       5       3       4

Response Table for Means

Level         A        B        C        D        E
1        20.250    8.750   12.750   13.750   16.250
2        14.000   25.500   21.500   20.500   18.000
Delta     6.250   16.750    8.750    6.750    1.750
Rank          4        1        2        3        5

Response Table for Standard Deviations

Level        A       B       C       D       E
1        4.774   5.437   5.756   6.132   6.665
2        7.781   7.117   6.798   6.422   5.889
Delta    3.007   1.680   1.042   0.289   0.776
Rank         1       2       3       5       4
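A sketch of how such a response table can be computed (level averages of the SNR per factor, the range "Delta", and its rank); the design matrix and run SNRs below are hypothetical placeholders, since only the summarized output appears above.

```python
import pandas as pd

# Hypothetical 8-run layout: levels (1/2) of factors A-E and the SNR of each run.
df = pd.DataFrame({
    "A":   [1, 1, 1, 1, 2, 2, 2, 2],
    "B":   [1, 1, 2, 2, 1, 1, 2, 2],
    "C":   [1, 2, 1, 2, 1, 2, 1, 2],
    "D":   [1, 2, 1, 2, 2, 1, 2, 1],
    "E":   [1, 2, 2, 1, 1, 2, 2, 1],
    "SNR": [14.1, 21.0, 27.3, 26.3, 9.0, 12.5, 20.1, 21.1],  # hypothetical
})

# Level averages of SNR per factor, plus Delta (range) and Rank.
table = pd.DataFrame({f: df.groupby(f)["SNR"].mean() for f in "ABCDE"})
table.loc["Delta"] = (table.loc[2] - table.loc[1]).abs()
table.loc["Rank"] = table.loc["Delta"].rank(ascending=False)
print(table.round(2))
```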

Marginal Average (Main Effect) Plots

[Main effects plot (data means) for SN ratios: one panel per factor A-E, showing the mean SNR at levels 1 and 2. Signal-to-noise: larger is better.]

Variables A, B, D and E have significant effects on the SNR. C does not seem to be significant, but let us check this with ANOVA as well.

Interaction Plots

[Interaction plot (data means) for SN ratios: mean SNR for the combinations of A and D levels. Signal-to-noise: larger is better.]

Only the A*D interaction could be estimated, and it seems to be insignificant.

ANOVA for SNR

Analysis of Variance for SN ratios

Source           DF   Seq SS    Adj SS    Adj MS       F      P
A                 1    84.126    84.126    84.126   28.18  0.034
B                 1   312.689   312.689   312.689  104.74  0.009
C                 1    18.981    18.981    18.981    6.36  0.128
D                 1    67.643    67.643    67.643   22.66  0.041
E                 1    38.553    38.553    38.553   12.91  0.069
Residual Error    2     5.970     5.970     2.985
Total             7   527.962

In Minitab: choose Stat > DOE > Taguchi > Analyze Taguchi Design; in the Analysis options, select 'fit linear model for Signal to Noise ratios'.
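A sketch of the same kind of analysis outside Minitab, assuming the statsmodels package is available: a main-effects linear model is fitted to hypothetical run SNRs and a sequential ANOVA table is printed.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical 8-run design and run SNRs (only summary tables appear in the transcript).
df = pd.DataFrame({
    "A":   [1, 1, 1, 1, 2, 2, 2, 2],
    "B":   [1, 1, 2, 2, 1, 1, 2, 2],
    "C":   [1, 2, 1, 2, 1, 2, 1, 2],
    "D":   [1, 2, 1, 2, 2, 1, 2, 1],
    "E":   [1, 2, 2, 1, 1, 2, 2, 1],
    "SNR": [14.1, 21.0, 27.3, 26.3, 9.0, 12.5, 20.1, 21.1],
})

# Main effects only: with 8 runs and 5 two-level factors, 2 degrees of freedom remain for error.
model = ols("SNR ~ C(A) + C(B) + C(C) + C(D) + C(E)", data=df).fit()
print(sm.stats.anova_lm(model, typ=1))   # sequential sums of squares, as in the Minitab output
```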

Predict Results at the Optimal Levels

In Minitab: choose Stat > DOE > Taguchi > Predict Taguchi Results.

Predict Results at the Optimal Levels

Taguchi Analysis: y1; y2; y3; y4 versus A; B; C; D; E

Predicted values:

S/N Ratio    Mean     StDev     Log(StDev)
35.0761      37.25    5.89143   1.83763

Factor levels for predictions:
A  B  C  D  E
1  2  2  2  2

Conduct confirmation experiments at these levels!

The predicted SNR at the chosen levels comes from the additive model:

SNR_pred = T + (A1 - T) + (B2 - T) + (C2 - T) + (D2 - T) + (E2 - T)

where T is the overall mean SNR and A1, B2, C2, D2, E2 denote the average SNRs at the selected levels of factors A-E (taken from the response table).
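A quick check of this additive prediction, taking the level averages directly from the SNR response table shown earlier (the result matches Minitab's 35.0761 up to rounding):

```python
# Average SNRs at the chosen optimal levels, copied from the response table above.
level_means = {"A1": 22.18, "B2": 25.19, "C2": 20.48, "D2": 21.85, "E2": 21.13}
overall_mean = (22.18 + 15.70) / 2        # T: mean of the two A-level averages

predicted_snr = overall_mean + sum(m - overall_mean for m in level_means.values())
print(round(predicted_snr, 2))            # about 35.07
```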