Analysis and presentation of quality indicators
Dr David Harrison
Senior Statistician, ICNARC
www.icnarc.org
Analysis and presentation of QIs
• Principles of statistical process control
• Comparison among providers
• Continuous monitoring over time
Analysis and presentation of QIs
• Principles of statistical process control
– Common cause variation
– Special cause variation
– Control limits
• Comparison among providers
• Continuous monitoring over time
Principles of statistical process control
• Common cause variation
– Variation cannot be eliminated
– Some variation is inherent to any process
– This is termed “common cause variation”
– To reduce common cause variation we
need to change the process
Five signatures…
They are not identical…
…but they are all my signature
We could rank them from 1 to 5…
…but this doesn’t make much sense!
We could reject some as low quality…
…but they are still my signature!
This is common cause variation
Principles of statistical process control
• Special cause variation
– Some variation is the result of external
factors acting on a process
– This is termed “special cause variation”
– To reduce special cause variation we
need to identify the source and
eliminate it
Now we have a sixth signature…
…it’s a good try, but I think you can
tell which one is the forgery!
This is special cause variation
Control limits
• Statistical process control is all about
making allowance for common cause
variation to detect special cause
variation
• To do this we place control limits
around a process
• Control limits represent the acceptable
range of common cause variation
Control limits
• Typically control limits of 2 and 3 SDs
represent “alert” and “alarm”
• If a system is in control:
– 95.4% of values within 2 SDs
– 99.7% of values within 3 SDs
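As a rough illustration (assuming a simple proportion-based indicator and a normal approximation to the binomial; names and figures are illustrative), 2 SD and 3 SD control limits could be computed like this:

```python
import math

def control_limits(expected_rate, n):
    """Approximate 'alert' (2 SD) and 'alarm' (3 SD) control limits for an
    observed proportion, assuming a normal approximation to the binomial
    with expected proportion `expected_rate` and sample size `n`."""
    sd = math.sqrt(expected_rate * (1 - expected_rate) / n)
    return {
        "alert": (expected_rate - 2 * sd, expected_rate + 2 * sd),  # ~95.4% of in-control values
        "alarm": (expected_rate - 3 * sd, expected_rate + 3 * sd),  # ~99.7% of in-control values
    }

# Example: an expected mortality of 20% in a unit admitting 250 patients
print(control_limits(0.20, 250))
```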
Analysis and presentation of QIs
• Principles of statistical process control
• Comparison among providers
– League tables
– Caterpillar plots
– Funnel plots
– Over-dispersion
• Continuous monitoring over time
Comparison among providers
• I’ll assume we have a binary event
(e.g. death) and an associated risk
estimate (e.g. predicted risk of death)
• Most common QI is:
observed events / expected events
• (for mortality this is the standardised
mortality ratio)
• How should we compare this QI among
providers (e.g. critical care units)?
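A minimal sketch of the calculation (function and variable names are illustrative): observed deaths divided by expected deaths, where the expected count is the sum of the model’s predicted risks.

```python
def smr(outcomes, predicted_risks):
    """Standardised mortality ratio: observed deaths divided by expected
    deaths, where expected deaths is the sum of each patient's predicted
    risk of death from the risk model."""
    observed = sum(outcomes)            # outcomes coded 0 = survived, 1 = died
    expected = sum(predicted_risks)     # model-predicted risks of death
    return observed / expected

# Example: 3 observed deaths against 2.4 expected gives an SMR of 1.25
print(smr([1, 0, 1, 0, 1, 0], [0.6, 0.2, 0.5, 0.3, 0.7, 0.1]))
```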
League tables
• Journalists love them
– High impact
– Everyone wants to know who is first
and last
• Statisticians hate them
– Overemphasise unimportant differences
– Even if there is no true difference,
someone will be first and someone last
– No account of role of chance
(common cause variation)
Marshall & Spiegelhalter, BMJ 1998
• League table of 52 IVF clinics ranked
on live birth rate
• Monte Carlo simulation used to put 95% CIs on the ranks
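One simple way to obtain simulation-based intervals for ranks is sketched below. This is a simplified stand-in for the paper’s Monte Carlo approach, not a reproduction of it; the data and function names are illustrative.

```python
import random

def rank_intervals(events, totals, n_sims=2000, seed=1):
    """Simulation-based 95% intervals for provider ranks: resample each
    provider's event count from a binomial at its observed rate, re-rank
    the providers in every simulation, and take the 2.5th and 97.5th
    percentiles of each provider's simulated rank."""
    random.seed(seed)
    rates = [e / n for e, n in zip(events, totals)]
    sim_ranks = [[] for _ in rates]
    for _ in range(n_sims):
        sim_rates = [sum(random.random() < p for _ in range(n)) / n
                     for p, n in zip(rates, totals)]
        order = sorted(range(len(sim_rates)), key=lambda i: sim_rates[i])
        for rank, i in enumerate(order, start=1):
            sim_ranks[i].append(rank)
    intervals = []
    for ranks in sim_ranks:
        ranks.sort()
        intervals.append((ranks[int(0.025 * n_sims)], ranks[int(0.975 * n_sims) - 1]))
    return intervals

# Example: three providers with similar rates have wide, overlapping rank intervals
print(rank_intervals(events=[20, 22, 25], totals=[100, 100, 100]))
```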
Marshall & Spiegelhalter, BMJ 1998
[Figure: ranks of the 52 IVF clinics with 95% CIs]
Marshall & Spiegelhalter, BMJ 1998
• King’s College Hospital – sixth from
bottom – is the only one that can
reliably be placed in the bottom 25%
Marshall & Spiegelhalter, BMJ 1998
• BMI Chiltern Hospital – seventh from
bottom – may not even be in the
bottom 50%
Marshall & Spiegelhalter, BMJ 1998
• Five clinics can confidently be placed
in the top quarter
Marshall & Spiegelhalter, BMJ 1998
• Southmead General – ranked sixth from
top – may not be in the top 50%
Caterpillar plots (or forest plots)
• Plot of QIs with CIs in rank order
• Still a league table really
• But at least acknowledges variation by
including CIs
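As a sketch of the data behind such a plot (unit names and figures are made up), each provider’s SMR and an approximate 95% CI can be computed and sorted into rank order; the CI here uses a normal approximation on the log scale, assuming Poisson-distributed observed deaths.

```python
import math

def caterpillar_data(units):
    """SMR with an approximate 95% CI for each unit, sorted into rank order.
    `units` maps a unit name to (observed deaths, expected deaths); the CI
    assumes Poisson-distributed observed deaths (at least one per unit)."""
    rows = []
    for name, (obs, exp) in units.items():
        smr = obs / exp
        se_log = 1 / math.sqrt(obs)    # SE of log(SMR), Poisson approximation
        rows.append((name, smr, smr * math.exp(-1.96 * se_log), smr * math.exp(1.96 * se_log)))
    return sorted(rows, key=lambda r: r[1])    # rank order by point estimate

# Example with made-up data: (name, SMR, lower, upper) in rank order
print(caterpillar_data({"Unit A": (30, 25.0), "Unit B": (18, 24.0), "Unit C": (40, 39.5)}))
```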
Caterpillar plot – IVF clinics
Caterpillar plot – ANZICS
• SMRs by APACHE III-J for 106 adult ICUs
in Australia and New Zealand, 2004
(Cook et al. Crit Care Resusc 2008)
Funnel plots
• Larger sample = greater precision
• If you plot QI against sample size, you
expect to see a funnel shape
• We can plot funnel shaped control
limits
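A minimal sketch of funnel-shaped limits for the SMR, assuming observed deaths are Poisson with mean equal to the expected deaths and using a normal approximation (the same approximation that, as noted later in the slides, can give limits below zero for small units):

```python
import math

def funnel_limits(expected_deaths, z=2.0):
    """Approximate funnel-plot control limits for the SMR at a given number
    of expected deaths, assuming observed deaths are Poisson with mean equal
    to the expected deaths (normal approximation)."""
    se = math.sqrt(expected_deaths) / expected_deaths    # SE of the SMR around 1
    return 1 - z * se, 1 + z * se

# Example: the 2 SD funnel is much wider for a unit expecting 10 deaths
# than for one expecting 200
print(funnel_limits(10))     # roughly (0.37, 1.63)
print(funnel_limits(200))    # roughly (0.86, 1.14)
```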
Funnel plot – ANZICS
• SMRs by APACHE III-J for 106 adult ICUs
in Australia and New Zealand, 2004
(Cook et al. Crit Care Resusc 2008)
Funnel plot – ANZICS
• Note: use of normal distribution can result in
negative confidence intervals – better methods
exist
Funnel plot – ANZICS
• Note: as SMR is a ratio measure, we would
advocate plotting on a log scale (i.e. SMR=2
and SMR=0.5 are equidistant from SMR=1)
Funnel plot – SICSAG
• SMRs by APACHE II for 25 adult ICUs in
Scotland, 2009
(SICSAG Audit of critical care in Scotland 2010)
Funnel plot – SICSAG
• Note: as the model is poorly calibrated, most
units are “better than average” – the funnel
has been centred on the average SMR not 1
Over-dispersion
• More variability than would be expected by chance
• Suggests important factors that vary
among providers are not being taken
into account
• Too many providers classified as
“abnormal” (i.e. outside the funnel)
Over-dispersion – hospital readmissions
(Spiegelhalter. Qual Saf Health Care 2005)
Over-dispersion – what to do…?
• Don’t use the indicator?
• Improve risk adjustment
• Adjust for it
– Estimate an “over-dispersion factor” by
“Winsorisation” (sketched below)
• Use random effects models
– Assumes each provider has their own
true rate from a distribution
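A sketch of the Winsorised over-dispersion factor (after Spiegelhalter, Qual Saf Health Care 2005), assuming a z-score has already been calculated for each provider; the implementation details below are illustrative rather than the exact published algorithm.

```python
def overdispersion_factor(z_scores, winsorise=0.10):
    """Estimate an over-dispersion factor phi by Winsorisation: pull the most
    extreme `winsorise` fraction of provider z-scores at each end in to the
    corresponding percentile value, then average the squared (Winsorised)
    z-scores. Values well above 1 suggest more between-provider variation
    than chance alone; funnel limits can be widened by sqrt(phi)."""
    z = sorted(z_scores)
    n = len(z)
    cut = int(winsorise * n)
    lo, hi = z[cut], z[n - 1 - cut]
    winsorised = [min(max(v, lo), hi) for v in z]
    return sum(v * v for v in winsorised) / n

# Example with made-up provider z-scores
print(overdispersion_factor([-2.5, -1.0, -0.4, 0.1, 0.3, 0.8, 1.2, 1.9, 2.8, 3.5]))
```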
Example – over-dispersion factor
[Funnel plot: SMR (0.25–2.00, log scale) against number of admissions (0–1500)]
• SMRs by ICNARC model for 171 adult
ICUs in England, Wales & N Ireland, 2009
Example – over-dispersion factors
[Funnel plot: SMR (0.25–2.00, log scale) against number of admissions (0–1500), with the funnel widened for over-dispersion]
Note: over-dispersion factor of 1.4 based on 10% Winsorisation
• Over-dispersion factor estimated at 1.4
• Funnel widened
Analysis and presentation of QIs
• Principles of statistical process control
• Comparison among providers
• Continuous monitoring over time
– RAP chart
– EWMA
– VLAD
– R-SPRT
– CUSUM
Continuous monitoring over time
• Various approaches
• In general, they consist of…
– an indicator that is updated for each
consecutive patient
– control limits
Example for continuous monitoring
• Queen Kate Hospital
• Fictitious critical care unit
• Random sample of 2000 records from
the Case Mix Programme Database
• After 1000 records, outcomes changed
so that an extra 6% of patients
(selected at random) die
• Risk adjustment by the ICNARC (2009)
model
Queen Kate Hospital – SMRs
[Figure: SMRs (0.7–1.6) for consecutive blocks of 250 patients]
RAP chart
• Risk-adjusted p chart
• Cohort divided into discrete blocks
(e.g. 100 patients)
• Indicator is observed mortality
• Control limits are predicted mortality
+/- 2 or 3 SDs
• Pro
– Displays observed and expected mortality
• Con
– Still in blocks, not sensitive
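A minimal sketch of the points behind a RAP chart (block size and names are illustrative): for each block, the observed mortality, the predicted mortality, and 2 SD / 3 SD limits around the prediction, using a binomial approximation.

```python
import math

def rap_chart_points(outcomes, risks, block_size=100):
    """For each consecutive block of patients: observed mortality, predicted
    mortality, and 2 SD ('alert') / 3 SD ('alarm') control limits around the
    prediction, with the SD taken from the sum of p*(1-p) over the block."""
    points = []
    for start in range(0, len(outcomes), block_size):
        o = outcomes[start:start + block_size]
        r = risks[start:start + block_size]
        n = len(o)
        predicted = sum(r) / n
        sd = math.sqrt(sum(p * (1 - p) for p in r)) / n
        points.append({
            "observed": sum(o) / n,
            "predicted": predicted,
            "alert": (predicted - 2 * sd, predicted + 2 * sd),
            "alarm": (predicted - 3 * sd, predicted + 3 * sd),
        })
    return points

# Example: two blocks of three patients (block_size=3 purely for illustration)
print(rap_chart_points([0, 1, 0, 1, 1, 0], [0.2, 0.5, 0.1, 0.4, 0.6, 0.3], block_size=3))
```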
Queen Kate Hospital – RAP chart
[Figure: observed and predicted mortality (0–40%) with 2 SD and 3 SD control limits, against number of admissions (0–2000)]
EWMA
• Exponentially weighted moving average
• Similar to RAP but uses all data up to
the current timepoint
• Data weighted by a smoothing factor so
that most recent data are given most
weight
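A sketch of the smoothing step (the smoothing factor and starting value are illustrative): applied separately to the observed outcomes and to the predicted risks, the two resulting lines can be drawn on the same chart.

```python
def ewma(values, smoothing=0.01, start=0.0):
    """Exponentially weighted moving average of per-patient values (e.g. 0/1
    deaths, or predicted risks): each new value receives weight `smoothing`,
    so the most recent patients dominate and older data decay away. `start`
    could be set to a baseline such as the overall predicted mortality."""
    out, current = [], start
    for v in values:
        current = smoothing * v + (1 - smoothing) * current
        out.append(current)
    return out

# Example with a deliberately large smoothing factor so the effect is visible
print(ewma([0, 0, 1, 0, 1, 1, 0, 1], smoothing=0.3, start=0.25))
```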
EWMA
• Pro
– Displays observed and expected mortality
– Estimates updated continuously not in
arbitrary blocks
• Con
– Choice of smoothing factor is important –
too little smoothing and plot is
unreadable, too much and plot is
insensitive to changes
Queen Kate Hospital – EWMA
[Figure: EWMA of observed and predicted mortality (20–40%) with +/- 2 SD and 3 SD limits, against number of admissions (0–2000)]
VLAD
• Variable life adjusted display
• Cumulative observed minus expected
deaths
• Pro
– Nice easy interpretation
• Con
– Control limits are curved functions that
are complex to calculate
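A minimal sketch using the convention on this slide (cumulative observed minus expected deaths, so the line rises when there are more deaths than the model predicts); names are illustrative.

```python
def vlad(outcomes, risks):
    """Variable life adjusted display: cumulative observed minus expected
    deaths over consecutive patients. Some presentations plot the negative
    (expected minus observed, i.e. 'statistical lives saved') instead."""
    points, total = [], 0.0
    for died, risk in zip(outcomes, risks):
        total += died - risk
        points.append(total)
    return points

# Example: a death with a low predicted risk pushes the line up,
# a survivor with a high predicted risk pulls it down
print(vlad([0, 1, 0, 1], [0.7, 0.1, 0.4, 0.8]))
```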
Queen Kate Hospital – VLAD
[Figure: cumulative observed minus expected deaths (-20 to 60), against number of admissions (0–2000)]
R-SPRT
• Resetting sequential probability ratio
test
• Tests evidence for/against a specific
hypothesis (e.g. odds of death are
double that predicted by the model)
• Plot of log likelihood ratio
• If bottom line is reached (strong
evidence against hypothesis) then line
resets to zero
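A sketch of the calculation, assuming the alternative hypothesis is a fixed odds ratio applied to each patient’s predicted risk; the barrier values follow the usual SPRT formulae for the chosen alpha and beta, and the reset behaviour shown (restarting at zero after either barrier) is one common choice rather than the only one.

```python
import math

def r_sprt(outcomes, risks, odds_ratio=2.0, alpha=0.01, beta=0.01):
    """Resetting SPRT: cumulative log likelihood ratio comparing the risks as
    predicted (null) against the odds of death multiplied by `odds_ratio`
    (alternative). Crossing the upper barrier signals an alarm; reaching the
    lower barrier (strong evidence against the hypothesis) resets to zero."""
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr, points, alarms = 0.0, [], []
    for i, (died, p) in enumerate(zip(outcomes, risks)):
        odds = odds_ratio * p / (1 - p)
        p_alt = odds / (1 + odds)          # risk of death under the alternative
        llr += math.log(p_alt / p) if died else math.log((1 - p_alt) / (1 - p))
        if llr >= upper:
            alarms.append(i)
            llr = 0.0                      # restart after signalling
        elif llr <= lower:
            llr = 0.0                      # reset on strong evidence for the null
        points.append(llr)
    return points, alarms
```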
R-SPRT
• Pro
– Nice statistical properties
– Control limits are horizontal lines
• Con
– Choice of hypothesis to test is arbitrary
– should we test for an OR of 2, 1.5,…?
Queen Kate Hospital – R-SPRT
[Figure: log likelihood ratio (-10 to 10) against case number (0–2000), with barriers for alpha = beta = 0.01, 0.001 and 0.0001]
CUSUM
• “Cumulative sum”
• Log likelihood ratio – same as R-SPRT
• “Absorbing barrier” at zero (i.e. never
goes below zero)
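A sketch using the same log likelihood ratio increments as the R-SPRT above, but with the absorbing barrier at zero and only an upper threshold; parameter choices are illustrative.

```python
import math

def cusum(outcomes, risks, odds_ratio=2.0, alpha=0.01, beta=0.01):
    """Risk-adjusted CUSUM: the same log likelihood ratio increments as the
    R-SPRT, but the statistic is never allowed below zero (absorbing barrier),
    so no 'credit' can accumulate. An alarm is raised at the upper threshold."""
    threshold = math.log((1 - beta) / alpha)
    stat, points, alarms = 0.0, [], []
    for i, (died, p) in enumerate(zip(outcomes, risks)):
        odds = odds_ratio * p / (1 - p)
        p_alt = odds / (1 + odds)
        step = math.log(p_alt / p) if died else math.log((1 - p_alt) / (1 - p))
        stat = max(0.0, stat + step)       # absorbing barrier at zero
        if stat >= threshold:
            alarms.append(i)
            stat = 0.0                     # restart after signalling
        points.append(stat)
    return points, alarms
```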
CUSUM
• Pros/Cons as for the R-SPRT plus…
• Pro
– Does not allow credit to build up (as in
R-SPRT) so alerts earlier
– Negative CUSUM (e.g. OR=0.5) can be
plotted on the same axes
• Con
– Cannot detect evidence against
hypothesis
Queen Kate Hospital – CUSUM
[Figure: CUSUM statistic (0–15) against number of admissions (0–2000), with thresholds for alpha = beta = 0.01, 0.001 and 0.0001]
Which method(s) to use…?
• Comparison among providers
– Funnel plot
• Continuous monitoring over time
– EWMA
– or R-SPRT
– or CUSUM
– (VLAD can be used as a display in
conjunction with, e.g., CUSUM for
monitoring)