INTERPRETATIVE EQA - ACB South Western And Wessex Region


UK NEQAS
ACB Training Course, Plymouth 2007
External Quality Assessment
Finlay MacKenzie
Deputy Director UK NEQAS Birmingham
Organiser UK NEQAS for Thyroid Hormones
with major contributions from
David Bullock, Jane French & Jonathan Middle
UK NEQAS, PO Box 3909, Birmingham B15 2UE
tel: 0121 414 7300 fax: 0121 414 1179,
email: [email protected] web: www.ukneqas.org.uk
UK NEQAS
www.ukneqas.org.uk
Plymouth 2007 #1
© UK NEQAS 2007
August 2007
Essential points for these sessions
• I am going to cover the essentials of EQA theory and practice,
the UK NEQAS way of designing and operating schemes and
the educational side of our work (the 'extras')
• The session is designed to be relaxed and interactive; I want
you to disagree, challenge, question, so go for it!
• I have a lot of information to get across in a short time so I will
proceed briskly
• You must guide me on what you have already covered or what
you need me to go into in more depth
• If you are lost or confused, you will probably not be alone, so
stop me immediately and ask questions
• There will be some jokes and mild verbal abuse, so please don't
be offended!
External Quality Assessment
CPA(UK)Ltd EQA Standards and a QMS

[Diagram: participant requirements as input, participant satisfaction or dissatisfaction as output, supported by the standards:
A Organisation and quality management system
B Personnel
C Premises and environment
D Equipment, information systems and materials
E Organisation and design of each EQA scheme
F Operation of the EQA scheme
G Communication with the participants
H Evaluation and quality assurance]
NEED FOR COMPARABILITY
• Mobility
• patients
• medical staff
• Common interpretation
• reference intervals
• decision criteria
• [Pursuit of trueness (the right result)]
• traceability to reference procedures
EXTERNAL QUALITY ASSESSMENT
NEED FOR CONTINUITY OF PATIENT CARE
• between healthcare institutions
• Comparability (lack of bias) is addressed
• trueness is a secondary aim
• Applied retrospectively
• no control over output
ROLE OF EXTERNAL QUALITY ASSESSMENT
EQA provides assessment of:
• the overall performance (state of the art)
• the influence of analytical procedures (method, reagent, instrument, calibration)
• individual laboratory performance
• the specimens distributed
EQA PROVIDES AN EDUCATIONAL STIMULUS TO IMPROVEMENT
Educational EQA and Knowledge Gain

[Diagram: the knowledge gain from good EQA scheme design - moving the state-of-the-art before a survey towards the state-of-the-art after it - sits at the intersection of four domains:
• Clinical Domain: clinical validity, research literature, clinical opinion, custom & practice (specificity, sensitivity), financial issues, politics
• Technical Domain: service quality, competition, commercial interests, marketing, instrument constraints, IQC, laboratory resources, practical constraints
• Measurement Domain: analytical validity, reference methods, reference materials, standardisation, precision, accuracy, 'sensitivity', 'specificity'
• EQA Domain: good scheme design]
EFFECTIVE EQA
To be effective, EQA must be accepted and seen as useful, requiring:
• full, regular participation
• specimens treated as routine
• confidence in scheme design
• remedial action taken
• an understanding that it is not necessarily bad to 'fail' occasionally, providing lessons are learnt
The External Quality Assessment cycle: scheme PROVIDER and scheme USER
1. Clinical material dispatched by the PROVIDER to the USER laboratory
2. Clinical material received by the USER laboratory
3. Clinical material examined by the USER laboratory and the results recorded
4. Examination results returned to the External Quality Assessment scheme PROVIDER
5. Results from all participating USER laboratories analysed by the PROVIDER
6. Report issued indicating the performance of an individual USER laboratory in relation to the performance of all participating laboratories
7. USER laboratory reviews its performance in relation to the performance of all participating laboratories and takes action to remedy any problems
The importance of good
(UK NEQAS) Scheme Design
You will hear this mentioned a lot today!
UK NEQAS Scheme Design - 1
• sufficient recent data:
• frequent distributions
• rapid feedback of performance information
• an appropriate basis for assessment:
• stable, homogeneous specimens
• reliable and valid target values
(best estimate of the 'truth' for that analyte)
• effective communication of performance data:
• structured, informative and intelligible reports
• a rolling time window scoring system
UK NEQAS Scheme Design - 2
• Effective data processing
• performance scores which are easy to understand and
compare by method and over time
• State-of-the-art reports
• clear house style
• structured for ease of use by all levels of staff
• Added value
• analytical (eg recovery, interference) & interpretative exercises
• method reports (where appropriate)
• web services
• 'unlimited' advice
UK NEQAS Structured Reports - 1
• Participants differ in their needs
• some want minimal data
• some want a lot of data
• Different staff have different needs
• laboratory Directors want summaries
• analysts want detailed information
• Laboratories' needs change
• limited data when all is OK
• detailed examination for problems
• REPORT STRUCTURING ADDRESSES THESE NEEDS
UK NEQAS Structured Reports - 2
• Are my results and method details correct?
• What is the target value and how is it derived?
• How do my results relate to the target value?
• How do my results relate to other users of my method and/or method principles / standardisation strategies?
• What are my current performance scores?
• Do they appear in simple graphical form?
• Are my scores stable, getting worse or getting better, and can I easily see why?
• Can I tease out total error, bias, consistency of bias and concentration-related effects?
• Can I easily compare the performance of other methods?
• Can I find out where to get advice or further help?
EQA - SCHEME DESIGN
For EQA to succeed, participants must have
CONFIDENCE in both the scientific validity of a
scheme and the reliability of its operation, or they will
not act on the information it provides
Experience indicates essential design criteria:
• sufficient recent data
• an appropriate basis for assessment
• effective communication of performance data
UK NEQAS
SCHEME DESIGN - 1
• sufficient recent data, achieved through:
• frequent distributions
• at least 4 per year
• monthly ideal
• rapid feedback of performance information
• before the next distribution
• less than a week ideal
• an appropriate basis for assessment
• effective communication of performance data
SCHEME DESIGN - 2
• sufficient recent data
• an appropriate basis for assessment, including:
• stable, homogeneous specimens
• behaviour like clinical specimens
• reliable and valid target values
• effective communication of performance data
TARGET VALUES - 1

[Table: three sources of target values - reference method, reference laboratories, consensus - compared on availability, timeliness, validity and cost]
TARGET VALUES - 2
• Single target (eg ALTM):
• ideal objective
• promotes true comparability
• susceptible to matrix effects
• inappropriate for some analytes (enzymes, free hormones)
• Method-related targets (eg method means):
• minimise matrix effects
• inter-method differences may not be significant
• less robust (fewer results)
• some participants 'excluded'
• can perpetuate artefactual bias
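To make the single-target versus method-related distinction concrete, here is a minimal sketch with invented results; the trimming rule and the method names are placeholders, not the actual UK NEQAS procedure:

```python
import statistics
from collections import defaultdict

def trimmed_mean(values, fraction=0.15):
    """Mean after dropping the most extreme results at each end
    (this trimming rule is a placeholder, not the UK NEQAS one)."""
    values = sorted(values)
    k = int(len(values) * fraction)
    kept = values[k:len(values) - k] if k else values
    return statistics.mean(kept)

# Hypothetical returns for one specimen: (method, result in nmol/L)
results = [("Method A", 148), ("Method A", 152), ("Method A", 155),
           ("Method B", 138), ("Method B", 141), ("Method B", 136),
           ("Method A", 210)]  # one blunder, removed by trimming

# Single target: the all-laboratory trimmed mean (ALTM)
altm = trimmed_mean([r for _, r in results])  # 146.8

# Method-related targets: each method's own mean (fewer results)
by_method = defaultdict(list)
for method, r in results:
    by_method[method].append(r)
method_means = {m: statistics.mean(v) for m, v in by_method.items()}
```

Note how the single blunder drags the untrimmed Method A mean well above the ALTM: with fewer results per group, method-related targets are less robust, as the slide says.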
SCHEME DESIGN - 3
• sufficient recent data
• an appropriate basis for assessment
• effective communication of performance data,
through:
• structured, informative and intelligible reports
• a running scoring system
SUMMARISED HERE - MORE DETAILS LATER!
REPORT STRUCTURING
• Participants differ in their needs
• some want minimal data
• some want lots of data
• Different staff have different needs
• laboratory Directors want summary data
• analysts want detailed information
• Laboratories' needs change
• limited data when all is OK
• detailed examination for problems
REPORT STRUCTURING ADDRESSES THESE NEEDS
Structured reports - 1
Structured reports - 2
Structured reports - 3
'Snapshot' Page
Detail From Snapshot Page
SCORING - 1
• Purpose of scoring:
• comparison of performance over time and place, both for the individual lab and across all participants
• Requirements of scoring:
• robust
• independent of other participants' performance
• a 'z score' (SDD) based on observed SD is not satisfactory
SCORING - 2
• Participation (return rate)
• Non-analytical errors ('blunders')
• Accuracy (total error) – single survey
• Accuracy – running
• Imprecision
• Bias – running
• Consistency of bias – running
Problem with SD differences rather than % bias differences
This situation is for Total T4 at an ALTM target of 150 nmol/L

Trial 1: your result 120, target 150, sample SD 20 → your bias -20.0%, your SD difference 1.50
Trial 2: your result 125, target 150, sample SD 10 → your bias -16.7%, your SD difference 2.50
Trial 3: your result 115, target 150, sample SD 30 → your bias -23.3%, your SD difference 1.17

Your result for Trial 2 is nearer the target than for Trial 1, but your score worsens.
Your result for Trial 3 is further from the target than it was for Trial 1, but your score improves!
Cynics might say that SD scores tell you more about other people's performance than about your own!
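The arithmetic behind the three trials above can be sketched in a few lines (values taken from the slide; the helper names are illustrative). Percentage bias depends only on your result and the target, while an SD-difference score also depends on the spread of everyone else's results:

```python
def pct_bias(result, target):
    """Percentage bias: (result - target) / target * 100."""
    return (result - target) / target * 100

def sd_difference(result, target, sample_sd):
    """Distance from target in multiples of the sample SD (an SDD/z-type score)."""
    return abs(result - target) / sample_sd

TARGET = 150  # nmol/L, ALTM target from the slide
trials = [("Trial 1", 120, 20), ("Trial 2", 125, 10), ("Trial 3", 115, 30)]

for name, result, sd in trials:
    print(f"{name}: bias {pct_bias(result, TARGET):+.1f}%, "
          f"SD difference {sd_difference(result, TARGET, sd):.2f}")
```

Trial 2 is closer to the target than Trial 1 yet scores worse on SD difference (2.50 vs 1.50) purely because the sample SD halved - exactly the problem the slide describes.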
SCORING - 3
• Percentage bias:
• bias = (result - target) / target * 100 %
• transformed bias:
• = bias * 'degree of difficulty factor' [normalised?]
• A score [Accuracy]
• trimmed mean of transformed biases, without sign
• B score [Bias]
• trimmed mean of percentage biases
• C score [Consistency of bias]
• trimmed SD of percentage biases
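A minimal sketch of the A, B and C definitions above. The trimming rule and the 'degree of difficulty' factors are scheme-specific and not given on the slide, so the ones below are placeholders:

```python
import statistics

def trim(values, fraction=0.1):
    """Symmetric trim: drop the most extreme values at each end.
    (Placeholder - the actual UK NEQAS trimming rules differ by scheme.)"""
    values = sorted(values)
    k = int(len(values) * fraction)
    return values[k:len(values) - k] if k else values

def abc_scores(pct_biases, difficulty=None):
    """A = trimmed mean of transformed biases, without sign.
    B = trimmed mean of percentage biases.
    C = trimmed SD of percentage biases."""
    difficulty = difficulty or [1.0] * len(pct_biases)  # placeholder factors
    transformed = [b * d for b, d in zip(pct_biases, difficulty)]
    a = statistics.mean(abs(t) for t in trim(transformed))
    b = statistics.mean(trim(pct_biases))
    c = statistics.stdev(trim(pct_biases))
    return a, b, c

# e.g. percentage biases from the distributions in a rolling time window:
a, b, c = abc_scores([-3.0, 2.5, -1.0, 4.0, -2.0, 1.5])
```

Note that A is always non-negative (total error), B keeps its sign (direction of bias), and C measures scatter of the biases rather than imprecision as such.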
Interpreting A

[Chart: A scores banded, from worst to best, as Unacceptable, 'Tolerable' and Desirable]
Interpreting B & C
INTERPRETATION - 2

A score  B score  C score  Interpretation
Small    Small    Small    Satisfactory
Large    Large    Small    Proportional bias
Large    Small    Large    Variability*
Large    Large    Large    Bias + variability*

* variability is not the same as imprecision
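The table above can be read as a simple decision rule. A toy sketch follows; the threshold dividing 'small' from 'large' is an invented placeholder, since real limits are analyte- and scheme-specific:

```python
def interpret(a_score, b_score, c_score, threshold=5.0):
    """Map A/B/C score sizes to the interpretation table above.
    `threshold` is an invented placeholder for 'small' vs 'large'."""
    large = lambda s: abs(s) > threshold
    if not large(a_score) and not large(b_score) and not large(c_score):
        return "Satisfactory"
    if large(a_score) and large(b_score) and not large(c_score):
        return "Proportional bias"
    if large(a_score) and not large(b_score) and large(c_score):
        return "Variability (not the same as imprecision)"
    if large(a_score) and large(b_score) and large(c_score):
        return "Bias + variability"
    return "Pattern not in table - review individual results"

print(interpret(2.0, 1.0, 1.5))  # Satisfactory
print(interpret(8.0, 7.0, 2.0))  # Proportional bias
```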
INTERPRETATION - 3
Sources of variability (lack of consistency):
• imprecision
• [short term changes in bias]
• concentration-related bias
• non-linearity
• bias changes with time
• non-analytical errors
VARIABILITY IS NOT THE SAME AS IMPRECISION
The C of the ABC is not imprecision

[Scatter plot of laboratory result against target value (0-6): positive bias at low concentrations, unbiased at mid concentrations, negative bias at high concentrations]

By definition the bias is, on average, negligible, with good reproducibility; nevertheless consistency of bias is poor and the C score is HIGH
(e.g. a low-level specificity problem compounded with a calibration error).
Understanding limitations of EQA scheme design
• Materials - close to, but not identical to, individual patient specimens (except in
rare cases, eg HbA1c); they may be pooled or processed, but we prefer
sparkling primary colours to brown Windsor soup!
• Frequency - EQA can only sample a small number of your assays
and cannot therefore be used as a substitute for good IQC
• Concentration range - ideally set to challenge the main clinical
decision points and, unlike IQC, will cover the full working range
• Target values - appropriate for the analyte and state of the art, but
possibly not ‘ideal’ (reference measurement system values)
• Performance scores - essentially ‘arbitrary’, but hopefully intuitive
and robust and capable of providing broad trend information on
accuracy and comparability
EQA is a challenge
• For EQA to assess performance adequately it must be challenging
• If it is not challenging, the information provided will be bland and
probably not helpful
• If it is challenging, then all labs will occasionally ‘fail’, some (if they
fail a number of times) may become poor performers and a few of
these may have persistent enough poor performance to come to the
attention of the NQAAP
• This is not a disaster! If your quality system is adequate, you can
use it to identify the problem and correct it - most poor performance
is transitory
• CPA only requires evidence of an effective quality system, not
perfect performance
External Quality Assessment (UK NEQAS)
• Adjunct to effective IQC - not a replacement
• Looking retrospectively at accuracy, consistency, overall
comparability, some non-analytical elements (interpretation)
• Educational and supportive - not a threat!
• Not ‘in league with CPA' - they are happy if you participate in an
approved scheme and have effective mechanisms for dealing
with occasional aberrant performance
• Know your scheme design and understand its limitations
(materials, concentration range, data processing, scoring)
• Close the loop - have a system of report review and fully
documented corrective action
• PROVIDES ESSENTIAL INFORMATION AND ADVICE
UNOBTAINABLE ELSEWHERE
Take-home message
• Good scheme design underpins any statistics and graphics
• Use graphical output wherever possible
• Have the supporting raw data available on request
• Give the data in a structured format
• Only drill down as far as you need
• Act on your EQA data - it contains a mass of information that is impossible to get from any other source
• EQA data is used to underpin guidelines and can lead as well as follow clinical practice, eg eGFR
CONCLUSIONS
• EQA complements QA and IQC
• Confidence in scheme design essential
• Interpretation should be simple
• More exciting developments coming!
• laboratory services
• individual services
• Performance is a professional responsibility