MU perspective - British Measurement & Testing Association


Valid Analytical Measurement
Studies of Proficiency Testing scheme performance
S Ellison
LGC Limited, Teddington
The work described in this paper was supported under contract with the Department of Trade and Industry as part of the Valid Analytical Measurement programme
BMTA July 2005: 1
or
63 routes to the wrong result
... and what to do about it
Introduction
• PT in analytical chemistry
• Why study mistakes?
• How does the UK do?
– PT results compared to international performance
• What goes wrong? (and why)
– Web-based study of causes of poor PT scores
PT in analytical chemistry: organisation
• Typical rounds comprise:
– test sample preparation, characterisation and distribution
– analysis by participants
– data collection and processing
– preparation and distribution of the report
• Frequency
– Typically 6-12 rounds per year
• Analytes (measured quantities)
– 1-30 per sample per round
• Participants
– Typically 30-100 per round, but strongly scheme-dependent
The aims of proficiency testing
• Primary aim:
“To provide the infrastructure for a laboratory to monitor
and improve the quality of its routine analytical
measurements”
• Other aims
– Provide information on the state-of-the-art in analytical
measurements
– Compare performance of analytical methods
– Assist a laboratory in the validation of new methods
Principle of performance assessment
Compare…
• Observed error
– the difference between the laboratory result (x) and the assigned value (X)
• ‘Target range’
– usually a standard deviation (σ̂) or uncertainty
…using an acceptability criterion
Performance Scoring: z-scores

z = (x − X) / σ̂

where x is the submitted result, X the assigned value, and σ̂ the standard deviation for proficiency assessment.

|z| ≤ 2: satisfactory performance
2 < |z| < 3: questionable performance
|z| ≥ 3: unsatisfactory performance

Interpretation of z is consistent across schemes, but depends on σ̂.
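The scoring rule can be sketched in a few lines of Python (a minimal illustration; the function names and example figures are invented, and the 2/3 action limits are those quoted on the slide):

```python
def z_score(x, assigned, sigma_pt):
    """Standard PT performance score: z = (x - X) / sigma-hat."""
    return (x - assigned) / sigma_pt

def classify(z):
    """Interpret |z| against the 2/3 action limits quoted above."""
    if abs(z) <= 2:
        return "satisfactory"
    elif abs(z) < 3:
        return "questionable"
    return "unsatisfactory"

# Invented example: assigned value 10.0, sigma-hat 0.5, lab reports 11.2.
z = z_score(11.2, 10.0, 0.5)
print(round(z, 2), classify(z))  # → 2.4 questionable
```

Note that the same observed error classifies differently under a different σ̂, which is the scheme-dependence the slide warns about.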
Typical analytical performance data
[Figure: histogram of z-scores for collected food-analysis data, various analytes; 5 values < −8 and 3 values > 8 lie off scale]
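The off-scale annotations in a histogram like this are just tail counts. As a sketch on synthetic data (the real collected food-analysis scores are not reproduced in this transcript):

```python
import random

# Illustrative only: synthetic z-scores stand in for the collected
# food-analysis data, which is not reproduced here.
random.seed(1)
z_scores = [random.gauss(0, 2.5) for _ in range(500)]

# Tally the off-scale values, as annotated on the original histogram.
below = sum(1 for z in z_scores if z < -8)
above = sum(1 for z in z_scores if z > 8)
print(f"{below} values < -8, {above} values > 8, "
      f"{500 - below - above} shown on the -8..8 scale")
```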
PT data for benchmarking
• Three studies of UK performance
– Clinical
– Food
– Environment
• Clinical: backed by IMEP-17 study (20 analytes; 35 countries)
• Food: FAPAS PT scheme data (6 representative analytes; 2000 labs; ca. 250 countries and regions)
• Environment: CONTEST and CoEPT project data
UK performance: Clinical
[Figure: clinical PT results by country]
UK performance: consistent with others; rarely poor.
UK Performance: Food
[Figure: GeMMA GMO measurement, z-scores by country; UK vs rest of world]
UK Performance: Food
Aflatoxins
UK Performance: Food
[Figure: pirimiphos-methyl (pesticide residue) z-score performance for all rounds; GB vs other participants]
Problem analytes: Arsenic
[Figure: histogram of z-scores, FAPAS arsenic data, rounds 735-753, GB results only]
Problem analytes: Arsenic
[Figure: histogram of z-scores, FAPAS arsenic data, rounds 735-753, all countries, |z| ≤ 20]
UK Performance: Environment
[Figure: total PAH (polycyclic aromatic hydrocarbons) results, mg/kg, by scheme ID (SO1-SO7); UK scheme vs others]
UK Performance: Summary
• Broadly comparable to other countries
• No problems unique to the UK
• Some problems (e.g. Arsenic) shared with other
countries
Part 2: Causes of error
• VAM Project KT2.4/3: Causes of poor PT performance
• Aim:
Study "…the principal causes of poor performance in
laboratories and ... the effectiveness of the steps taken
by Participants in PT to improve the reliability of their
results”
• Methodology
– Web-based questionnaire
– Focussed on documented problems identified via PT scores
– Lead questions with follow-up for positive responses
Why study poor scores in PT?
• Why PT?
– PT participants are already committed to quality improvement
– Participants follow up poor PT scores
• Why only poor scores?
– Acceptable scores give poor information about problems
– Correlation of scores with general methodology is not very
effective
– Every good lab has documented problems and corrective actions
Top causes of poor scores
– Sample preparation
– Equipment problem
– Human error
– Calibration
– Selection of method
– Calculation error
– Reporting problem
(111 respondents; 230 causes)
Top causes of poor scores: Sample preparation
– Extraction/recovery
– Dilution to volume
Top causes of poor scores: Equipment problem
– Equipment failure
Top causes of poor scores: Human error
– Training/experience
– Transcription error
– Reporting error
Top causes of poor scores: Calibration
– No reference material
– Defective RM
– Incorrect procedure
– Calibration range
Top causes of poor scores: Reporting problems
– Value correct but not in customer units
– Incorrect units
– Transcription/typographical error
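A units slip of the kind listed above swamps the score even when the chemistry is right. A minimal sketch with invented numbers (a hypothetical round with a 5.0 µg/kg assigned value and σ̂ of 1.0 µg/kg):

```python
def z_score(x, assigned, sigma_pt):
    """Standard PT score: z = (x - X) / sigma-hat."""
    return (x - assigned) / sigma_pt

# Invented round: assigned value 5.0 ug/kg, sigma-hat 1.0 ug/kg.
# The analysis itself is sound (5.1 ug/kg), but the figure is reported
# in the wrong units, arriving a factor of 1000 too large.
correct = z_score(5.1, 5.0, 1.0)              # well within |z| <= 2
mis_reported = z_score(5.1 * 1000, 5.0, 1.0)  # grossly off scale
print(round(correct, 2), round(mis_reported, 1))  # → 0.1 5095.0
```

The measurement error contributes 0.1 to the score; the reporting error contributes the other five thousand.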
Top causes of poor scores: Calculation error
– Commercial software problem
– Spreadsheet problem
– Spreadsheet user error
– Calculator error
– Arithmetic error
– Value mis-entered
– Software mis-applied
– Other
Corrective action
– Training
– New procedures
– Revalidation
– Method documentation
– New equipment
– Additional calibration
– Method change
– RM change
– Other
Detailed information showed problem-specific responses.
Corrective action - efficacy
• No significant difference in efficacy across different
corrective actions
• Only 50% of actions were marked as ‘fully effective’
• Monitoring of efficacy tended to use local/immediate
methods
– Monitor QC results
– Internal audit
Causes of error: Summary
• Most PT errors were caused by basic lab operations
– Incorrect dilution to volume
– Transcription and reporting errors
– Data and spreadsheet formula entry errors
• Equipment failure is perceived as a problem
• Extraction/recovery problems important
• Commercial software faults caused no problems
• Corrective actions are problem-specific and ‘multifactor’
– More than one action generally required.
Conclusions
• UK analytical labs perform similarly to international partners, and share similar problems
• The most common causes of PT performance failures are not technical, but simple human errors such as incorrect volumetric operations and transcription errors
• Time to look harder at human factors?
• Study web page: via http://www.vam.org.uk (surveys link)