Valid Analytical Measurement Studies of Proficiency Testing scheme performance. S Ellison LGC Limited, Teddington. The work described in this paper was supported under contract with the Department of Trade and Industry as part of the Valid Analytical Measurement programme.
or: 63 routes to the wrong result ... and what to do about it
Introduction
• PT in analytical chemistry
• Why study mistakes?
• How does the UK do?
  • PT results compared to international performance
• What goes wrong? (and why)
  • Web-based study of causes of poor PT scores
PT in analytical chemistry - organisation
• Typical rounds comprise:
  • test sample preparation, characterisation and distribution
  • analysis by participants
  • data collection and processing
  • preparation and distribution of the report
• Frequency: typically 6-12 rounds per year
• Analytes (measured quantities): 1-30 per sample per round
• Participants: typically 30-100 per round, but strongly scheme-dependent
The aims of proficiency testing
• Primary aim: “To provide the infrastructure for a laboratory to monitor and improve the quality of its routine analytical measurements”
• Other aims:
  • Provide information on the state of the art in analytical measurements
  • Compare performance of analytical methods
  • Assist a laboratory in the validation of new methods
Principle of performance assessment
• Observed error: the difference between the laboratory result (x) and the assigned value (X)
• ‘Target range’: usually a standard deviation (σ) or an uncertainty
• The observed error is compared with the target range using an acceptability criterion
Performance scoring: z-scores

    z = (x − X) / σ

where x is the submitted result, X the assigned value and σ the standard deviation for proficiency assessment.

• |z| ≤ 2: satisfactory performance
• 2 < |z| < 3: questionable performance
• |z| ≥ 3: unsatisfactory performance

Interpretation of z is consistent across schemes, but depends on σ.
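The scoring rule above is easy to express in code. A minimal sketch in Python, assuming σ (the standard deviation for proficiency assessment) is supplied by the scheme organiser; the function and variable names are illustrative, not part of any PT scheme's software:

```python
def z_score(x: float, assigned: float, sigma_p: float) -> float:
    """z = (x - X) / sigma_p: the lab's error in units of the target sd."""
    return (x - assigned) / sigma_p

def classify(z: float) -> str:
    """Map |z| onto the performance bands from the slide."""
    az = abs(z)
    if az <= 2:
        return "Satisfactory"
    elif az < 3:
        return "Questionable"
    return "Unsatisfactory"

# Example: a lab reports 10.8 against an assigned value of 10.0
# in a round where sigma_p = 0.5
z = z_score(10.8, 10.0, 0.5)
print(round(z, 2), classify(z))  # 1.6 Satisfactory
```

Note that the score depends directly on σ: the same observed error of 0.8 would be ‘Questionable’ if the scheme had set σ = 0.35, which is why z-scores are only comparable across schemes when σ is set the same way.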
Typical analytical performance data: collected food analysis data, various analytes
PT data for benchmarking
• Three studies of UK performance: clinical, food, environment
• Clinical: backed by the IMEP-17 study (20 analytes; 35 countries)
• Food: FAPAS PT scheme data (6 representative analytes; 2000 labs; ca. 250 countries and regions)
• Environment: CONTEST and CoEPT project data
UK performance: Clinical. Consistent with others; rarely poor
UK performance: Food (GMO measurement)
UK performance: Food (aflatoxins)
UK performance: Food (pirimiphos-methyl, a pesticide residue; UK vs. other countries)
Problem analytes: Arsenic (all countries)
UK performance: Environment (total polycyclic aromatics)
UK performance: Summary
• Broadly comparable to other countries
• No problems unique to the UK
• Some problems (e.g. arsenic) shared with other countries
Part 2: Causes of error
• VAM Project KT2.4/3: Causes of poor PT performance
• Aim: study “…the principal causes of poor performance in laboratories and ... the effectiveness of the steps taken by participants in PT to improve the reliability of their results”
• Methodology:
  • Web-based questionnaire
  • Focussed on documented problems identified via PT scores
  • Lead questions with follow-up for positive responses
Why study poor scores in PT?
• Why PT?
  • PT participants are already committed to quality improvement
  • Participants follow up poor PT scores
• Why only poor scores?
  • Acceptable scores give poor information about problems
  • Correlation of scores with general methodology is not very effective
  • Every good lab has documented problems and corrective actions
Top causes of poor scores (111 respondents; 230 causes)
• Sample preparation
• Equipment problem
• Human error
• Calibration
• Selection of method
• Calculation error
• Reporting problem
Top causes of poor scores: Sample preparation
• Extraction/recovery
• Dilution to volume

Top causes of poor scores: Equipment problem
• Equipment failure

Top causes of poor scores: Human error
• Training/experience
• Transcription error
• Reporting error

Top causes of poor scores: Calibration
• No reference material
• Defective RM
• Incorrect procedure
• Calibration range

Top causes of poor scores: Reporting problem
• Value correct but not in customer units
• Incorrect units
• Transcription/typographical error

Top causes of poor scores: Calculation error
• Commercial software problem
• Spreadsheet problem
• Spreadsheet user error
• Calculator error
• Arithmetic error
• Value mis-entered
• Software mis-applied
• Other
Corrective action
• Training
• New procedures
• Revalidation
• Method documentation
• New equipment
• Additional calibration
• Method change
• RM change
• Other
Detailed information showed problem-specific responses.
Corrective action - efficacy
• No significant difference in efficacy across different corrective actions
• Only 50% of actions were marked as ‘fully effective’
• Monitoring of efficacy tended to use local/immediate methods:
  • Monitoring QC results
  • Internal audit
Causes of error: Summary
• Most PT errors were caused by basic lab operations:
  • Incorrect dilution to volume
  • Transcription and reporting errors
  • Data and spreadsheet formula entry errors
• Equipment failure is perceived as a problem
• Extraction/recovery problems are important
• Commercial software faults caused no problems
• Corrective actions are problem-specific and ‘multifactor’: more than one action is generally required
Conclusions
• UK analytical labs perform similarly to their international counterparts, and share similar problems
• The most common causes of PT performance failures are not technical, but simple human errors such as incorrect volumetric operations and transcription errors
• Time to look harder at human factors?
• Study web page: via http://www.vam.org.uk (surveys link)