Laboratory Proficiency Test Akhmad Sabarudin1,2 E-mail: sabarjpn@gmail.com 1 Department of Chemistry, Brawijaya University 2 Collaborative Research Group for Advanced System and Material Technology (CRG-ASMAT)
Quality Assurance System
A laboratory's quality assurance system should include:
• Internal quality control
• Method validation
• Proficiency testing
Proficiency testing provides a neutral, science-based route to accreditation.
Proficiency Testing ?
• is a mechanism to ensure standardized testing across laboratories and to evaluate your lab's performance in comparison with its peer group's performance
- Uses commercially available materials and evaluations
• is an essential aspect of any laboratory operation, providing a means of assessing analytical performance compared with other laboratories using the same method and instrument
• A Proficiency Test (PT) can also be called:
• Inter-Laboratory Comparison (ILC)
• External Quality Control (EQC)
• Round Robin Test
Proficiency Testing ?
It shows us what we are doing right (passing the test) and what we are doing wrong (failing the test).
• One good result does not make a laboratory supreme; the challenge is to repeat this performance.
• One bad result does not make a laboratory bad; the challenge is to prevent it from repeating.
Purpose of Proficiency Testing
• To determine the competence of individual laboratories to perform specific tests or measurements
• To monitor the performance of laboratories over time
Proficiency Testing may be used to….
a) Determine the performance of individual labs for specific tests or measurements, and monitor labs' continuing performance;
b) Identify problems in laboratories and initiate remedial actions, which may relate to, for example, individual staff performance or calibration of instrumentation;
c) Establish the effectiveness and compatibility of new test or measurement methods, and similarly monitor established methods;
d) Provide additional confidence to lab clients;
e) Identify inter-laboratory differences;
General steps/process of proficiency testing
1. Preparation of samples by the Proficiency Testing Provider
2. Sample delivery
3. Carrying out testing by participating laboratories
4. Submitting measurement results to the PT provider
5. Investigation of participating laboratories' results by the PT provider (statistical data analysis and interpretation)
6. Submission of PT results to participating laboratories
• Performance Assessment:
• Satisfactory
• Questionable
• Unsatisfactory
Provider of Proficiency Test (PT Provider)
• The provider must have experience with the particular type of test items
• The provider must have the competence to make the necessary measurements
• The provider must carry out a performance assessment of the results submitted by participating laboratories
Sample and Sampling Keep in mind : Analysis starts with sampling • When, where and how to collect samples • Sample transportation to participating laboratories • Sampling equipment • Sample containers • Sample-treatment procedures (drying, mixing, etc. prior to measurements) • Storage of samples
Sample and Sampling
Regarding the sample: from the POPULATION (N) we draw a SAMPLE (n). Is the sample representative?
Sample and Sampling
Regarding the inference: from the SAMPLE (n) we make an INFERENCE about the POPULATION (N). Is the inference generalizable?
Sample and Sampling
Simple random sampling
The sample is drawn directly from the whole population, so that every member has an equal chance of being selected.
Sample and Sampling
Systematic sampling
• Select a picking interval, e.g. every fifth item.
• Randomly choose one item among the first five (or whatever the picking interval is).
• Pick out every fifth item (or whatever the picking interval is), starting from the chosen one.
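The two sampling procedures above can be sketched in a few lines of Python. This is a minimal illustration with made-up numbered sample units, not part of any PT scheme; the function names are our own.

```python
import random

def simple_random_sample(population, n):
    """Draw n items so every member has an equal chance of selection."""
    return random.sample(population, n)

def systematic_sample(population, interval):
    """Randomly choose a start within the first interval, then pick
    every interval-th item from that starting point."""
    start = random.randrange(interval)   # e.g. one of the first five
    return population[start::interval]

units = list(range(1, 51))               # 50 numbered sample containers
print(simple_random_sample(units, 5))    # 5 units, chosen at random
print(systematic_sample(units, 5))       # 10 units, every fifth one
```

With an interval of 5 over 50 units, the systematic sample always contains exactly 10 evenly spaced units; only the starting point is random.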
Testing by participating laboratories
Analyst/QC Technician
The analyst/QC technician should be familiar with the testing methods and instrumentation.
• Ensure sufficient staff, with adequate experience and competence
• Ensure staff receive sufficient training, as needed
• Maintain training records
Testing by participating laboratories PROFICIENCY TESTING SCHEMES (example)
Testing by participating laboratories Analyzed parameters (example) Quantitative Qualitative
Testing by participating laboratories
Choice of Methods
Most importantly: the method must be validated.
• Participants are normally allowed to use the method of their choice, but special methods may be required by the PT Provider
• The PT Provider must request details of the method used, to allow analysis and comment
• Instructions often require a statement that labs must use their routine method
Testing by participating laboratories Experimental Design
Data Analysis Evaluating results • QUANTITATIVE - Mean ± Standard Deviation (s) - Evaluating Accuracy - Evaluating Precision - Evaluating Trends (graph/diagram) • QUALITATIVE - Intended Response - Majority of Results
Data Analysis
• QUANTITATIVE
Mean ± Standard Deviation (s)
The mean, X̄, is the numerical average obtained by dividing the sum of the individual measurements by the number of measurements.
The absolute standard deviation, s, describes the spread of individual measurements about the mean.
Relative Standard Deviation: RSD = (s / X̄) × 100%
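As a quick sketch, the mean, standard deviation, and RSD defined above can be computed with Python's standard `statistics` module. The replicate results below are hypothetical, chosen only to illustrate the calculation.

```python
from statistics import mean, stdev

# Hypothetical replicate results (mg/L) for one PT sample
results = [10.2, 10.5, 9.8, 10.1, 10.4]

x_bar = mean(results)        # numerical average of the measurements
s = stdev(results)           # sample standard deviation (n - 1 denominator)
rsd = s / x_bar * 100        # relative standard deviation, in percent

print(f"mean = {x_bar:.2f}, s = {s:.3f}, RSD = {rsd:.1f}%")
```

Note that `stdev` uses the n − 1 (sample) denominator, which is the usual choice for a small set of replicates.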
Data Analysis • QUANTITATIVE Mean ± Standard Deviation (s) Example of the report KISS ! (Keep it short and simple)
Data Analysis • QUANTITATIVE Accuracy and Precision Accurate but not precise Accurate and Precise Neither accurate nor precise Precise but not accurate
Data Analysis
• QUANTITATIVE
Accuracy and Precision
Accuracy: the exactness of an analytical method
Precision: the degree of repeatability of an analytical method (repeatability, reproducibility, uncertainty)
Data Analysis
• QUANTITATIVE
Evaluating accuracy
Accuracy: the closeness of agreement between a test result (A) and the accepted reference value (B)
A: an observed, calculated, measured or estimated value
B: the true value, e.g. a Standard/Certified Reference Material (SRM/CRM), assigned value, reference, or formulation
Assessing accuracy:
% Error = |A − B| / B × 100%
% Accuracy = 100% − % Error
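The % error and % accuracy formulas above translate directly into code. The certified value and lab result below are hypothetical, used only to show the arithmetic.

```python
def percent_error(measured, reference):
    """% Error = |A - B| / B * 100, with B the accepted reference value."""
    return abs(measured - reference) / reference * 100

def percent_accuracy(measured, reference):
    """% Accuracy = 100 - % Error."""
    return 100 - percent_error(measured, reference)

# Hypothetical CRM certified at 50.0 mg/kg; the lab reports 48.5 mg/kg
print(percent_error(48.5, 50.0))     # 3.0  -> within a +/- 5% limit
print(percent_accuracy(48.5, 50.0))  # 97.0
```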
Data Analysis
• QUANTITATIVE
Evaluating accuracy
Example: acceptable error ±5%. Results within ±5% of the reference value are acceptable; results outside this range must be re-examined.
Data Analysis
• QUANTITATIVE
Evaluating precision
Precision: the closeness of agreement between independent test results obtained under stipulated conditions
Assessing precision:
Standard deviation (s)
Relative standard deviation: Sr = (s / X̄) × 100%
The smaller s or Sr, the more precise the results.
Data Analysis
• QUANTITATIVE
Evaluating precision
Repeatability
• Precision under similar conditions;
• within-day Sr obtained from each analyst
Reproducibility
• Precision under different conditions;
• between-day Sr obtained from different analysts
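The repeatability/reproducibility distinction can be illustrated by computing Sr for two hypothetical data sets: one analyst's within-day replicates versus results gathered across different analysts and days. The numbers are invented for illustration; real schemes use their own designs.

```python
from statistics import mean, stdev

def relative_sd(values):
    """Sr = (s / mean) * 100, in percent."""
    return stdev(values) / mean(values) * 100

# Hypothetical data: same sample, same analyst, within one day...
within_day = [10.1, 10.2, 10.0, 10.1]
# ...versus results from different analysts on different days
between_day = [10.1, 10.6, 9.7, 10.3]

print(f"repeatability   Sr = {relative_sd(within_day):.2f}%")
print(f"reproducibility Sr = {relative_sd(between_day):.2f}%")
```

As expected, the between-day Sr is larger: reproducibility conditions admit more sources of variation than repeatability conditions.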
Data Analysis
• QUANTITATIVE
Accuracy and Precision
What is error? Noise in the measured value: the deviation of a measurement from the true value.
Data Analysis
• QUANTITATIVE
Accuracy and Precision
Example: Sources of Error
• Sampling: representative sample; homogeneous vs. heterogeneous material
• Preparation: loss of sample; contamination (unwanted addition)
• Analysis: measurement of the analyte; calibration of the instrument
Data Analysis • QUANTITATIVE Accuracy and Precision • Can all errors be controlled? • What are some possible things that can be done to minimize errors?
Data Analysis
• QUANTITATIVE
Accuracy and Precision
Types of (experimental) Errors
• Systematic Error
The result of an experimental "mistake"
• Has a known cause, e.g.:
• skill of the operator or analyst
• calibration of the instrument, etc.
Data Analysis
• QUANTITATIVE
Accuracy and Precision
• Systematic Error
• This error can be corrected/controlled once its causes are determined, e.g. by:
(a) calibrating all experimental tools and/or instruments
(b) improving the skill of the analyst, operator, etc.
(c) cleaning all glassware, bottles, etc. before the experiment
Data Analysis
QUANTITATIVE….. Accuracy and Precision
Types of (experimental) Errors
• Random error
• Unpredictable, non-deterministic
• Results from:
• limitations of the measuring tool
• random processes within the system
• environmental effects, etc.
• Typically cannot be controlled
• Use statistical tools to characterize and quantify it; multiple trials help to minimize it
Data Analysis QUANTITATIVE…..Accuracy and Precision Types of (experimental) Errors • Systematic errors → accuracy • Random errors → precision
Performance Assessment (PT Scoring System)
z-scores
z-scores compare data values measured on different scales.
A z-score reflects how many standard deviations a result lies above or below the mean: z = (x − X̄) / s.
The z-score is positive if the data value lies above the mean and negative if it lies below the mean.
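A minimal sketch of z-score calculation and interpretation is shown below. The satisfactory/questionable/unsatisfactory cut-offs (|z| ≤ 2, 2 < |z| < 3, |z| ≥ 3) match the three assessment categories named earlier, but they are the commonly used convention, assumed here; a given PT scheme may define its own limits. All numbers are hypothetical.

```python
def z_score(result, assigned_value, sigma_pt):
    """z = (x - X) / sigma: how many standard deviations the lab's
    result lies above or below the assigned value."""
    return (result - assigned_value) / sigma_pt

def assess(z):
    """Common PT convention (assumed; check your scheme's own rules):
    |z| <= 2 satisfactory, 2 < |z| < 3 questionable, |z| >= 3 unsatisfactory."""
    if abs(z) <= 2:
        return "satisfactory"
    if abs(z) < 3:
        return "questionable"
    return "unsatisfactory"

# Hypothetical: assigned value 50.0, sigma 1.0, lab reports 52.4
z = z_score(52.4, 50.0, 1.0)
print(f"z = {z:.1f}: {assess(z)}")   # z = 2.4 falls in the questionable band
```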
Performance Assessment (PT Scoring System)
What does the z-score mean?
• IF YOU'RE IN, YOU'RE GOOD - Repeat this performance
• IF YOU'RE OUT, YOU'RE INCOMPETENT - Probably, but maybe not
UNSATISFACTORY • Be Proactive - determine why - prevent 2nd unsatisfactory performance • Initiate Corrective Action
UNSUCCESSFUL
• Stop Testing
• Investigate and determine the cause of failure
• Initiate Corrective Action
• Develop the skills of the staff (analyst, QC technician, etc.)
• Evaluate Corrective Action
• Documentation
• Request approval to resume testing
Do a Root Cause Analysis A rigorous systematic approach to answering: - What happened & Why? • Said simply, Root Cause Analysis is asking why the problem occurred, and then continuing to ask why that happened until we reach the fundamental process element that failed.
How to Write Analytical Report Akhmad Sabarudin1,2 E-mail: sabarjpn@gmail.com 1 Department of Chemistry, Brawijaya University 2 Collaborative Research Group for Advanced System and Material Technology (CRG-ASMAT)
The three “C”s (C3) principle A well-written analytical report possesses the following three “C”s: • Clarity • Conciseness (simplicity) • Correctness (accuracy) KISS ! (Keep it short and simple) - Easy to understand -
Content of report : (1) General
• Customer identity
• e.g.: name, institution/company
• Sample identity
• e.g.:
• name of sample
• form of sample: solid, liquid, …….
• color of sample
• sample code
• picture of sample (if required)
• test performed (if requested by the customer)
• Dates
• e.g.:
• sample receiving date
• test performing period
Content of report : (2) Purpose/Aim
• To do something……………..
• e.g.:
• To measure ………..
• To check …………...
• To investigate ……..
• To determine ………
• To verify ……………
• To compare ………..
• To calculate ……….
• Etc…………………….
(3) Background
• Provide the necessary background information to put your work into context (in short)
• Explain concisely the problem or issue examined/tested
Content of report : (4) Results
• Inform customers of the results accurately, precisely, concisely, and specifically
- Use tables or figures to summarize data
- Show the results of statistical analysis (if necessary)
- Test method/reference, e.g. ASTM, AOAC, ISO………
- Limitations of the method (if necessary)
- Indicate the statistical tests used, with all relevant parameters: mean ± SD
- Give numerators and denominators with percentages: 40%…(w/w)…(v/v)….(w/v)…
- Give units for all analysed parameters (mg/kg….)
Content of report : (5) Short discussion/conclusion
Concisely describe:
• How the results relate to the purpose/aim
• Possible interpretations of the findings
(6) The report should be approved and signed by the person responsible for the analytical results
Note: some examples of analytical reports will be discussed in the workshop (in-house training)