QA in healthcare labs encompasses activities aimed at achieving and maintaining specified quality. This text explores the roles of QA, types of errors, accuracy and precision, and quality control techniques in laboratory testing.
Quality Assurance
QA is defined as the practice which encompasses all activities, procedures, and formats of activities directed toward ensuring that a specified quality or product is achieved or maintained. QA is involved in every step of the analysis process, from the initial ordering of a test and the collection of the patient sample (pre-analytic), through analysis of the sample (analytic), to the distribution of the result to the proper destination (post-analytic). A QA program involves every person in the lab, from the director to the lab helpers, and also includes everyone who has contributed to the enterprise, such as the phlebotomy team and the data processors.
Roles of QA
• Select the most accurate and precise analytical methods, with a turnaround time that is most helpful to the physician.
• Adequately train and supervise the activities of the lab personnel, and provide them with continuing education sessions.
• Use good instruments and institute a regular maintenance program.
• Maintain a good quality control program, which is concerned with the analytical phase of QA.
• Make available printed procedures for each method, with explicit directions, an explanation of the chemical principles, and a listing of the reference values.
Types of errors
Analytical, determinate, or systematic error: usually due to an analytical factor (instrumental, operational, or errors of method); this bias (inaccuracy) can be determined and corrected.
Random error: the lab is also subject to imprecision, or random variability.
Accuracy and Precision
Accuracy: defined as the extent to which the mean measurement is close to the true value. The accuracy of a method is generally reflected by its ability to reproduce the values of reference samples of known concentration.
Precision: the variation of results when a test is performed multiple times; it may be graphically depicted by a frequency curve, which gives a symmetric distribution.
Then 68% of values fall within ±1 SD of the mean, 95.5% of values fall within ±2 SD, and 99.7% of values fall within ±3 SD; only 0.3% of values fall outside ±3 SD.
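These percentages can be checked numerically. Below is a minimal Python sketch (not part of the original slides) that draws random Gaussian values and counts how many fall within 1, 2, and 3 SD of the mean; the mean and SD used are arbitrary illustrative numbers.

```python
# Simulation sketch: verify the 68 / 95.5 / 99.7 rule by sampling a
# Gaussian distribution and counting values within k SD of the mean.
import random

random.seed(42)                      # reproducible illustration
mean, sd, n = 100.0, 5.0, 100_000    # arbitrary illustrative parameters
values = [random.gauss(mean, sd) for _ in range(n)]

for k in (1, 2, 3):
    within = sum(1 for v in values if abs(v - mean) <= k * sd)
    print(f"within ±{k} SD: {within / n:.1%}")  # ~68.3%, ~95.5%, ~99.7%
```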
Standard deviation
SD = √[ Σ(xi − x̄)² / (n − 1) ]
where x̄ is the mean, xi is an individual measurement, Σ is the operation of summation, and n is the number of xi values in the group.
Mean: the average value (the central point).
Median: the middle value within the range.
Mode: the most frequently occurring value.
Note: the degree of precision of a method is best expressed in terms of the SD; the greater the SD, the less precise the method, due to the larger deviations from the mean.
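These statistics are available directly in Python's statistics module; a short sketch follows, using the same illustrative results as the worked example below.

```python
# Sketch: mean, median, mode, and sample SD with Python's statistics module.
import statistics

results = [17, 18, 19, 20, 20, 20, 21, 21, 22, 22]

print(statistics.mean(results))    # 20    - the central point
print(statistics.median(results))  # 20.0  - the middle value
print(statistics.mode(results))    # 20    - the most frequently occurring value
print(statistics.stdev(results))   # ~1.63 - sample SD (n - 1 denominator)
```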
Coefficient of variation
The coefficient of variation (CV) is the SD expressed as a percentage of the mean (average) and is a more reliable means for comparing precision at different concentration levels or units.
CV = (SD / x̄) × 100%
The precision of a method varies inversely with the CV: the lower the CV, the greater the precision. A CV of less than about 5-10% is required to consider the method precise.
Variance: less often used to express precision; it is equal to the square of the SD: variance = (SD)².
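To see why the CV compares precision across concentration levels better than the raw SD, consider the following sketch (the concentrations and SDs are invented for illustration): the same absolute SD represents far worse relative precision at a low concentration than at a high one.

```python
# Sketch: identical SDs, very different relative precision at two levels.
def cv(sd, mean):
    """Coefficient of variation: SD as a percentage of the mean."""
    return sd / mean * 100

low_mean, low_sd = 5.0, 0.4     # illustrative low-concentration control
high_mean, high_sd = 50.0, 0.4  # illustrative high-concentration control

print(f"low level:  CV = {cv(low_sd, low_mean):.1f}%")    # 8.0% - less precise
print(f"high level: CV = {cv(high_sd, high_mean):.1f}%")  # 0.8% - more precise
```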
Example
• The following results were obtained when a sample was assayed several times:
• 17, 18, 19, 20, 20, 20, 21, 21, 22, 22
• Find the mean, SD, CV, and variance.
• Solution:
• Calculate the mean = Σx / n = (18+20+21+17+22+19+20+20+21+22) / 10 = 20.
• Calculate the difference of each result from the mean.
• Square each difference.
• Sum the squares.
Σ(xi − x̄)² = 24
SD = √(24 / 9) ≈ 1.63 units
CV = (1.63 / 20) × 100% ≈ 8.2%
Variance = 24 / 9 ≈ 2.67 units²
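The same calculation can be reproduced step by step in a short Python sketch that follows the solution above (with n − 1 in the denominator, as in the SD formula):

```python
# Sketch: reproduce the worked example (mean, SD, CV, variance) step by step.
results = [17, 18, 19, 20, 20, 20, 21, 21, 22, 22]

n = len(results)
mean = sum(results) / n                         # 200 / 10 = 20
sum_sq = sum((x - mean) ** 2 for x in results)  # sum of squared differences = 24

variance = sum_sq / (n - 1)  # 24 / 9 ≈ 2.67
sd = variance ** 0.5         # ≈ 1.63
cv = sd / mean * 100         # ≈ 8.2%

print(f"mean={mean}, SD={sd:.2f}, CV={cv:.1f}%, variance={variance:.2f}")
```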
Control specimens
The simplest, most straightforward way to check the reproducibility of a method is by including control specimens in the run. These are samples for which the correct answer is known. If control serum or urine is included with the patient samples, and the observed results match the expected results, we can feel confident about the assay and can probably safely assume that the results on the patient samples are also correct.
Control specimens
Control specimens often consist of lyophilized pooled sera or urine, which may be in the normal range or abnormal when reconstituted. QC specimens may be commercially prepared or prepared by the lab by pooling plasma, freezing it, and thawing it when needed. A quality control product usually contains many different analytes; for example, a general chemistry control can contain any number of chemistry analytes, including glucose, protein, electrolytes, and liver enzymes.
Characteristics of a good control
• The composition should be as similar to the patient sample as possible.
• The concentration should be stable under storage for a long period of time.
• The material should show low vial-to-vial variability.
• After a vial has been opened and the material prepared, it should remain stable for the period of use.
• The material should be reasonably priced (not expensive).
• The material should be available in large quantities.
Standard specimen
A substance that can be accurately weighed or measured to produce a solution of an exactly known concentration.
Some Quality Control Techniques
QC can be divided into two major types:
1- Internal QC (intralaboratory QC): primarily monitors the day-to-day performance of laboratory results, i.e., precision.
2- External QC (interlaboratory QC): primarily monitors the accuracy of the results.
Shift
Is defined as a drift of values from one level of the control chart to another, which may be sudden or gradual, and may be due to a failure to recalibrate when changing reagent lot numbers during an analytical process.
Trend
• Is a continuous movement in one direction over six or more consecutive values.
• Trends are difficult to detect without continual charting; a trend may start on one side of the mean and move across it, or it can occur entirely on one side of the mean.
• This problem is usually due to deterioration of the reagents or of the instrument's light source.
Dispersion
• Values may be within the acceptable range (±2s and ±3s) but are unevenly distributed outside the ±1s limits. This indicates a loss of precision and is due to random error.
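The shift, trend, and dispersion patterns described above can be screened for automatically once the control mean and SD are established. Below is a minimal sketch: the trend rule (six or more consecutive values moving in one direction) follows the text, while the run length for a shift and the dispersion window and count are illustrative heuristics, not rules from the original material.

```python
# Sketch: simple control-chart rule checks, given the established mean and SD.

def detect_trend(values, run=6):
    """True if `run` or more consecutive values move in the same direction."""
    up = down = 1
    for prev, cur in zip(values, values[1:]):
        up = up + 1 if cur > prev else 1
        down = down + 1 if cur < prev else 1
        if up >= run or down >= run:
            return True
    return False

def detect_shift(values, mean, run=6):
    """True if `run` consecutive values fall on the same side of the mean
    (the run length here is an illustrative heuristic)."""
    above = below = 0
    for v in values:
        above = above + 1 if v > mean else 0
        below = below + 1 if v < mean else 0
        if above >= run or below >= run:
            return True
    return False

def detect_dispersion(values, mean, sd, window=10, max_outside_1s=5):
    """True if, within the last `window` values, more than `max_outside_1s`
    fall outside ±1 SD while all stay within ±3 SD (loss of precision)."""
    recent = values[-window:]
    outside_1s = sum(1 for v in recent if abs(v - mean) > sd)
    within_3s = all(abs(v - mean) <= 3 * sd for v in recent)
    return within_3s and outside_1s > max_outside_1s

# Illustrative control values rising steadily: flags both a trend and a shift.
controls = [20.1, 19.8, 20.3, 20.6, 20.9, 21.2, 21.5, 21.8]
print(detect_trend(controls))                   # True
print(detect_shift(controls, mean=20.0))        # True
print(detect_dispersion(controls, 20.0, 1.63))  # False
```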
Definition of terms
• Sensitivity: method sensitivity refers to the lowest level of analyte that can be detected by a given method with low CVs.
• Specificity: refers to how specific a test is for a certain substance, without interferences.
• Bias: the difference between the true value and the value obtained.
• Concentration: a measure of the amount of dissolved substance per unit of volume.
• Lyophilized: freeze-dried.
Definition of terms
• Analyte: the constituent or characteristic of the sample to be measured.
• Assay: to analyze a sample of a specimen to determine the amount, activity, or potency of a specific analyte or substance.
• Out of control: indicates that the analysis of patient samples is unreliable.
• Run: a period of time or series of measurements within which the accuracy and precision of the measuring system are expected to be stable.
• Range: the difference between the largest and smallest observed values of a quantitative characteristic, or statistical limits.