LeanTest: Test coverage analysis powered by traceability
Christophe LOTZ, christophe.lotz@aster-technologies.com, ASTER Technologies
BTW2012, IEEE 11th International Board Test Workshop
Agenda: Definitions (Key Objectives, Defect Universe, Test Coverage, Test Efficiency) • Case Studies (Faulty boards, DPMO estimation, Test repeatability, Real contribution of AOI) • Conclusion
Targets: Key objectives
Our targets are to provide tools that:
• Create an effective environment to continually improve the delivered quality of manufacturing processes.
• Assist in reducing the costs of assembly, test, rework, scrap and warranty.
• Help improve line utilization and reduce cycle time.
• Allow manufacturers to better prioritize the deployment of constrained resources.
• Allow manufacturers to benchmark their DPMO rates against others in the industry.
… regardless of board complexity.
Test coverage and traceability
• Good products must be defect-free and low-cost.
• How can we detect or prevent all faults on the product so that only good products are shipped?
• Test coverage is a key metric: it is the quality warranty and the main driving factor for LeanTest.
• This paper describes how traceability tools should be used in order to improve the understanding of test coverage.
(Diagram: defect detection versus defect prevention.)
Defect Universe
Identify the faults that can occur.
(Slide diagram: the defect universe mapped to the test techniques that detect it, i.e. AOI and X-Ray for unpowered placement and solder inspection, In-Circuit test and JTAG.)
• Solder defects: insufficient, excess, cold solder, marginal joints, voids, bridging, gross shorts, shorts, opens, short/open on PCB.
• Placement defects: missing, misaligned, inverted, polarity (PCAP), extra part, tombstone, lifted leads, bent leads.
• Material defects: wrong part, dead part, bad part, functionally bad.
• Powered test capabilities (In-Circuit, JTAG): in-system programming, at-speed memory tests, at-speed interconnect, fault insertion, gate-level diagnosis.
Defect Universe
• Typical manufacturing defects: missing components, wrong value, misalignment, open circuits, tombstone, broken components, incorrect polarity, short circuits, insufficient solder, excessive solder.
• We need to group defects into categories (Material/supply chain, Placement, Solder) to understand which defects can be captured by a particular test strategy.
Test coverage
• The ability to detect defects can be expressed as a number: the coverage.
• Each defect category is associated with its own test coverage.
Test coverage by defect category
• For each category of defects (D), namely Material, Placement, Solder and Functional, we associate the corresponding coverage (C).
• The test efficiency is a coverage weighted by the defect opportunities:

Effectiveness = (D_M·C_M + D_P·C_P + D_S·C_S + D_F·C_F) / (D_M + D_P + D_S + D_F)

• We need better coverage where there are more defect opportunities (higher DPMO)!
(Chart: coverage plotted against DPMO per defect category.)
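As an illustration of the weighted-effectiveness formula above, here is a minimal Python sketch; the category DPMO and coverage figures are invented for the example and are not taken from the paper.

```python
# Weighted test effectiveness: coverage per category (M, P, S, F)
# weighted by the defect opportunities (expressed here as DPMO).

def effectiveness(dpmo: dict, coverage: dict) -> float:
    """Coverage weighted by defect opportunities per category."""
    total_defects = sum(dpmo.values())
    weighted = sum(dpmo[cat] * coverage[cat] for cat in dpmo)
    return weighted / total_defects

# Illustrative figures only (M=Material, P=Placement, S=Solder, F=Functional).
dpmo = {"M": 50, "P": 300, "S": 900, "F": 20}
coverage = {"M": 0.80, "P": 0.95, "S": 0.70, "F": 0.50}

print(f"Effectiveness: {effectiveness(dpmo, coverage):.2%}")  # about 76%
```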
Test coverage by defect category
• Each test technique brings a certain ability to detect the defects defined within the defect universe.
• No single solution is capable of detecting all the defects.
• Good coverage = a combination of tests.
(Diagram: M/P/S/F coverage footprints of AOI, ICT and FT.)
Case 1: Faulty boards at system level
• Electronic plants in charge of board integration often discover a significant number of defective boards at system test.
• How is it possible to get failures at system level if we only buy good boards? The usual explanations:
• The defect appears during packing and transportation (vibration, extreme temperatures, moisture).
• The defect is a dynamic problem which is only revealed when the board is integrated into the complete system.
• The reality is usually simpler…
(Test line: AOI, ICT, BST, FT.)
Case 1: Faulty boards at system level
• If a board fails at system test, it is usually because the escape rate (or slip) is higher than expected.
• There are only two possibilities:
• The combined coverage is lower than optimal.
• The DPMO figures are higher than expected.
(Flow diagram: good and bad boards go through the test; passing boards are shipped (FPY), including bad boards that slip; failing boards go to repair (FOR), including false rejects of good boards.)
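The flow above can be made concrete with a small sketch that splits the boards at one test stage into shipped good boards, slip, false rejects and repaired defects; the probabilities used are illustrative assumptions, not data from the case study.

```python
# Outcome classification for one test stage: a board is either good or bad,
# and either passes or fails the test. Slip = bad boards that pass;
# false reject = good boards that fail.

def stage_outcomes(p_bad: float, coverage: float, false_reject_rate: float) -> dict:
    """Illustrative split of boards by test outcome.
    p_bad: fraction of incoming boards carrying a defect.
    coverage: probability that a defective board is detected.
    false_reject_rate: probability that a good board is wrongly failed."""
    good = 1.0 - p_bad
    return {
        "shipped_good": good * (1.0 - false_reject_rate),
        "slip":         p_bad * (1.0 - coverage),      # bad boards shipped
        "false_reject": good * false_reject_rate,      # good boards sent to repair
        "repaired_bad": p_bad * coverage,              # FOR: real defects found
    }

print(stage_outcomes(p_bad=0.03, coverage=0.90, false_reject_rate=0.01))
```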
Case 1: Faulty boards at system level
• The auditing conclusions were:
• Wrong or inadequate coverage metrics are produced, for example: confusion between accessibility and testability; coverage reported by component only, without incorporating solder-joint figures; over-optimistic, marketing-driven reports.
• Wrong DPMO figures due to limited traceability or incorrect root-cause analysis (for example, confusion between the fault message and the root cause/defect).
(Diagram: coverage estimation versus coverage measurement for the selected strategies.)
Case 2: DPMO estimation
• Going beyond solving surface issues.
(Diagram: test coverage versus DPMO across the defect categories (Material, Placement, Solder), with typical defects such as insufficient solder, shorts, opens, missing components, polarity, broken leads, tombstone, wrong value and misalignment.)
Case 2: DPMO estimation
• The DPMO-weighted coverage is a key factor in estimating the production model.
• We need accurate DPMO values if we want realistic production models.
Case 2: DPMO estimation
• Production model in a test line.
(Diagram: serial test line AOI, ICT, FT; a sketch of such a model follows below.)
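Below is a minimal sketch of such a production model, assuming a simple serial line in which each stage detects a fraction of the remaining defects and the rest slip to the next stage; the stage coverages and the incoming defect rate are illustrative assumptions.

```python
# Serial production model: each stage detects a fraction of the remaining
# defects (its coverage); undetected defects slip to the next stage.
import math

def line_model(defects_per_board: float, stage_coverage: dict) -> None:
    remaining = defects_per_board
    for stage, cov in stage_coverage.items():
        detected = remaining * cov
        fpy = math.exp(-detected)        # boards with no detected defect here
        remaining -= detected            # slip carried to the next stage
        print(f"{stage}: FPY = {fpy:.3f}, slip = {remaining:.4f} defects/board")

# Illustrative coverages for a line AOI -> ICT -> FT.
line_model(0.08, {"AOI": 0.60, "ICT": 0.70, "FT": 0.50})
```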
Case 2: DPMO estimation
• Basic analysis uses average numbers from iNEMI or the PPM-Monitoring.com web site. These averages do not reflect reality, but they are much better than considering all defects as equally probable!
• The best approach is to use the traceability database to extract a table including parameters such as part number, shape, mounting technology, pitch, number of pins, function/class, and DPMO per category (MPSF).
(Diagram: traceability database fed by the assembly machines, the test/inspection machines and the repair stations.)
Case 2: DPMO estimation
• Define data collection methods around existing IPC standards:
• IPC-9261, In-Process DPMO and Estimated Yield.
• IPC-7912, Calculation of DPMO and Manufacturing Indices for PCBAs.
• Define data stratification and classification methods.
• Combine the data into a single database (see the aggregation sketch below):
• DPMO for Material (part number).
• DPMO for Placement (package type).
• DPMO for Soldering (reflow and wave).
• This requires good cooperation between the test and quality departments.
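A minimal sketch of the kind of aggregation described above, assuming a hypothetical defect-record extract from the traceability database; the record layout and figures are illustrative, not prescribed by the IPC standards.

```python
# Aggregate traceability records into DPMO per category.
# DPMO = defects / opportunities * 1_000_000.
from collections import defaultdict

# Hypothetical records: (category, key, defects, opportunities).
# Material keyed by part number, Placement by package type, Solder by process.
records = [
    ("Material",  "RC0603-10K",  2,   500_000),
    ("Placement", "QFP-144",     5,   120_000),
    ("Solder",    "reflow",     40, 2_400_000),
    ("Solder",    "wave",       12,   300_000),
]

dpmo = defaultdict(dict)
for category, key, defects, opportunities in records:
    dpmo[category][key] = defects / opportunities * 1_000_000

for category, table in dpmo.items():
    for key, value in table.items():
        print(f"{category:9s} {key:12s} DPMO = {value:8.1f}")
```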
Case 2: DPMO estimation
• Range and standard deviation for any DPMO statistic.
• Compare actual yield to estimated yield:
• by test step;
• for the full test line.
• Correlation of test coverage/strategy with DPMO rates.
Case 2: DPMO estimation
• During test coverage analysis, TestWay uses various algorithms to estimate the DPMO:
• same part number;
• same shape, for placement DPMO;
• same pitch and number of pins.
• With an accurate DPMO representation, it is possible to:
• Estimate the yield and the escape rate, two key factors used to select the best Contract Manufacturer or EMS (DPMO figures per EMS site).
• Identify the real overlap for test/inspection optimization; DfT becomes one of the principal contributors to cost reduction.
A sketch of this similarity-based fallback follows below.
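The similarity-based fallback can be illustrated with the sketch below; this is not TestWay's actual implementation, and the part history, field names and figures are invented for the example.

```python
# Illustration of the fallback idea described above: look up DPMO by exact
# part number first, then fall back to parts with the same shape, then to
# parts with the same pitch and pin count.
from statistics import mean
from typing import Optional

# Hypothetical history: part number -> (shape, pitch_mm, pins, placement_dpmo).
history = {
    "U12-FPGA":  ("BGA-256", 1.00, 256, 35.0),
    "U7-DSP":    ("BGA-256", 1.00, 256, 42.0),
    "U3-CODEC":  ("QFP-64",  0.50,  64, 18.0),
}

def estimate_dpmo(part: str, shape: str, pitch: float, pins: int) -> Optional[float]:
    if part in history:                                    # same part number
        return history[part][3]
    same_shape = [d for s, p, n, d in history.values() if s == shape]
    if same_shape:                                         # same shape
        return mean(same_shape)
    same_geom = [d for s, p, n, d in history.values() if p == pitch and n == pins]
    if same_geom:                                          # same pitch and pins
        return mean(same_geom)
    return None                                            # no similar part known

print(estimate_dpmo("U99-NEW", "BGA-256", 1.00, 256))      # falls back to shape: 38.5
```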
Case 3: Test Repeatability
• During the first production run, we selected a set of boards on which an SPC analysis was conducted in a GR&R context.
• Gage R&R (Gage Repeatability and Reproducibility) is the amount of measurement variation introduced by a measurement system, which consists of the measuring instrument itself and the individuals using the instrument. A Gage R&R study is a critical step in manufacturing Six Sigma projects, and it quantifies:
• Repeatability: variation from the measurement instrument.
• Reproducibility: variation from the individuals using the instrument.
Case 3: Test Repeatability
• Quality and traceability analysis helps to compute the classic Cp, Cpk … and the CmC.
• CmC means "Calibration and Measurement Capability": CmC = Tolerance / k (k = 6 for critical components).
• A test which is not repeatable cannot claim to qualify a component, so the CmC is used to weight the correctness coverage (see the sketch below).
• In addition, the Failure Mode and Effects Analysis (FMEA) gives the criticality per component, which limits oversized test tolerances.
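One way the CmC could be applied as a weight is sketched below; the linear derating rule and the example values are assumptions made for illustration, not the method defined in the paper.

```python
# Illustrative use of CmC (Calibration and Measurement Capability) to weight
# correctness coverage. The derating rule below is an assumption for the
# sketch, not the method described in the paper.

def cmc(tolerance: float, k: float = 6.0) -> float:
    """CmC = Tolerance / k (k = 6 for critical components)."""
    return tolerance / k

def weighted_correctness(nominal_coverage: float,
                         measurement_spread: float,
                         tolerance: float,
                         k: float = 6.0) -> float:
    """Derate the nominal correctness coverage when the observed measurement
    spread exceeds the CmC limit (hypothetical linear derating)."""
    limit = cmc(tolerance, k)
    if measurement_spread <= limit:
        return nominal_coverage
    return nominal_coverage * limit / measurement_spread

# Example: 10 kOhm resistor, 1 kOhm wide tolerance band, observed spread 250 Ohm.
print(weighted_correctness(nominal_coverage=1.0,
                           measurement_spread=250.0,
                           tolerance=1000.0))   # CmC = 166.7 -> coverage about 0.67
```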
Case 3: Test Repeatability
• Passive measurements:
• Correct value: value is tested at 100%.
• Minor deviation: value is tested at 95%.
• Medium deviation: value is tested at 50%.
• Major deviation: value is not tested.
• Incorrect value: component is not tested.
• For more accuracy, compare the CAD value and tolerances against the minimum and maximum tested values, as in the sketch below.
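A possible way to implement that comparison is sketched below: grade how far the programmed test limits extend beyond the CAD tolerance band and map the result to the tiers above; the thresholds and values are illustrative assumptions.

```python
# Compare the CAD value/tolerance against the tested min/max limits and map
# the deviation to the coverage tiers above (illustrative grading rule).

def value_coverage(cad_value: float, tol_pct: float,
                   tested_min: float, tested_max: float) -> float:
    """Grade the test limits against the CAD tolerance band: the further the
    tested window extends beyond the band, the more wrong-value parts slip."""
    band_lo = cad_value * (1.0 - tol_pct)
    band_hi = cad_value * (1.0 + tol_pct)
    band_width = band_hi - band_lo
    # Portion of the tested window lying outside the tolerance band.
    excess = max(0.0, band_lo - tested_min) + max(0.0, tested_max - band_hi)
    slack = excess / band_width
    if slack <= 0.05:  return 1.00   # correct value: tested at 100%
    if slack <= 0.25:  return 0.95   # minor deviation
    if slack <= 1.00:  return 0.50   # medium deviation
    return 0.0                       # major deviation: value not tested

# 10 kOhm, 5% part (band 9.5k..10.5k), test limits programmed at 9.3k..10.7k.
print(value_coverage(10_000, 0.05, 9_300, 10_700))   # slack = 0.4 -> 0.50
```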
Case 4: Real contribution of AOI/AXI
• AOI and AXI are inspection techniques which check for deviations.
• When a deviation is big enough, it should be considered a defect.
• When a test line includes both an inspection machine and an electrical tester (ICT, BST), it is difficult to agree on each test's contribution.
(Diagram: relative contributions of AXI, BST and FT.)
Case 4: Real contribution of AOI/AXI
• With a traceability system that collects defect/repair information in real time, we are able to record that a fault has been detected and how it has been diagnosed (i.e. root-cause analysis).
• We compare the defects detected by ICT and FT against the defects detected by AOI in order to adjust the real coverage (a sketch follows below).
(Diagram: diagnosis/repair data from AOI, ICT and FT feeding the database used for test coverage analysis.)
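A minimal sketch of that comparison, assuming a hypothetical repair-record extract from the traceability database; it simply counts, per test stage, the share of confirmed defects that the stage actually detected.

```python
# Illustration of adjusting a stage's claimed contribution from observed
# detections in the traceability database (record layout is hypothetical).
from collections import Counter

# Each repair record: (defect_category, stage_that_detected_it).
repairs = [
    ("Solder", "AOI"), ("Solder", "AOI"), ("Solder", "ICT"),
    ("Placement", "AOI"), ("Placement", "ICT"), ("Placement", "FT"),
    ("Material", "ICT"), ("Material", "FT"),
]

detected_by_stage = Counter(stage for _, stage in repairs)
total = len(repairs)

# Observed share of all confirmed defects caught at each stage.
for stage in ("AOI", "ICT", "FT"):
    share = detected_by_stage[stage] / total
    print(f"{stage}: detected {detected_by_stage[stage]}/{total} defects "
          f"({share:.0%} of confirmed defects)")
```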
Conclusion
• Continual reassessment of capability metrics.
• Improved accuracy of quality estimations.
• Enhanced defect detection rate through a better understanding of test coverage.
• Reduced escape rate (fewer bad boards reaching the customer).
The zero-defect road…