FDA and Pharmaceutical Manufacturing Research Projects Jeffrey T. Macher and Jackson A. Nickerson, Co-Principal Investigators
Presentation Overview • Executive summary • Project goals • Data collection and synthesis • Analysis methodology • Findings • Development opportunities and constraints
Executive Summary • We develop statistical models that predict: • The probability of a facility being chosen for inspection. • The effect of investigator training, experience, and individual effects on the probability of investigational outcomes. • The characteristics and identities of facilities that correlate with the probability of non-compliance. • We present initial results for each of these analyses. • We identify additional opportunities and next steps to create value, along with some constraints.
FDA Research Project Goals • Risk-based assessment of FDA cGMP outcomes. • Identify underlying ability of investigators and their training. • Identify underlying compliance of each facility. • Identify attributes (currently recorded by the FDA) that impact inspection outcomes. • Transfer “learning” to FDA.
Progress to Date • Just as new drugs go through • Discovery • Development and • Commercialization… • Our model and this presentation conclude the discovery phase of our project. • Please think of our model as a “platform” that can be developed to assess a variety of compliance issues.
FDA Project Approach • Compile and link FDA databases. • Estimate the likelihood of various outcomes: • NAI, VAI, OAI; Warning Letters; Field Alerts; Product Recalls. • based on… • compound/product, facility, firm, FDA district, investigator, and training-derived factors. • in order to… • evaluate the allocation of investigational resources. • assess the effectiveness of investigator training and management.
FDA Databases • DQRS (Field alerts) • EES • FACTS (Inspections) – CDER only • Product Listing • Product Recalls • Product Shortages • Facility Registration (DRLS) • ORA Training database • Warning letter database
Data Preparation • Started with FACTS (1990-2003). • Manufacturing facilities only. • Assembled investigator training database: • Identified corporate ownership by plant by year and firms operating at a specific facility each year. • Constructed facility-year data • Added observations for years NOT inspected. • Corrected FEI/CFN mismatches. • Constructed numerous other variables.
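As a rough illustration of this step, the sketch below shows one way the facility-year panel (with added rows for years NOT inspected) could be assembled in pandas; the file name and columns ('fei', 'fiscal_year', 'outcome') are hypothetical placeholders, not the project's actual data layout.

```python
# A minimal sketch of the facility-year panel construction, assuming a
# FACTS-style inspection extract with hypothetical columns 'fei' (facility
# identifier), 'fiscal_year', and 'outcome'; the file name is illustrative.
import pandas as pd

inspections = pd.read_csv("facts_inspections.csv")  # hypothetical extract

# One row per facility per fiscal year with at least one inspection.
facility_year = (
    inspections
    .groupby(["fei", "fiscal_year"], as_index=False)
    .agg(n_inspections=("outcome", "size"))
)

# Add observations for facility-years that were NOT inspected, so every
# facility appears in every fiscal year 1990-2003.
all_years = pd.MultiIndex.from_product(
    [facility_year["fei"].unique(), range(1990, 2004)],
    names=["fei", "fiscal_year"],
)
panel = (
    facility_year
    .set_index(["fei", "fiscal_year"])
    .reindex(all_years, fill_value=0)
    .reset_index()
)
panel["inspected"] = (panel["n_inspections"] > 0).astype(int)
```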
Some basic “facts” about the FDA data • Years covered: FY 1990-2003 • Total number of facilities inspected: 3,753 • Total number of “PAC codes”: 38,341 • Total number of inspections: 14,162 • Total number of investigators: 783
Empirical Methodology • Inspection • Probability of choosing a facility to inspect. • Detection • Probability of a non-compliance inspection outcome. • Noncompliance • Probability of noncompliance, inspection, and detection. • Detection control estimation.
Inspection • Groups of variables: • Technology variables: Rx Prompt Release, Extended or Delayed Release, Gel Cap, Soft Gel Cap, Ointment, Liquid, Powder, Gas, Parenteral, Lg. Vol. Parenteral, Aerosol, Bulk, Sterile, Suppositories • Industry variables: Vitamins (IC 54), Necessities (IC 55), Antibiotics (IC 56), Biologics (IC 57) • Inspection decision variables: • Ln(Days between inspections) • Surveillance = reason for inspection (0 = Compliance) • Last inspection outcome (1 = OAI, 0 = NAI, VAI) • Years 1992-2003 (binary variables for each year)
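The inspection analysis is a probit of the decision to inspect a facility-year on these variable groups. The sketch below shows, only as an illustration, how such a probit could be estimated with statsmodels; the file name and every column name (e.g. 'parenteral', 'surveillance', 'last_outcome_oai') are assumptions standing in for the variables listed above.

```python
# A minimal sketch of the inspection-choice probit, assuming a facility-year
# panel with hypothetical columns mirroring the variable groups above; the
# file name and column names are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

panel = pd.read_csv("facility_year_panel.csv")  # hypothetical panel

panel["ln_days_between"] = np.log(panel["days_between_inspections"])
year_dummies = pd.get_dummies(panel["fiscal_year"], prefix="fy", drop_first=True)

covariates = [
    "parenteral", "sterile",                # example technology dummies
    "antibiotics_ic56", "biologics_ic57",   # example industry dummies
    "ln_days_between",                      # Ln(days between inspections)
    "surveillance",                         # reason for inspection (0 = compliance)
    "last_outcome_oai",                     # last inspection outcome was OAI
]
X = sm.add_constant(pd.concat([panel[covariates], year_dummies], axis=1).astype(float))
y = panel["inspected"]

probit = sm.Probit(y, X).fit()
print(probit.summary())
# Marginal effects: change in the probability of inspection per covariate.
print(probit.get_margeff().summary())
```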
Inspection: Explained Variance • Probit analysis of the decision to inspect.
Variable group: ΔR² | Cumulative R²
Technology variables: 12% | 12%
Industry variables: 9% | 21%
Inspection decision variables: 20% | 51%
Year dummy variables: ~0% | 51%
Omitted categories: Human Drugs (IC 60-66), select technologies, Year dummies 1990-91. A foreign-inspection indicator was included but uniquely identifies many inspections and is dropped from the analysis.
Technology Variables: Change in Probability of Inspection [chart] • Significance markers: ** 99% confidence interval, * 95% confidence interval, + 90% confidence interval. • Omitted categories: Not Classified, Bacterial antigens, Bacterial vaccines, Modified bacterial vaccines, Blood serum, Immune serum.
Industry and Inspection Variables: Change in Probability of Inspection [charts: Industry Variables; Inspection Variables] • Omitted category: Human drugs. • Significance markers: ** 99% confidence interval, * 95% confidence interval, + 90% confidence interval.
Probability of Inspection [charts: probability of inspection by years since last inspection and by days between inspections]
Detection • Groups of variables • Technology • Industry • Training • Total training days prior to inspection (other than 5 main drug courses) • Drug course 1: Basic drug school • Drug course 2: Advanced drug school • Drug course 3: Pre-approval inspections • Drug course 4: Active Pharmaceutical Ingredient Mfg. • Drug course 5: Industrial sterilization • Investigator Experience • Number of inspections in the prior 12 months • Number of inspections in the prior 12-24 months • ORA District Office • Investigator Classification • A consolidation of position classifications
Detection: Explained Variance • Probit analysis of inspection outcomes (detection of non-compliance).
Variable group: ΔR² | Cumulative R²
Technology variables: 0.9% | 0.9%
Industry variables: 0.3% | 1.2%
Training and experience variables: 0.3% | 1.5%
Office and position variables: 1.4% | 2.9%
Investigator effect: 4.2% | 7.1%
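One way to produce this kind of decomposition is to add the variable groups to the detection probit one at a time and record the cumulative pseudo-R² after each step, with the investigator effect entered as a set of investigator dummies. The sketch below illustrates that idea; the data file, column names, and choice of McFadden's pseudo-R² are assumptions, not the project's actual implementation.

```python
# A minimal sketch of the explained-variance decomposition for the detection
# probit: variable groups are added sequentially and McFadden's pseudo-R2 is
# recorded after each addition. All column names are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("inspection_level_data.csv")  # hypothetical inspection-level data
y = df["oai_outcome"]  # 1 = OAI, 0 = NAI/VAI

# Treat office, position, and investigator identifiers as categorical so that
# get_dummies expands them into fixed-effect dummies.
for col in ["ora_district", "position_class", "investigator_id"]:
    df[col] = df[col].astype(str)

groups = {
    "technology": ["parenteral", "sterile", "aerosol"],
    "industry": ["antibiotics_ic56", "biologics_ic57"],
    "training_experience": ["training_days", "drug_school_basic",
                            "inspections_prior_12m"],
    "office_position": ["ora_district", "position_class"],
    "investigator_effect": ["investigator_id"],
}

included, cumulative_r2 = [], {}
for name, members in groups.items():
    included += members
    X = sm.add_constant(pd.get_dummies(df[included], drop_first=True).astype(float))
    result = sm.Probit(y, X).fit(disp=False)
    cumulative_r2[name] = result.prsquared  # McFadden pseudo-R2

print(cumulative_r2)
```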
Training and Experience Variables: Change in Probability of Detection¹ [chart] • ¹ Without investigator fixed effects.
ORA Office and Classification Variables: Change in Probability of Detection² [charts: ORA Office Variables; Position Variables] • ² With investigator fixed effects.
Non-compliance • Detection Control Estimation • Relatively new procedure used in academic literature. • Used for assessing tax evasion, EPA compliance, and other applications. • FDA application more complicated than other applications. • Assume three actors: • Facility decides level of compliance. • Inspection decision-maker chooses when to inspect. • Investigator chooses detection or not. • Estimate all three processes simultaneously.
Non-compliance model • Assume inspection decisions are non-random. • Assumption is different from other applications. • Construct a likelihood function that models the probabilities of: • a plant being selected for inspection and • the outcome of the inspection.
Constructing a Likelihood Function [likelihood tree] • L2i = 0: facility i is not inspected. • L2i = 1: facility i is inspected. • Given inspection, L1i = 1: facility i is non-compliant; L1i = 0: facility i is compliant. • Given inspection and non-compliance, L3i = 1: facility i is found non-compliant; L3i = 0: facility i is found compliant.
Likelihood Function • Three probabilities are combined to form the function: • Probability that a non-compliant facility is inspected and detected: L1i=1, L2i=1, L3i=1 • Probability of inspecting and not detecting noncompliance: • probability that the facility is compliant: L1i=0, L2i=1 • probability that noncompliance goes undetected: L1i=1, L2i=1, L3i=0 • Probability that a facility is not inspected in a given year: L2i=0
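Written out, and only as a notational sketch consistent with the cases above, the likelihood can be expressed with probit stage probabilities. The symbols below (p1i, p2i, p3i, xji, βj, Φ) are our notation for illustration; the project's exact functional forms are not shown here.

```latex
% A notational sketch only: p1, p2, p3 are probit probabilities for
% non-compliance, inspection, and detection; x_ji and beta_j are the
% covariates and coefficients of each stage.
\[
  p_{1i} = \Phi(x_{1i}'\beta_1), \qquad
  p_{2i} = \Phi(x_{2i}'\beta_2), \qquad
  p_{3i} = \Phi(x_{3i}'\beta_3)
\]
\[
  \mathcal{L} \;=\;
  \prod_{i:\,L_{2i}=0} \bigl(1 - p_{2i}\bigr)
  \prod_{i:\,L_{2i}=1,\;L_{3i}=1} p_{2i}\, p_{1i}\, p_{3i}
  \prod_{i:\,L_{2i}=1,\;L_{3i}=0} p_{2i}\,\bigl[(1 - p_{1i}) + p_{1i}(1 - p_{3i})\bigr]
\]
```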
Estimating the Likelihood Function • Select covariates associated with non-compliance, selection, and detection. • Non-compliance: facility-related characteristics. • Selection: factors currently used in selecting facilities. • Detection: investigator-related factors. • Use maximum likelihood estimation to find the coefficient estimates that maximize the function. • Initialize parameter estimates with results from the inspection and detection analyses.
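A minimal sketch of what that maximization could look like in code is shown below, using a generic probit form for each of the three stages and scipy's numerical optimizer; the design matrices, indicator vectors, and probit specification are assumptions for illustration, not the estimator actually used in the project.

```python
# A minimal sketch of maximizing the detection-controlled-estimation
# likelihood with scipy. X1, X2, X3 are hypothetical design matrices for the
# non-compliance, selection, and detection equations; 'inspected' and
# 'detected' are 0/1 outcome vectors. None of these names come from the
# project's code.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def neg_log_likelihood(theta, X1, X2, X3, inspected, detected):
    k1, k2 = X1.shape[1], X2.shape[1]
    b1, b2, b3 = theta[:k1], theta[k1:k1 + k2], theta[k1 + k2:]
    p1 = norm.cdf(X1 @ b1)  # Pr(facility is non-compliant)
    p2 = norm.cdf(X2 @ b2)  # Pr(facility is inspected)
    p3 = norm.cdf(X3 @ b3)  # Pr(non-compliance is detected, given inspection)

    eps = 1e-12  # guard against log(0)
    ll_not_inspected = (1 - inspected) * np.log(1 - p2 + eps)
    ll_detected = inspected * detected * np.log(p2 * p1 * p3 + eps)
    ll_not_detected = inspected * (1 - detected) * np.log(
        p2 * ((1 - p1) + p1 * (1 - p3)) + eps
    )
    return -np.sum(ll_not_inspected + ll_detected + ll_not_detected)

# theta0 would be initialized from the separate inspection and detection
# probits, as described above, then:
# result = minimize(neg_log_likelihood, theta0,
#                   args=(X1, X2, X3, inspected, detected), method="BFGS")
```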
Predicted Level of Facility Non-compliance for the 50 Most Inspected Facilities [chart: facilities ranked by predicted non-compliance] • Legend: statistically more noncompliant than the mean facility; statistically not different from the mean facility; statistically more compliant than the mean facility.
Immediate Implications • Inspection and Non-compliance • New suggestions for inspection choices. • Use non-compliance analysis to assess risk of any given facility, firm, or technology. • Increase focus on particular facilities and attributes. • Ownership changes. • Mixed strategy inspection plan. • Detection • Use detection analysis to assess quality of investigators and their training. • Focus investigator activities to build and maintain short-run experience.
Broader Implications • Our statistical methods provide a test-bed for asking and answering management and oversight questions. • Further development is needed. • DCE has potentially broad applicability to CDER and other centers at the FDA, including CBER, foods, etc. • What facilities are most at risk of non-compliance? • Baseline non-compliance • Technology • Ownership changes, etc. • What manufacturers are more/less prone to non-compliance? • DCE has implications for the type, format, and processing of data to be collected and analyzed.
Development Opportunities • Additional variables can be, and are being, constructed to examine additional issues. • Recalls, shortages, supplement filings. • Finer-grained information on technology, manufacturing knowledge, and organizational capabilities. • Evaluate manufacturer data collected in our study. • Weight more recent investigations more heavily. • Expand to the full set of investigators and facilities (requires additional computational resources). • Evaluate endogeneity concerns.
Development Constraints • Software and computer limitations. • Data preparation manpower. • Funding resources are nearly exhausted. • Teaching commitments.
Current Plan • Document current progress in a white paper. • Further develop data in hand (EES, Shortages, etc.). • We received cooperation from The Gold Sheet. • Work with you to develop a plan for transferring results to the FDA. • Look for additional funding sources.