The Role of the Statistician and the Data Manager in Clinical Trials
Dr Richard Kay, Consultant, RK Statistics
Dr Andy Richardson, Consultant, Zenetar
Tuesday 5th Feb 2013
Postgraduate Course in Pharmaceutical Medicine, Module 3: Clinical Development
Statisticians’ Role
• Contribute expert input to the study design
  • Type of study (looking for differences, or to show similarity?)
  • Sample size calculation
  • Dealing with complexity (multiplicity, interim analysis, missing data, …)
• Develop and undertake the statistical analysis of the data
  • Protocol statistical methods section
  • Statistical Analysis Plan
  • Statistical Report, …
• Other
  • Support IDMCs
  • Papers/presentations, …
  • Regulatory questions often relate to statistical issues
Planning and Setup
• Statistical issues in design
  • Superiority, equivalence or non-inferiority
  • How many patients
  • Method of randomisation
  • Statistical methods for analysis
  • Dealing with multiplicity
  • Interim analysis
  • Adaptive designs
Superiority, Equivalence or Non-Inferiority
• Superiority: show that Drug A is better than Drug B, or that Drug A works (compared with placebo)
• Equivalence: demonstrate that Drug A and Drug B are ‘equally effective’
• Non-Inferiority: demonstrate that Drug A is ‘at least as good as’ Drug B
How many patients – Superiority
• Level of significance (usually 5%)
• Power required (ICH E9 recommends at least 80%)
• Level of effect we are looking to detect (the clinically relevant difference)
• Patient-to-patient variation, for continuous endpoints
• … and out pops N!
• This is a fixed sample size design
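To make the “out pops N” step concrete, here is a minimal sketch (not from the slides) of the standard two-sample formula for a continuous endpoint, using only the Python standard library; the example values (sd = 10, difference = 5) are illustrative assumptions.

```python
from math import ceil
from statistics import NormalDist

def n_per_arm(delta, sd, alpha=0.05, power=0.80):
    """Approximate patients per arm for a two-arm superiority trial with a
    continuous endpoint: n = 2 * (z_{1-a/2} + z_{1-b})^2 * (sd/delta)^2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance level
    z_beta = NormalDist().inv_cdf(power)            # power requirement
    return ceil(2 * (z_alpha + z_beta) ** 2 * (sd / delta) ** 2)

# Illustrative numbers only: detect a difference of 5 units with sd = 10,
# 5% two-sided significance and 80% power -> about 63 patients per arm.
print(n_per_arm(delta=5, sd=10))
```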
Method of Randomisation
• Simple randomisation
• Block randomisation – block size
• Stratified randomisation
  • how many factors
  • which factors, and at what levels
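As an illustration of block randomisation (a sketch, not a production randomisation system), the following generates a 1:1 treatment list in permuted blocks; the block size and seed are arbitrary choices for the example.

```python
import random

def blocked_randomisation(n_patients, block_size=4, arms=("A", "B"), seed=2013):
    """Generate a 1:1 randomisation list in permuted blocks.

    Each block contains an equal number of allocations to each arm,
    shuffled independently, so treatment groups stay balanced over time.
    """
    assert block_size % len(arms) == 0, "block size must be a multiple of the number of arms"
    rng = random.Random(seed)  # fixed seed so the list is reproducible
    schedule = []
    while len(schedule) < n_patients:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)
        schedule.extend(block)
    return schedule[:n_patients]

print(blocked_randomisation(12))  # e.g. ['B', 'A', 'A', 'B', ...]
```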
Statistical Methods for Analysis
• Endpoint type
  • continuous
  • binary (responder/non-responder)
  • ordinal (none/mild/moderate/severe)
  • time to event
• Adjustment for baseline risk factors (covariates) – adjusted analysis, analysis of covariance (see the sketch below)
• Superiority, Equivalence or Non-Inferiority (the methods of analysis are very different)
• Sensitivity analyses
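For the covariate-adjustment bullet, a minimal analysis-of-covariance sketch using the pandas and statsmodels packages (assumed available); the simulated data and the treatment effect of 5 units are purely illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 100
baseline = rng.normal(50, 10, n)               # baseline measurement
treatment = np.repeat(["A", "B"], n // 2)      # two arms, 50 patients each
# Simulated outcome: depends on baseline, plus a treatment effect of 5 for arm B
outcome = 0.8 * baseline + np.where(treatment == "B", 5.0, 0.0) + rng.normal(0, 8, n)
df = pd.DataFrame({"outcome": outcome, "baseline": baseline, "treatment": treatment})

# ANCOVA: treatment comparison adjusted for the baseline covariate
model = smf.ols("outcome ~ C(treatment) + baseline", data=df).fit()
print(model.params)    # adjusted treatment effect estimate
print(model.pvalues)
```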
Dealing with Multiplicity
• Single primary endpoint, or several
• Multiple dose groups
• Implications for multiple testing and adjustment of α
• Avoiding adjustment
  • Hierarchical (or closed) testing
• Strategy for secondary endpoints
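A small sketch contrasting a Bonferroni adjustment with hierarchical (fixed-sequence) testing; the p-values are invented for illustration.

```python
def bonferroni(p_values, alpha=0.05):
    """Reject H_i only if p_i <= alpha / m: controls the familywise error rate."""
    m = len(p_values)
    return [p <= alpha / m for p in p_values]

def hierarchical(p_values_in_order, alpha=0.05):
    """Fixed-sequence testing: test each hypothesis at the full alpha, in a
    pre-specified order, stopping at the first non-significant result.
    No alpha adjustment is needed, but the order must be pre-planned."""
    results = []
    for p in p_values_in_order:
        if p <= alpha and (not results or results[-1]):
            results.append(True)
        else:
            results.append(False)
    return results

ps = [0.010, 0.030, 0.200]    # illustrative p-values, in pre-specified order
print(bonferroni(ps))         # [True, False, False]: each tested at alpha/3
print(hierarchical(ps))       # [True, True, False]: each tested at full alpha
```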
Interim Analysis
• Offers the opportunity to stop early, for several reasons
  • Overwhelming efficacy
  • Futility
• Pre-planning required + careful management of unblinding
• Impacts on α (see the spending-function sketch below)
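To illustrate the “impacts on α” point, a sketch of an O'Brien–Fleming-type alpha-spending function in the Lan–DeMets form, standard library only; the half-way look is an arbitrary choice for the example.

```python
from statistics import NormalDist

def obf_spent_alpha(t, alpha=0.05):
    """O'Brien-Fleming-type spending function (Lan-DeMets form):
    cumulative two-sided alpha spent at information fraction t (0 < t <= 1)."""
    z = NormalDist().inv_cdf(1 - alpha / 2)
    return 2 * (1 - NormalDist().cdf(z / t ** 0.5))

# Very little alpha is spent at a half-way interim look, preserving
# nearly the full 5% for the final analysis.
print(round(obf_spent_alpha(0.5), 4))  # ~0.0056 spent at t = 0.5
print(round(obf_spent_alpha(1.0), 4))  # 0.05 at the final analysis
```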
Adaptive Design
• Can re-visit the sample size calculation as the trial data accumulate
• If done in a blinded way, there is no price to pay in α
• If done in an unblinded way, this constitutes an adaptive design
  • This does impact on α, and adds complexity to the design and analysis
• Adaptive design thinking can also be used to modify the trial in other ways
  • e.g. drop a treatment arm
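A minimal sketch of blinded sample-size re-estimation, re-using the formula from the earlier sample-size sketch: the pooled standard deviation from blinded interim data replaces the planning assumption. The interim values are invented; note that a blinded pooled SD also absorbs any treatment effect, which a real re-estimation procedure would account for.

```python
from math import ceil
from statistics import NormalDist, stdev

def n_per_arm(delta, sd, alpha=0.05, power=0.80):
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return ceil(2 * (z_a + z_b) ** 2 * (sd / delta) ** 2)

# Planning assumption: sd = 10 -> 63 patients per arm (as before).
planned = n_per_arm(delta=5, sd=10)

# Blinded interim data (treatment labels unknown): pooled sd turns out larger.
interim_values = [48, 61, 55, 70, 42, 66, 58, 75, 50, 63]  # illustrative only
blinded_sd = stdev(interim_values)

revised = n_per_arm(delta=5, sd=blinded_sd)
print(planned, round(blinded_sd, 1), revised)  # e.g. 63, 10.3, 67
```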
Statistical Methods Section of the Protocol
• What should it contain?
  • Justification of sample size
  • Primary and secondary endpoints – clear delineation
  • Analysis datasets (intention to treat / per protocol)
  • Handling of missing data
  • Methods of statistical analysis of the primary endpoint in detail (including adjustments for covariates); broad outline of methods for secondary endpoints
  • Dealing with multiplicity
  • Presentation of safety data
  • Methods of interim analysis
Role of the Data
• The data are the critical asset – for the study, for the company, for the regulator
• Accuracy and confidence
  • Must meet the study objectives
  • Must comply with regulatory requirements
  • Observations must be accurate and confirmed
• Data are required to support
  • the observations/measurements of the trial
  • the conduct of the trial
Role of the Data
• Observations/measurements of the trial
  • Planned: e.g. research/study results
  • Not planned: e.g. safety data (events)
• Conduct of the trial
  • Data to confirm that the observational data have been collected and processed consistently
  • Data to confirm who operated on the data, and when
  • Requirement to be able to ‘recreate’ the study
FDA ‘ALCOA’ Principles
• Attributable
  • Data records must indicate who recorded the information
  • Data must remain under investigator control
• Legible
  • Must include subject data, metadata and audit trails
  • Available in human-readable form
• Contemporaneous
  • Recording as close to the observation/measurement as reasonably possible
  • Audit trails provide evidence of timing
• Original
  • Original data, or an accurate transcription of the original
• Accurate
  • Data must remain unaltered when saved to the database
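To show how the Attributable and Contemporaneous principles might translate to a database, a toy append-only audit-trail record; this is an illustration of the idea, not the design of any particular EDC system, and all names and values are invented.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditRecord:
    """One immutable audit-trail entry: who changed what, when, and why."""
    user: str                  # Attributable: who recorded/changed the value
    subject_id: str
    field_name: str
    old_value: str
    new_value: str
    reason: str
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)  # Contemporaneous
    )

audit_trail: list[AuditRecord] = []   # append-only: entries are never edited

audit_trail.append(AuditRecord(
    user="dr_smith", subject_id="001-0042", field_name="diastolic_bp",
    old_value="82", new_value="88", reason="Transcription error on CRF",
))
print(audit_trail[0])
```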
Data Managers’ Role
• The three Data Management ‘C’s
  • Collect the data
  • Collate the data
  • Confirm the data
• Role of CDM
  • “… to organise and ensure the collection of accurate data from the trial, to capture the data on a database, to validate and correct the data, and to provide ‘clean’ data to the statistician in a form that will facilitate the statistical analysis” (Principles of Clinical Research)
Planning and Setup
[Lifecycle diagram: Planning & Setup → Operational Implementation & Configuration → Study Execution, Data Collection & Review → Study Close & Archive, with Infrastructure & Regulatory Framework, Project Management, and Human Resources & Training as cross-cutting activities]
CDM Role and Responsibilities
• Key tasks
  • Design and build the study-specific database/data collection tools and supporting technologies
  • Review and resolve inconsistencies in the data
  • Confirm data-related compliance requirements
  • Manage and maintain the CDM technical infrastructure
Data Management
• Design and build a study-specific database/data collection tools and supporting technologies
• Specify and develop the required study data review and resolution methods
  • Protocol (Study Flowchart)
  • CRF
  • Database
Operational Implementation
[Lifecycle diagram as above, at the Operational Implementation & Configuration phase]
Operational Implementation
• Study database and process validation
  • Test data entry, checks, exports, etc.
  • Confirm processes (SOPs, alerts, etc.)
  • Document (Study Validation Plan)
• Load/enter study static data
  • Site information
  • Dictionary/extended codelists
  • Randomisation list
• Authorise users
  • Study/project team
  • Investigators
Study Execution
[Lifecycle diagram as above, at the Study Execution, Data Collection & Review phase]
Data Entry Process
• Receipt and management of study documents
  • CRFs, DCFs, etc.
• Populating the clinical trial database
  • Entry by forms, loading data
  • Identification of some types of inconsistency
  • Correction of some types of error
• Confirmation of the accuracy of the transcription
  • Double data entry
    • entry and blind verification
    • entry and interactive verification
  • Single entry with review
  • Single entry without review
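A sketch of the blind-verification idea: two independent entry passes of the same CRF page are compared field by field, and only the discrepancies are raised for review; the records are invented.

```python
def compare_entries(first_pass, second_pass):
    """Compare two independent data-entry passes of the same CRF page.

    Returns a list of (field, first_value, second_value) discrepancies
    that a data manager would resolve against the paper CRF.
    """
    discrepancies = []
    for fld in sorted(set(first_pass) | set(second_pass)):
        v1, v2 = first_pass.get(fld), second_pass.get(fld)
        if v1 != v2:
            discrepancies.append((fld, v1, v2))
    return discrepancies

entry_1 = {"subject": "001-0042", "visit": "Week 4", "weight_kg": "72.5"}
entry_2 = {"subject": "001-0042", "visit": "Week 4", "weight_kg": "75.2"}  # digits transposed
print(compare_entries(entry_1, entry_2))  # [('weight_kg', '72.5', '75.2')]
```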
Data Review Process
• Reviewing/checking the data for consistency and accuracy
• Checks that
  • variables are of the correct type
  • variables contain valid values
  • data are reasonable (within ranges)
  • duplicate values are eliminated
  • missing data are confirmed or resolved
  • key variables are unique
  • chronology is consistent
  • implied or explicit logic is consistent
  • the implied or explicit existence of related data is confirmed
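A compressed sketch of a few of these edit checks (correct type, valid values, reasonable range, chronology) run over an invented record; a real system would drive such checks from a study-specific validation specification.

```python
from datetime import date

def edit_checks(rec):
    """Run simple consistency checks on one record; return a list of findings."""
    findings = []
    if not isinstance(rec.get("age"), int):                  # correct type
        findings.append("age is not an integer")
    elif not 18 <= rec["age"] <= 99:                         # reasonable range
        findings.append(f"age {rec['age']} outside 18-99")
    if rec.get("sex") not in {"M", "F"}:                     # valid values
        findings.append(f"invalid sex code: {rec.get('sex')!r}")
    if rec.get("visit_date") and rec.get("consent_date"):    # chronology
        if rec["visit_date"] < rec["consent_date"]:
            findings.append("visit date precedes consent date")
    return findings

record = {"age": 17, "sex": "X",
          "consent_date": date(2013, 2, 1), "visit_date": date(2013, 1, 28)}
for finding in edit_checks(record):
    print(finding)
```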
Data Review Process
• Inconsistencies are documented, resolved or confirmed
• Data Clarification Forms
  • document the inconsistency
  • request confirmation or resolution
  • may suggest a resolution
• The investigator receives and responds to the DCF
• The CRF plus associated DCFs represent the final state of the study data in the database
Data Coding
• Categorising data for
  • use in subsequent analysis (Yes=1, No=0)
  • consistency with specialist lists (e.g. microbiology)
• Resolving variably reported original data into a standard form
• Coding using expert terms
  • Dictionaries for medical history, adverse events, medications
  • MedDRA, WHO Drug Dictionary, ICD-9/10, etc.
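A toy illustration of resolving variably reported terms to a standard coded form; the mini-dictionary below stands in for MedDRA-style coding (the mappings are invented, since real dictionary content is licensed), and unmatched terms fall through to manual coding.

```python
# Invented mini-dictionary mapping verbatim (as-reported) terms to a
# standard coded term, in the spirit of MedDRA coding.
VERBATIM_TO_CODED = {
    "headache": "Headache",
    "head ache": "Headache",
    "mild headaches": "Headache",
    "nausea": "Nausea",
    "felt sick": "Nausea",
}

def code_term(verbatim):
    """Return the coded term, or flag the verbatim text for manual coding."""
    key = verbatim.strip().lower()
    return VERBATIM_TO_CODED.get(key, f"UNCODED: {verbatim!r} -> manual review")

for reported in ["Head ache", "FELT SICK", "photophobia"]:
    print(code_term(reported))
```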
Adverse Event Reconciliation
• The study CRF/database will contain n adverse events, of which m (m < n) will have been reported under expedited reporting rules (i.e. Serious Adverse Events)
• Serious Adverse Events are reported immediately, managed by a pharmacovigilance group, and stored both in specific pharmacovigilance systems and in the study database
• Study AEs are documented on the CRF and stored in the study database, and are only available to the study team some time after the event
Adverse Event Reconciliation
• The study AE database and the Serious AE database must be consistent
  • Number of SAEs
  • Subject
  • Reported event
  • Duration (start date, end date)
  • Outcome, etc.
  • Relationship to study drug
• Typical method
  • Compare all data in each database for each SAE; resolve all inconsistencies
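A sketch of that typical method: key each SAE by (subject, event) and compare the two databases field by field. The records and field names are illustrative, not from any particular pharmacovigilance system.

```python
def reconcile(study_db, safety_db, fields=("start", "end", "outcome", "related")):
    """Compare SAE records held in the study (CDM) database and the
    pharmacovigilance database, keyed by (subject, event)."""
    issues = []
    for key in sorted(set(study_db) | set(safety_db)):
        a, b = study_db.get(key), safety_db.get(key)
        if a is None or b is None:
            issues.append((key, "present in only one database"))
            continue
        for f in fields:
            if a.get(f) != b.get(f):
                issues.append((key, f"{f}: {a.get(f)!r} vs {b.get(f)!r}"))
    return issues

cdm = {("001-0042", "Myocardial infarction"): {"start": "2013-01-10", "end": "2013-01-20",
                                               "outcome": "Recovered", "related": "No"}}
pv  = {("001-0042", "Myocardial infarction"): {"start": "2013-01-10", "end": "2013-01-21",
                                               "outcome": "Recovered", "related": "No"}}
for issue in reconcile(cdm, pv):
    print(issue)   # the end-date mismatch to resolve
```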
Close, Archive and Reporting
[Lifecycle diagram as above, at the Study Close & Archive phase]
Database Close & Locking Process
• Closing a study database
  • Confirmation that all data originally collected are present
  • Confirmation that the ‘data quality’ is acceptable (error rate)
  • Confirmation that all data have been subject to the documented procedures
  • Report on the omissions, inconsistencies and issues discovered or remaining in the data
  • Removal of access, to prevent further data modification
• Unlocking a study database
  • Formal process to enable post-lock (blind broken) data modifications to occur
Regulatory Framework & Infrastructure
[Lifecycle diagram as above, highlighting the Infrastructure & Regulatory Framework layer]
Regulatory Framework & Infrastructure
• DATA RELATED
  • ICH Good Clinical Practice
    • E6: Good Clinical Practice
    • E9: Statistical Principles
• TECHNOLOGY
  • Computer systems validation
  • Electronic signatures
  • CFR Title 21 & Guidance for Industry
• PRIVACY
  • Patient-identifiable data
  • Data protection regulations
Computer Systems Validation
• Definition
  • “Confirmation by examination and provision of objective evidence that software specifications conform to user needs and intended uses, and that the particular requirements implemented through the software can be consistently fulfilled”
  • FDA, General Principles of Software Validation: Final Guidance for Industry and FDA Staff
• Principles
  • Specify the intended use and the user requirements; ensure and verify that the software meets those requirements through proper design, implementation and testing; and maintain proper use through an ongoing performance programme
Computer Systems Validation
• Installation Qualification (IQ)
  • ‘Is the software application properly installed, are all its physical and logical requirements met, and is its platform adequately configured for user access in the work process?’
• Operational Qualification (OQ)
  • ‘Does the software application work as intended just above, just below, and at the operational limits set in its design specification?’
• Performance Qualification (PQ)
  • ‘Does the software application perform as intended to meet the user requirements specification in a simulated work-process environment?’
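As a flavour of Operational Qualification, a sketch of boundary testing using pytest (assumed available): a range check is exercised at, just below, and just above its operational limits. The function and its limits are invented for the example.

```python
import pytest

def in_operational_range(value, low=0.0, high=100.0):
    """The function under test: a range check with inclusive limits."""
    return low <= value <= high

# OQ-style boundary cases: at, just below and just above each limit.
@pytest.mark.parametrize("value, expected", [
    (0.0, True), (-0.01, False),      # lower limit, and just below it
    (100.0, True), (100.01, False),   # upper limit, and just above it
    (50.0, True),                     # nominal mid-range value
])
def test_boundaries(value, expected):
    assert in_operational_range(value) is expected
```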
Andy Richardson – andy.richardson@zenetar.com
Practical Guide to Clinical Data Management, Susanne Prokscha
http://www.crcpress.com
Statistical Review & Reporting
[Lifecycle diagram as above]
Statistical Analysis Plan
• Expansion of the statistical methods section of the protocol
• Precise detail on the analysis and presentation of the data
• Table shells
Analysing the Data and Reporting
• Pre-programming and dry runs
• Blind review (before breaking the randomisation code)
  • Choice of analysis sets, amount of missing data, how to deal with small centres, …
Analysing the Data and Reporting
• Clean database / database lock
• Attach randomisation code
• Complete analysis and outputs (analyses, tables, figures and listings)
• Work with Medical and Medical Writing on the Integrated Report
IDMCs
• Increasingly used for ongoing monitoring of trials – primarily long-term mortality trials
• Enable the sponsor to remain blind
• Look at safety on a regular basis
• Look at unblinded interim analyses of efficacy
• Make recommendations on the basis of these ongoing data/results
• The board consists of ≥ 3 members, one of whom must be a statistician
Presenting Results
• Publications: still many mistakes and bad practice
  • CONSORT statement (see e.g. Begg, Cho et al. (1996), JAMA)
• Conference presentations and posters: frequent horror stories! – don’t get bad press because of this
• Involve a statistician
Regulatory Submissions – Guidelines
• ICH E9: ‘Statistical Principles for Clinical Trials’
• ICH E10: ‘Choice of Control Group in Clinical Trials’
• Points to Consider papers (EU)
• Therapeutic-area-specific guidelines
• Various FDA statistics guidelines on specific topics
Regulatory Submissions – EU Points to Consider
• Clarify and expand on issues raised in ICH E9
  • Adjustment for Baseline Covariates
  • Application with 1) Meta-Analysis; 2) One Pivotal Trial
  • Switching between Superiority and Non-Inferiority
  • Multiplicity Issues in Clinical Trials
  • Missing Data
  • Choice of Non-Inferiority Margin
  • Data Monitoring Committees
  • Confirmatory Clinical Trials with Flexible Design and Analysis Plan
Statistical Thinking for Non-Statisticians in Drug Regulation, by Richard Kay
ISBN 9780470319710
RRP £50.99/€61.20; with 10% discount: £45.89/€55.08 (quote promotion code VA259)
Order online at www.wiley.com
Richard Kay – richard.kay@rkstatistics.com