Structural uncertainty from an economist's perspective Laura Bojke Centre for Health Economics, University of York
Structure of the presentation • Why structural uncertainty is a problem in decision modelling • What is structural uncertainty • Some examples • Methods available to characterise structural uncertainty • Outstanding issues/discussion points
Uncertainty in decision analytic models • Uncertainty is pervasive in any assessment of cost-effectiveness. • We need to produce accurate estimates of cost-effectiveness and assess whether current evidence is a sufficient basis for an adoption decision. • Much of the focus on uncertainty in decision analysis has been on parameter uncertainty. • Other forms of model uncertainty exist, and these have received much less attention in the HTA literature. • The issue of structural uncertainty in particular is under-researched.
What is structural uncertainty? • Aside from parameter and methodological uncertainties, other sources of uncertainty include the various simplifications and scientific judgements that have to be made when constructing and interpreting a model of any sort. • These have been classified in a number of different ways but can be referred to collectively as structural uncertainties. • The term is used to describe uncertainty that does not fit into the other two categories (parameter and methodological uncertainty).
Examples from a review of the HTA literature • Inclusion/exclusion of potentially relevant comparators. • The selection of comparators should be informed by current evidence or opinion. • The choice of comparators is often governed by the scope of the model, and rarely are all possible comparisons made. • This is often the case where unlicensed comparators exist. Even if the excluded comparators are not cost-effective, excluding them may change estimates of the expected value of perfect information (EVPI). • Inclusion/exclusion of potentially relevant events. • The process of simplification will inevitably require certain assumptions to be made. • These assumptions should be supported by evidence, and choices between alternative assumptions should be justified and made transparent. • Events thought to be unrelated to treatment can have a noticeable impact on estimates of cost-effectiveness and EVPI.
Examples from a review of the HTA literature (2) • Statistical models to estimate specific parameters. • Decision models use increasingly sophisticated statistical techniques to derive estimates of parameters. • This increased complexity can introduce statistical uncertainties. • There may be alternative survival models, each plausible given the data; which model is best for survival beyond the observed period? (a sketch follows this slide) • Clinical uncertainty or lack of clinical evidence. • A decision model may be commissioned precisely because clinical evidence to inform a decision is lacking. • Even when RCT evidence is available, there may be an absence of evidence about key parameters such as treatment effect, baseline event rates, clinical pathways, interactions between model parameters, and clinical practice norms. • Often scenarios are presented based on alternative but extreme assumptions.
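To illustrate the survival-extrapolation point above, here is a minimal sketch using simulated, purely hypothetical data (not drawn from any of the case studies): two parametric survival models can describe the observed follow-up period similarly well yet diverge substantially once extrapolated.

```python
# A minimal sketch (hypothetical, simulated data): two plausible parametric
# survival models fitted to the same observed times, compared beyond follow-up.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
observed_times = rng.weibull(1.4, size=200) * 24.0   # months, illustrative only

# Candidate models, each plausible within the observed window
exp_scale = observed_times.mean()                              # MLE for exponential
weib_shape, _, weib_scale = stats.weibull_min.fit(observed_times, floc=0)

extrapolation_horizon = np.array([36.0, 60.0, 120.0])          # beyond the data
surv_exp = stats.expon.sf(extrapolation_horizon, scale=exp_scale)
surv_weib = stats.weibull_min.sf(extrapolation_horizon, c=weib_shape, scale=weib_scale)

for t, s_e, s_w in zip(extrapolation_horizon, surv_exp, surv_weib):
    print(f"t={t:5.0f} months  exponential S(t)={s_e:.3f}  Weibull S(t)={s_w:.3f}")
```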
Identifying current approaches to characterise structural uncertainty • Undertook a review to find methods that explore the types of structural uncertainty apparent in decision analytic modelling (DAM). • Focus on analytical methods rather than qualitative methods of synthesis. • Very little is published in the HTA literature; methods from statistics, mathematics and operational research are relevant. • Three methods were identified in the review, and a fourth (parameterising structural uncertainty) is proposed in addition:
Available methods • Scenario analysis: • Alternative assumptions are presented as separate scenarios. • Leaves decision makers with multiple models to digest. • Model selection (see the sketch after this slide): • Rank alternative models according to some measure of prediction performance, goodness of fit or probability of error. • Choose the model that maximises that particular criterion. • In HTA decision modelling it is difficult to define a 'gold standard' for outcomes and costs. • Where there are many competing objectives, it is often not possible to identify one particular parameter whose performance must be maximised by a fitted model. • The data required to assess fit are often absent. • It is not always advantageous to choose the best model, as this discards information from the other alternative models.
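A minimal sketch of the model-selection approach follows. The candidate structures, AIC values and net-benefit figures are purely illustrative assumptions, not results from the review; the point is that selecting the single best-ranked structure discards whatever the other plausible structures would have told us.

```python
# A minimal sketch (hypothetical model outputs): rank alternative model
# structures by an information criterion and keep only the best-ranked one.
candidate_models = {
    "exponential_survival": {"aic": 412.3, "incremental_net_benefit": 1500.0},
    "weibull_survival":     {"aic": 398.7, "incremental_net_benefit": 2100.0},
    "gompertz_survival":    {"aic": 401.2, "incremental_net_benefit": 1750.0},
}

# Lower AIC is preferred; the results of the discarded structures are ignored.
best_name = min(candidate_models, key=lambda m: candidate_models[m]["aic"])
print("Selected structure:", best_name)
print("Incremental net benefit:", candidate_models[best_name]["incremental_net_benefit"])
```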
Available methods (2) • Model averaging (see the sketch after this slide): • Build alternative models and average their results, weighted by some measure of their adequacy or credibility. • Models can be assigned equal weights, or differential weights can be determined using ranking measures or derived using expert elicitation methods. • Bayesian methods for model averaging are commonly used in mathematics and statistics. • There is the issue of determining the posterior distribution of a parameter given the data, when the data may not be available. • Non-Bayesian methods can be used. • These require a measure of uncertainty that captures both uncertainty between expectations and uncertainty within expectations. • Model averaging must be undertaken for each realisation of the uncertain parameters so as not to underestimate uncertainty.
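A minimal sketch of model averaging follows. The two net-benefit distributions and the 0.6/0.4 weights are illustrative assumptions, not elicited values; the averaging is done within each probabilistic sensitivity analysis (PSA) iteration, consistent with the point above about not underestimating uncertainty.

```python
# A minimal sketch (hypothetical numbers): average incremental net benefit (INB)
# across alternative model structures, per PSA iteration, under assumed weights.
import numpy as np

rng = np.random.default_rng(42)
n_psa = 10_000

# Simulated INB from each structure for the same PSA draws (illustrative only;
# in practice these come from running each candidate model).
inb_model_a = rng.normal(2000, 800, n_psa)
inb_model_b = rng.normal(1200, 600, n_psa)

weights = np.array([0.6, 0.4])   # e.g. elicited credibility weights, summing to 1

# Average within each iteration so structural and parameter uncertainty
# propagate together, rather than averaging the summary results afterwards.
inb_averaged = weights[0] * inb_model_a + weights[1] * inb_model_b

print("Mean INB:", inb_averaged.mean())
print("P(cost-effective):", (inb_averaged > 0).mean())
```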
Available methods (3) • Parameterising structural uncertainty (see the sketch after this slide): • This approach was not identified in the review. • The assumptions that distinguish different models or scenarios can often be thought of as either missing parameters or parameters assigned a single, often extreme, value. • By generalising the model to include additional 'uncertain' parameters, the source of structural uncertainty can be represented directly in the analysis. • This approach is analogous to model averaging on individual or sets of model inputs. • It treats structural uncertainty like parameter uncertainty.
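A minimal sketch of parameterising a structural assumption follows. The example assumption (waning of the treatment effect after trial follow-up), the hazard-ratio distribution and the Beta prior for the "retention" parameter are all hypothetical; the idea is to sample the structural parameter alongside the other inputs rather than fix it at an extreme value in separate scenarios.

```python
# A minimal sketch (hypothetical parameters): a structural assumption about
# treatment-effect waning is generalised into an uncertain 'retention'
# parameter (0 = effect stops after the trial period, 1 = effect continues)
# and sampled in the PSA like any other parameter.
import numpy as np

rng = np.random.default_rng(7)
n_psa = 10_000

# Trial-period treatment effect (hazard ratio), illustrative distribution
hazard_ratio_trial = rng.lognormal(mean=np.log(0.75), sigma=0.1, size=n_psa)

# Structural parameter: proportion of the effect retained after follow-up.
# A Beta prior could be elicited from experts; Beta(4, 2) is purely illustrative.
retention = rng.beta(4, 2, size=n_psa)

# Effective long-term hazard ratio under the generalised structure
hr_long_term = hazard_ratio_trial ** retention

print("Mean long-term HR:", hr_long_term.mean())
print("95% interval:", np.percentile(hr_long_term, [2.5, 97.5]))
```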
Case studies • See: Bojke L, Claxton K, Sculpher M, Palmer S. Characterizing Structural Uncertainty in Decision Analytic Models: A Review and Application of Methods. Value in Health, 2009 (forthcoming).
Discussion points and further work • When is structural uncertainty really parameter uncertainty? • Is uncertainty that can be parameterised directly in the model structural uncertainty? • If not, what is structural uncertainty? • Do we really need a definition? • Do issues of which comparators to include/exclude relate to defining the decision scope? • Do issues of which events to include/exclude relate to specifying the correct model structure, i.e. avoiding oversimplification?
Discussion points and further work (2) • Should the focus be on establishing guidelines on how to structure a model? • Is a lot of structural uncertainty just modeller uncertainty? • Should we average across models? • Does this reflect uncertainty about structure? • How do we determine weights? • Expert opinion – method of elicitation, which experts?