
What role should probabilistic sensitivity analysis play in SMC decision making?


  1. What role should probabilistic sensitivity analysis play in SMC decision making? Andrew Briggs, DPhil, University of Oxford

  2. What probabilistic modelling offers • Generates the appropriate (expected) cost-effectiveness estimate • Reflects the combined implications of parameter uncertainty for the outcome(s) of interest (cost-effectiveness) • Can make probability statements about cost-effectiveness results – error probability under the decision maker’s control • Offers a means to calculate the value of collecting additional information

  3. Role of probabilistic sensitivity analysis: overview • Data sources for parameter fitting • Distributions for common model parameters • Correlating parameters • Presenting simulation results • Using PSA for decision making • Continuing role of traditional sensitivity analysis • Micro-simulation models

  4. Data sources for parameter estimation • Primary data • Can ‘fit’ parameters using standard statistical methods • Provides standard estimates of variance and correlation • Secondary data • With appropriate information reported can still fit parameters • Meta analysis may be possible • Expert opinion • Usefulness of Delphi limited (focus on consensus!) • Variability across estimates • Individual estimates of dispersion

  5. Distributions for common parameters: Probability parameters • Probabilities are constrained to the interval zero to one • Probabilities must sum to one • Probabilities are often estimated from proportions • Data informing the estimation are binomially distributed • Use the Beta distribution • May estimate probabilities from rates • e.g. from hazard rates in survival analysis • Use the (multivariate) normal on the log scale • Must transform from rates to probabilities
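The two recipes on this slide can be sketched in a few lines. The event counts, hazard rate, and cycle length below are invented for illustration; the Beta(events, non-events) fit for binomially distributed data and the rate-to-probability transformation p = 1 − exp(−rate × cycle length) are the standard approaches the slide refers to.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: 20 events observed among 100 patients.
events, n = 20, 100

# Beta(events, n - events): the usual distribution for a
# probability parameter informed by binomial data.
p_samples = rng.beta(events, n - events, size=10_000)

# One common rate-to-probability transformation, assuming a
# constant hazard over the cycle:
rate = 0.15    # hypothetical annual hazard rate
cycle = 1.0    # cycle length in years
p_from_rate = 1 - np.exp(-rate * cycle)
```

Every draw in `p_samples` respects the zero-to-one constraint by construction, which is the point of using the Beta rather than, say, a normal approximation.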

  6. Distributions for common parameters: Cost parameters • Costs are a mixture of resource counts and unit costs • Could model counts individually as Poisson with a Gamma-distributed mean (parameter) • Costs are constrained to be zero or positive • Can use the Gamma distribution if we cannot rely on the Central Limit Theorem (if skewed) • A popular alternative is the log-normal, particularly when using regression models on log cost
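A Gamma distribution for a cost parameter is typically fitted by the method of moments from a reported mean and standard error: mean = shape × scale and variance = shape × scale². The mean and SE below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical summary statistics for a cost parameter.
mean_cost, se_cost = 1500.0, 300.0

# Method-of-moments Gamma fit:
#   mean = shape * scale,  variance = shape * scale**2
shape = (mean_cost / se_cost) ** 2   # 25.0
scale = se_cost ** 2 / mean_cost     # 60.0

cost_samples = rng.gamma(shape, scale, size=10_000)
```

All draws are strictly positive, respecting the constraint the slide notes, and the distribution is right-skewed, as cost data usually are.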

  7. Distributions for common parameters: Utility parameters • Utilities are somewhat unusual, with one representing perfect health and zero representing death • Can have states worse than death, so the constraints are negative infinity up to one • If estimates are far from zero, a pragmatic approach is to fit a Beta distribution • If it is important to represent negative utilities, consider the transformation X = 1 − U (utility decrement) and fit a Gamma or log-normal distribution to X
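The decrement transformation described on this slide can be sketched as follows, with a hypothetical utility estimate of 0.75 (SE 0.05). A Gamma is fitted to X = 1 − U by the method of moments (the SE is unchanged by the shift), and draws are mapped back to the utility scale.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical utility estimate and its standard error.
mean_u, se_u = 0.75, 0.05

# Work with the decrement X = 1 - U, supported on (0, infinity).
mean_x = 1 - mean_u   # 0.25; SE is unchanged by the shift

# Method-of-moments Gamma fit for the decrement.
shape = (mean_x / se_u) ** 2
scale = se_u ** 2 / mean_x

# Map Gamma draws of the decrement back to the utility scale.
u_samples = 1 - rng.gamma(shape, scale, size=10_000)
```

Because the Gamma draws are strictly positive, every sampled utility is below one, while values below zero (states worse than death) remain possible, exactly the constraint structure the slide describes.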

  8. Distributions for common parameters: Relative risk parameters • Relative risks are ratios! • Can log transform to make them additive • Variances and confidence intervals are estimated on the log scale, then exponentiated • Suggests the log-normal distribution

  9. Example: relative risk from a published meta-analysis • Suppose a published meta-analysis quotes a relative risk of 0.86 with 95% CI (0.71 to 1.05) • Log transform these to give −0.15 (−0.35 to 0.05) on the log scale • Calculate the SE on the log scale: (0.05 − (−0.35)) / (1.96 × 2) = 0.10 • Generate a normally distributed random variable with mean −0.15 and SE 0.10 • Exponentiate the resulting variable
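The worked example above translates directly into code, using the figures quoted on the slide (RR 0.86, 95% CI 0.71 to 1.05):

```python
import numpy as np

rng = np.random.default_rng(0)

# Published figures: RR 0.86, 95% CI (0.71 to 1.05).
log_rr = np.log(0.86)                                # about -0.15
se_log = (np.log(1.05) - np.log(0.71)) / (2 * 1.96)  # about 0.10

# Sample on the log scale, then exponentiate (log-normal RR).
rr_samples = np.exp(rng.normal(log_rr, se_log, size=10_000))
```

The median of `rr_samples` recovers the published point estimate of 0.86, and the 2.5th and 97.5th percentiles approximate the published confidence limits.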

  10. Correlating parameters • PSA has sometimes been criticised for treating parameters as independent • In principle parameters can be correlated if we have information on the covariance structure • e.g. the covariance matrix in regression • Cholesky decomposition is used for correlated normal distributions • Correlations among other distributional forms are not straightforward
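The Cholesky approach mentioned on this slide can be sketched as below. The means and covariance matrix are hypothetical stand-ins for, say, a pair of regression coefficients; the mechanics (factor the covariance matrix as L·Lᵀ, then premultiply independent standard normal draws by L) are the standard method.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical means and covariance matrix for two correlated
# normal parameters (e.g. regression coefficients).
mean = np.array([0.5, 1.2])
cov = np.array([[0.040, 0.018],
                [0.018, 0.090]])

# Cholesky decomposition: cov = L @ L.T
L = np.linalg.cholesky(cov)

# Correlated draws: mean + L @ z, with z independent standard normals.
z = rng.standard_normal((10_000, 2))
draws = mean + z @ L.T
```

The sample correlation of the draws reproduces the correlation implied by the covariance matrix (here 0.018 / (0.2 × 0.3) = 0.3), rather than treating the two parameters as independent.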

  11. Variability and nonlinearity Even if we are interested only in expected values, we need to consider uncertainty when nonlinearities are involved: E[g(x)] ≠ g(E[x]) • Uncertainty needed to calculate the expectation of nonlinear parameters • Uncertainty needed to calculate the expectation of nonlinear models
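The inequality E[g(x)] ≠ g(E[x]) is easy to demonstrate numerically for a convex g. Taking g(x) = exp(x) with x standard normal (an invented example, but the same nonlinearity that arises with log-normal parameters), the expectation of the transformed variable exceeds the transformation of the expectation:

```python
import numpy as np

rng = np.random.default_rng(5)

# x ~ Normal(0, 1); g(x) = exp(x) is nonlinear (convex).
x = rng.normal(0.0, 1.0, size=100_000)

g_of_mean = np.exp(x.mean())   # g(E[x]): approximately exp(0) = 1
mean_of_g = np.exp(x).mean()   # E[g(x)]: approximately exp(0.5) ~ 1.65
```

Plugging point estimates through a nonlinear model (the `g_of_mean` route) therefore gives the wrong expected value; only propagating the full uncertainty, as PSA does, recovers `mean_of_g`.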

  12. Point estimates and variability [Figure: standard point estimate versus simulation expected value; RR 0.86 (95% CI 0.71 to 1.05)]

  13. A model of Total Hip Replacement. Example: interpreting simulation results

  14. Example on the CE plane: Spectron versus Charnley hip prosthesis

  15. Corresponding CEAC: Spectron versus Charnley hip prosthesis

  16. Multiple acceptability curves: why and how? • Two reasons for employing multiple acceptability curves • Heterogeneity between patient groups • Multiple treatment options • These correspond to two situations in CEA • Independent programmes • Mutually exclusive options • They lead to two very different representations!

  17. Multiple CEACs: handling heterogeneity. Spectron versus Charnley (males)

  18. Multiple CEACs: handling heterogeneity. Spectron versus Charnley (females)

  19. Example: GERD management. Baseline results

  20. Example: GERD management. Uncertainty on the CE plane

  21. Example: GERD management. Multiple CEACs

  22. Using probabilistic analysis for making decisions? Link with standard statistical methods 1. Use standard inference (link with frequentist methods) 2. Use cost-effectiveness acceptability curves to allow decision maker to select own ‘threshold’ error probability (more Bayesian) 3. Use PSA to establish the value of collecting additional information to inform decision (fully Bayesian decision theoretic approach)

  23. Cost of uncertainty (value of information)
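The "cost of uncertainty" on this slide is conventionally quantified as the expected value of perfect information (EVPI): the difference between the expected net benefit of choosing the best option in every PSA iteration and the expected net benefit of always choosing the option that is best on average. A minimal sketch, with entirely hypothetical net-benefit simulations for two options:

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical net-benefit simulations:
# rows = PSA iterations, columns = treatment options.
nb = rng.normal(loc=[1000.0, 1050.0],
                scale=[200.0, 250.0],
                size=(10_000, 2))

# Per-decision EVPI:
#   E[max over options] - max over options of E[net benefit]
evpi = nb.max(axis=1).mean() - nb.mean(axis=0).max()
```

EVPI is non-negative by construction; multiplied by the population affected by the decision, it bounds the value of collecting further information before deciding.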

  24. Micro-simulation models and PSA • Micro-simulation is an ‘individual’ (rather than ‘cohort’) method of model evaluation • Typically used to capture patient histories • Calculation requires a large number of individual simulations • PSA would require a second ‘layer’ of simulations (increasing computation time) • Think carefully about whether a micro-simulation is necessary • If it is, buy a fast machine, or use an approximate solution
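The "second layer" of simulations described above amounts to a nested loop: an outer loop drawing parameter values (PSA) and an inner loop simulating individual patients under those values. The model below is a deliberately trivial hypothetical placeholder, just to show the loop structure and why computation time multiplies:

```python
import numpy as np

rng = np.random.default_rng(2)

n_psa, n_patients = 200, 1000   # outer x inner = 200,000 simulations

results = []
for _ in range(n_psa):                 # outer layer: parameter uncertainty
    p_event = rng.beta(20, 80)         # hypothetical probability parameter
    cost_event = rng.gamma(25, 60)     # hypothetical cost parameter

    # inner layer: micro-simulation of individual patient histories
    events = rng.random(n_patients) < p_event
    results.append(events.mean() * cost_event)

expected_cost = float(np.mean(results))
```

Total run time scales as the product of the two layers, which is why the slide advises weighing whether micro-simulation is needed at all before adding PSA on top.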

  25. Is there any role for standard sensitivity analysis? • Probabilistic sensitivity analysis is important for capturing parameter uncertainty • Other forms of uncertainty relate to • Methodology • Structural uncertainty • Data sources • Heterogeneity • Standard sensitivity analysis retains an important role (in conjunction with PSA)

  26. Critiquing a probabilistic CE model • Are all parameters included in PSA? • Were standard distributions specified? • No triangular/uniform distributions • Was the appropriate expected value calculated? • Was standard sensitivity analysis employed to handle non sampling uncertainty? • Was heterogeneity handled separately? • Was the effect of individual parameters explored?

  27. Summary: the role of PSA PSA has an important role to play • Calculating the correct expected value • Calculating the combined effect of uncertainty in all parameters • Opening the debate about the appropriate error probability • Required to calculate the value of information • A continuing role remains for standard sensitivity analysis
