UFP workshop 25/26 August 2008 Utrecht Expert Judgement and uncertainty Dr. Jeroen P. van der Sluijs Copernicus Institute for Sustainable Development and Innovation, Utrecht University & Centre d'Economie et d'Ethique pour l'Environnement et le Développement, Université de Versailles Saint-Quentin-en-Yvelines, France
Expert Elicitation Workshop • Day 1: causal pathways, mechanisms connecting UFP exposure to health effects • Judgement on level of evidence • Uncertainty • Day 2: quantification of Exposure-Response Functions • Elicitation of Probability Density Functions • Uncertainty
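The Day 2 step, eliciting Probability Density Functions, can be sketched as fitting a distribution to an expert's stated quantiles. The sketch below assumes a lognormal form and hypothetical elicited values (median 1.2, 95th percentile 2.0 for an exposure-response slope); the function name and numbers are illustrative, not part of the workshop protocol.

```python
import math
import random

def lognormal_from_quantiles(median, q95, z95=1.645):
    """Fit lognormal parameters (mu, sigma) to an elicited median and
    95th percentile. Hypothetical helper for illustration only."""
    mu = math.log(median)
    sigma = (math.log(q95) - mu) / z95
    return mu, sigma

# Hypothetical elicited exposure-response slope: median 1.2, 95th pct 2.0
mu, sigma = lognormal_from_quantiles(1.2, 2.0)

random.seed(42)  # reproducible draws
samples = sorted(random.lognormvariate(mu, sigma) for _ in range(10_000))
q05, q50 = samples[500], samples[5_000]  # empirical 5th and 50th percentiles
```

The fitted distribution can then be sampled in a Monte Carlo analysis, or shown back to the expert to check that its tails match their actual beliefs.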
3 framings of uncertainty 'deficit view' • Uncertainty is provisional • Reduce uncertainty, make ever more complex models • Tools: quantification, Monte Carlo, Bayesian belief networks 'evidence evaluation view' • Comparative evaluation of research results • Tools: scientific consensus building; multidisciplinary expert panels • Focus on robust findings 'complex systems view / post-normal view' • Uncertainty is intrinsic to complex systems • Uncertainty can be a result of the production of knowledge • Acknowledge that not all uncertainties can be quantified • Openly deal with deeper dimensions of uncertainty (problem framing, indeterminacy, ignorance, assumptions, value loadings, institutional dimensions) • Tools: Knowledge Quality Assessment • Deliberative, negotiated management of risk
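The quantification tools named under the 'deficit view' (Monte Carlo in particular) can be sketched as propagating uncertain inputs through a model. The toy model and all numbers below are hypothetical, purely to show the mechanics:

```python
import random

random.seed(1)

def health_impact(concentration, slope):
    # Toy linear exposure-response model (illustrative only)
    return concentration * slope

N = 20_000
# Hypothetical uncertain inputs: UFP concentration and ERF slope
impacts = sorted(
    health_impact(
        random.gauss(10.0, 2.0),   # concentration, e.g. in µg/m³
        random.uniform(0.5, 1.5),  # slope, effects per unit concentration
    )
    for _ in range(N)
)
p05, p50, p95 = impacts[N // 20], impacts[N // 2], impacts[N - N // 20]
```

The resulting percentiles summarize only the quantifiable part of the uncertainty; the 'complex systems view' bullets above are precisely about what such an interval leaves out (framing, ignorance, assumptions).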
Dimensions of uncertainty • Technical (inexactness) • Methodological (unreliability) • Epistemological (ignorance) • Societal (limited social robustness)
Locations of uncertainties: • Context: system boundary, definitional vagueness • Expert judgement: estimates, variability in (implicit) assumptions • Model: causal structure, technical model, model parameters, model inputs • Data: measurements, monitoring data, survey data • Outputs: indicators, statements
Expert Elicitation • To systematically make explicit and usable the unwritten knowledge in the heads of experts, including insight into the limitations, strengths and weaknesses of that knowledge
Rating-Scale for judgements on causality A level of confidence characterizes, on the basis of expert judgement, the uncertainty about whether a causal pathway is true (IPCC, 2005)
Procedure • First round: individual expert judgements • Document key studies and main arguments underpinning your rating • Show (variability of) individual results to the group • Group discussion focusing on reasons for disagreement and the underlying arguments • Opportunity to revise initial individual judgements
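The feedback-and-revision loop above can be sketched as computing the group summary that is shown back to the panel between rounds. The expert names and ratings below are hypothetical:

```python
from statistics import median

# Hypothetical first-round confidence ratings (0-10 scale) per expert
round1 = {"expert_a": 7, "expert_b": 3, "expert_c": 8, "expert_d": 4}

def feedback(ratings):
    """Summarize the group result shown back to the panel."""
    vals = sorted(ratings.values())
    return {"median": median(vals), "range": (vals[0], vals[-1])}

summary1 = feedback(round1)

# After group discussion of the disagreements, some experts revise
round2 = {**round1, "expert_b": 5, "expert_d": 6}
summary2 = feedback(round2)

spread1 = summary1["range"][1] - summary1["range"][0]
spread2 = summary2["range"][1] - summary2["range"][0]
spread_narrowed = spread2 < spread1
```

Note that convergence is an opportunity, not a goal: the procedure documents the arguments behind disagreement rather than forcing consensus.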
Pitfalls in expert elicitation • Overconfidence • Representativeness • Anchoring • Bounded rationality • Availability / lamp posting • Implicit assumptions • Motivational bias • Possibility of strategic answers • Interests with regard to outcome of analysis
Overconfidence Experts tend to over-estimate their ability to make quantitative judgements Difficult to guard against; but a general awareness of the tendency can be important
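One common way to make this tendency visible is a calibration check: of the 90% intervals an expert has stated for quantities whose true values later became known, how many actually contained the truth? The intervals and values below are hypothetical:

```python
# Hypothetical calibration exercise: each entry is an expert's stated
# 90% credible interval (low, high) and the value later observed.
judgements = [
    ((2.0, 5.0), 4.1),
    ((1.0, 3.0), 3.5),   # missed: true value above the interval
    ((0.5, 2.5), 0.9),
    ((4.0, 6.0), 6.8),   # missed
    ((3.0, 9.0), 5.0),
]

hits = sum(low <= truth <= high for (low, high), truth in judgements)
coverage = hits / len(judgements)

# A well-calibrated expert covers ~90% of truths with 90% intervals;
# much lower coverage suggests overconfidence (intervals too narrow).
overconfident = coverage < 0.9
```

Here 3 of 5 intervals contain the truth (60% coverage), the typical signature of intervals elicited too narrowly.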
Representativeness Tendency to place too much confidence in a single piece of (familiar) information that is considered reliable Reliability ≠ representativeness Ignoring a larger body of more generalized information or other sources of information
Anchoring Assessments are often unduly weighted toward a conventional value, the first value given, or the findings of previous assessments.
Bounded rationality Everyone has their own "blinkers" / a limited view of reality Try to involve as many different viewpoints / disciplines as possible If you suspect that another scholar in your field would disagree with your estimate, please mention that
Availability bias • The tendency to give too much weight to readily available data or recent experience (which may not be representative of the required data) • Lamp-posting
Implicit assumptions • A subject's responses are typically conditional on various unstated assumptions. • The effect of these assumptions is often to constrain the degree of uncertainty reflected in the resulting estimate of a quantity. • Stating assumptions explicitly can help reflect more of a subject's total uncertainty.
Motivational bias • By their judgements, experts can influence the outcome of a research project. • This could lead to strategic answers that promote an interest instead of reporting what the expert truly believes.
(Remaining) Program day 1 • Explanation of the first questions of the protocol • Initial rating of confidence in the overall causal links between UFP and health end-points • Coffee break • Feedback of individual ratings • Group discussion • Final rating • Lunch • Same procedure for individual causal pathways