INTARESE Uncertainty Training 17-18 Oct 2007 Knowledge Quality Assessment: an introduction Dr. Jeroen van der Sluijs Copernicus Institute for Sustainable Development and Innovation, Utrecht University & Centre d'Economie et d'Ethique pour l'Environnement et le Développement, Université de Versailles Saint-Quentin-en-Yvelines, France
Haugastøl group: Jeroen van der Sluijs; Ragnar Fjelland; Jerome Ravetz; Anne Ingeborg Myhr; Roger Strand; Silvio Funtowicz; Kamilla Kjølberg; Kjellrun Hiis Hauge; Bruna De Marchi; Andrea Saltelli
Complex - uncertain - risks Typical characteristics (Funtowicz & Ravetz): • Decisions will need to be made before conclusive scientific evidence is available; • Potential impacts of 'wrong' decisions can be huge; • Values are in dispute; • Knowledge base is characterized by large (partly irreducible, largely unquantifiable) uncertainties, multi-causality, knowledge gaps, and imperfect understanding; • More research ≠ less uncertainty: it reveals unforeseen complexities! • Assessment dominated by models, scenarios, assumptions, extrapolations; • Many (hidden) value loadings reside in problem frames, indicators chosen, assumptions made
Model structure uncertainty... 5 consultants, each using a different model, were given the same question: “Which parts of this particular area are most vulnerable to pollution and need to be protected?” (Refsgaard et al., 2006)
3 paradigms of uncertain risks 'deficit view' • Uncertainty is provisional • Reduce uncertainty, make ever more complex models • Tools: quantification, Monte Carlo, Bayesian belief networks 'evidence evaluation view' • Comparative evaluations of research results • Tools: scientific consensus building; multidisciplinary expert panels • Focus on robust findings 'complex systems view / post-normal view' • Uncertainty is intrinsic to complex systems • Uncertainty can be a result of the production of knowledge • Acknowledge that not all uncertainties can be quantified • Openly deal with deeper dimensions of uncertainty (problem framing, indeterminacy, ignorance, assumptions, value loadings, institutional dimensions) • Tools: Knowledge Quality Assessment • Deliberative, negotiated management of risk
Ravetz, 1971: Scientific Knowledge and its Social Problems Practical/technical problems • Practical problems: problems for which the solution consists of the achievement of human purposes. • Technical problems: defined in terms of the function to be performed. In modern societies, practical problems are reduced to a set of technical problems.
Uncertainty in the knowledge-based society: the problems 1984, Keepin & Wynne: “Despite the appearance of analytical rigour, IIASA's widely acclaimed global energy projections are highly unstable and based on informal guesswork. This results from inadequate peer review and quality control, raising questions about political bias in scientific analysis.”
Crossing the disciplinary boundaries Once environmental numbers are thrown over the disciplinary fence, important caveats tend to be ignored, uncertainties compressed and numbers used at face value. E.g. climate sensitivity (see Van der Sluijs, Wynne, Shackley, 1998): the assessed range of 1.5-4.5 °C is compressed into the resulting misconception “worst case = 4.5 °C”.
The certainty trough (MacKenzie, 1990)
Insights on uncertainty • More research tends to increase uncertainty: it reveals unforeseen complexities • Complex systems exhibit irreducible uncertainty (intrinsically or practically) • Omitting uncertainty management can lead to scandals, crises and loss of trust in science and institutions • In many complex problems unquantifiable uncertainties dominate the quantifiable uncertainties • High quality ≠ low uncertainty • Quality relates to fitness for function (robustness, precautionary principle) • Shift in focus needed from reducing uncertainty towards reflective methods to explicitly cope with uncertainty and quality
Clark & Majone 1985 Critical Appraisal of Scientific Inquiries with Policy Implications 1. Criticism by whom? Critical roles: • Scientist • Peer group • Program manager or sponsor • Policy maker • Public interest groups
Clark & Majone 1985 2. Criticism of what? Critical modes: • Input • data, methods, people, competence, (im)maturity of the field • Output • problem solved? hypothesis tested? • Process • good scientific practice, procedures for review, documentation etc.
Clark & Majone 1985 Meta quality criteria: • Adequacy • reliability, reproducibility, uncertainty analysis etc. • Value • Internal: how well is the study carried out? • External: fitness for purpose, fitness for function • Personal: subjectivity, preferences, choices, assumptions, bias • Effectiveness • Does it help to solve practical problems? • Legitimacy • numinous: natural authority, independence, credibility, competence • civil: agreed procedures
KQA tools • Quantitative methods • SA/UA Monte Carlo • Uncertainty typology (matrix) • Quality assessment • Pedigree analysis (NUSAP) • Assumption analysis • Model Quality Checklist • MNP Uncertainty Guidance • Extended Peer Review • Argumentative Discourse Analysis (ADA); Critical Discourse Analysis (CDA) • ....
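To make the quantitative end of this toolkit concrete, here is a minimal sketch of Monte Carlo uncertainty propagation (SA/UA) with a crude sensitivity screening. The toy model y = a·x^b and all input distributions are hypothetical, chosen only for illustration; in a real assessment they would come from data, literature or expert elicitation.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical toy model y = a * x**b -- illustration only
def model(a, x, b):
    return a * x ** b

# Assumed input distributions (placeholders for elicited/measured ones)
a = rng.normal(loc=2.0, scale=0.3, size=N)       # parameter uncertainty
x = rng.lognormal(mean=0.0, sigma=0.25, size=N)  # input variability
b = rng.uniform(low=0.8, high=1.2, size=N)       # uncertain exponent

y = model(a, x, b)
print(f"mean = {y.mean():.2f}, 95% interval = "
      f"[{np.percentile(y, 2.5):.2f}, {np.percentile(y, 97.5):.2f}]")

# Crude global sensitivity screening: rank correlation of each input with y
for name, v in (("a", a), ("x", x), ("b", b)):
    rho, _ = spearmanr(v, y)
    print(f"{name}: Spearman rho = {rho:.2f}")
```

Note what this sketch can and cannot do: it propagates the quantified uncertainties, but says nothing about the pedigree of the assumed distributions or the model structure itself, which is exactly the gap the qualitative KQA tools address.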
NL Environmental Assessment Agency (RIVM/MNP) Guidance: Systematic reflection on uncertainty & quality in:
Systematic reflection on uncertainty issues in: • Problem framing • Involvement of stakeholders • Selection of indicators • Appraisal of knowledge base • Mapping and assessment of relevant uncertainties • Reporting of uncertainty information
Problem framing and context • Explore rival problem frames • Relevant aspects / system boundary • Typify problem structure • Problem lifecycle / maturity • Role of study in policy process • Uncertainty in socio-political context
Type-III error: assessing the wrong problem by incorrectly accepting the false meta-hypothesis that there is no difference between the boundaries of a problem, as defined by the analyst, and the actual boundaries of the problem (Dunn, 1997). Context validation (Dunn, 1999): the validity of the inference that we have estimated the proximal range of rival hypotheses. Context validation can be performed by a participatory bottom-up process to elicit from scientists and stakeholders rival hypotheses on causal relations underlying a problem and rival problem definitions.
What is the role of the assessment in the policy process? • ad hoc policy advice • to evaluate existing policy • to evaluate proposed policy • to foster recognition of new problems • to identify and/or evaluate possible solutions • to provide counter-expertise • other
In different phases of problem lifecycle, different uncertainties are salient
Different problem types need different uncertainty management strategies
Systematic reflection on uncertainty issues in: • Problem framing • Involvement of stakeholders • Selection of indicators • Appraisal of knowledge base • Mapping and assessment of relevant uncertainties • Reporting of uncertainty information
Involvement of stakeholders • Identify relevant stakeholders • Identify areas of agreement and disagreement among stakeholders on value dimensions of the problem • Recommend when to involve different stakeholders in the assessment process
Roles of stakeholders • (Co-) definer of the problems to be addressed • What knowledge is relevant? • Source of knowledge • Quality control of the science (for instance: review of assumptions)
Systematic reflection on uncertainty issues in: • Problem framing • Involvement of stakeholders • Selection of indicators • Appraisal of knowledge base • Mapping and assessment of relevant uncertainties • Reporting of uncertainty information
Indicators • How well do indicators used address key aspects of the problem? • Use of proxies • Alternative indicators? • Limitations of indicators used? • Scale and aggregation issues • Controversies in science and society about these indicators?
High uncertainty is not the same as low quality Example: imagine the inference is Y = log(PI1/PI2), the logarithm of the ratio between the two pressure-on-decision indices PI1 and PI2. [Figure: frequency of occurrence of Y = log(PI1/PI2), divided into a region where incineration is preferred and a region where landfill is preferred; see the sketch below.]
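A hedged sketch of how such a figure could be produced: sample the two indices from assumed, purely illustrative distributions, compute Y, and report how much probability mass falls on each side of the decision boundary (taken here to be Y = 0, where the two indices are equal; the sign convention is also an assumption).

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50_000

# Hypothetical distributions for the two pressure-on-decision indices;
# in a real assessment these would come from the underlying model chain.
PI1 = rng.lognormal(mean=0.1, sigma=0.4, size=N)
PI2 = rng.lognormal(mean=0.0, sigma=0.4, size=N)

Y = np.log(PI1 / PI2)

# Assumed sign convention: Y > 0 favours incineration, Y < 0 favours landfill
p = (Y > 0).mean()
print(f"P(incineration preferred) = {p:.2f}, "
      f"P(landfill preferred) = {1 - p:.2f}")
# Even with a wide spread in Y, the preferred option can be robust if almost
# all of the mass lies on one side of the boundary: high uncertainty, yet a
# high-quality, decision-relevant inference.
```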
High uncertainty is not the same as low quality, but..... methodological uncertainty can be dominant (slide borrowed from Andrea Saltelli)
Systematic reflection on uncertainty issues in: • Problem framing • Involvement of stakeholders • Selection of indicators • Appraisal of knowledge base • Mapping and assessment of relevant uncertainties • Reporting of uncertainty information
Adequacy of available knowledge base? • What are strong and weak points in the knowledge base? • Use of proxies, empirical basis, theoretical understanding, methodological rigor, validation • NUSAP pedigree analysis • What parts of the knowledge are contested (scientific and societal controversies)? • Is the assessment feasible in view of available resources? (limitations implied)
Dimensions of uncertainty • Technical (inexactness) • Methodological (unreliability) • Epistemological (ignorance) • Societal (limited social robustness)
Reliability intervals in case of normal distributions: ±1σ ≈ 68 %, ±2σ ≈ 95 %, ±3σ ≈ 99.7 %
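These coverage figures follow directly from the normal CDF; a quick check (note that ±2σ is 95.4 % exactly, while 95 % corresponds to ±1.96σ):

```python
from scipy.stats import norm

# Probability mass within ±k standard deviations of the mean
for k in (1, 2, 3):
    print(f"±{k} sigma: {norm.cdf(k) - norm.cdf(-k):.1%}")
# ±1 sigma: 68.3%   ±2 sigma: 95.4%   ±3 sigma: 99.7%
```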
NUSAP: Qualified Quantities • Numeral • Unit • Spread • Assessment • Pedigree (Funtowicz and Ravetz, 1990)
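A NUSAP-qualified quantity can be thought of as a small record that carries all five qualifiers along with the number. The encoding below is an illustrative sketch, not a standard implementation; the example values are made up, and the pedigree criteria follow the ones named on the knowledge-base slide (proxy, empirical basis, theoretical understanding, method, validation).

```python
from dataclasses import dataclass, field

@dataclass
class NusapQuantity:
    numeral: float   # N: the number itself
    unit: str        # U: its unit
    spread: str      # S: e.g. an interval or "± x %"
    assessment: str  # A: qualitative judgement of reliability
    pedigree: dict = field(default_factory=dict)  # P: criterion -> score (0-4)

# Hypothetical example of a qualified emission estimate
emission = NusapQuantity(
    numeral=25.0, unit="kton/yr", spread="± 10 kton/yr", assessment="fair",
    pedigree={"proxy": 3, "empirical basis": 2,
              "theoretical understanding": 2, "method": 3, "validation": 1},
)
```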
NUSAP: Pedigree Evaluates the strength of the number by looking at: • Background history by which the number was produced • Underpinning and scientific status of the number
Example pedigree results Traffic-light analogy: <1.4 red; 1.4-2.6 amber; >2.6 green. This example is the case of VOC emissions from paint in the Netherlands, calculated from national sales statistics (NS) in 5 sectors (Ship, Building & Steel, Do It Yourself, Car refinishing and Industry), assumptions on additional thinner use (Th%), a lump sum for imported paint and an assumption for its VOC percentage. See the full research report on www.nusap.net for details.
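A minimal sketch of the traffic-light mapping, assuming pedigree criteria scored on the usual 0-4 scale and averaged per parameter (the example scores are hypothetical):

```python
def traffic_light(scores):
    """Map an average pedigree score (0-4 scale) to a colour using the
    thresholds from the slide: < 1.4 red, 1.4-2.6 amber, > 2.6 green."""
    avg = sum(scores) / len(scores)
    if avg < 1.4:
        return avg, "red"
    if avg <= 2.6:
        return avg, "amber"
    return avg, "green"

# Hypothetical scores for one emission parameter on five pedigree criteria
print(traffic_light([3, 2, 2, 3, 1]))  # -> (2.2, 'amber')
```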
Similar to a patient information leaflet alerting the patient to risks and unsuitable uses of a medicine, NUSAP enables the delivery of policy-relevant quantitative information together with the essential warnings on its limitations and pitfalls. It thereby promotes responsible and effective use of science in policy processes.
Systematic reflection on uncertainty issues in: • Problem framing • Involvement of stakeholders • Selection of indicators • Appraisal of knowledge base • Mapping and assessment of relevant uncertainties • Reporting of uncertainty information
Mapping and prioritization of relevant uncertainties • Highlight uncertainties in typology relevant to this problem • Set priorities for uncertainty assessment • Select uncertainty assessment tools from the tool catalogue
Typology of uncertainties • Location • Level of uncertainty: statistical uncertainty, scenario uncertainty, recognised ignorance • Nature of uncertainty: knowledge-related uncertainty, variability-related uncertainty • Qualification of knowledge base (pedigree): weak, fair, strong • Value-ladenness of choices: small, medium, large
Locations of uncertainties: • Context: ecological, technological, economic, social and political representation • Expert judgement: narratives, storylines, advice • Model: model structure, technical model, model parameters, model inputs • Data: measurements, monitoring data, survey data • Outputs: indicators, statements
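One way to operationalise this typology is as an uncertainty matrix: one row per source of uncertainty, one column per dimension of the typology. The sketch below is hypothetical; the entries are illustrative only.

```python
# One row per source of uncertainty; entries are illustrative only.
uncertainty_matrix = [
    {"source": "emission factor",  "location": "model parameters",
     "level": "statistical",       "nature": "knowledge-related",
     "pedigree": "fair",           "value_ladenness": "small"},
    {"source": "future fuel mix",  "location": "model inputs",
     "level": "scenario",          "nature": "knowledge-related",
     "pedigree": "weak",           "value_ladenness": "large"},
    {"source": "problem framing",  "location": "context",
     "level": "recognised ignorance", "nature": "knowledge-related",
     "pedigree": "weak",           "value_ladenness": "large"},
]

# Prioritisation rule of thumb: deeper-than-statistical uncertainty with a
# weak pedigree goes to the top of the assessment agenda.
priority = [row["source"] for row in uncertainty_matrix
            if row["level"] != "statistical" and row["pedigree"] == "weak"]
print(priority)  # ['future fuel mix', 'problem framing']
```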
Tool catalogue For each tool: • Brief description • Goals and use • What sorts and locations of uncertainty does this tool address? • What resources are required to use it? • Strengths and limitations • Guidance on application & complementarity • Typical pitfalls of each tool • References to handbooks, example case studies, websites, experts etc.
Tool catalogue • Sensitivity Analysis • Error propagation equations • Monte Carlo analysis • Expert Elicitation • Scenario analysis • NUSAP • PRIMA • Checklist model quality assistance • Assumption analysis • …...
Systematic reflection on uncertainty issues in: • Problem framing • Involvement of stakeholders • Selection of indicators • Appraisal of knowledge base • Mapping and assessment of relevant uncertainties • Reporting of uncertainty information