
INTARESE Uncertainty Training 17-18 Oct 2007 Knowledge Quality Assessment an introduction


Presentation Transcript


  1. INTARESE Uncertainty Training 17-18 Oct 2007: Knowledge Quality Assessment, an introduction. Dr. Jeroen van der Sluijs, Copernicus Institute for Sustainable Development and Innovation, Utrecht University, & Centre d'Economie et d'Ethique pour l'Environnement et le Développement, Université de Versailles Saint-Quentin-en-Yvelines, France

  2. Haugastøl group: Jeroen van der Sluijs; Ragnar Fjelland; Jerome Ravetz; Anne Ingeborg Myhr; Roger Strand; Silvio Funtowicz; Kamilla Kjølberg; Kjellrun Hiis Hauge; Bruna De Marchi; Andrea Saltelli

  3. Complex, uncertain risks. Typical characteristics (Funtowicz & Ravetz):
  • Decisions will need to be made before conclusive scientific evidence is available
  • Potential impacts of 'wrong' decisions can be huge
  • Values are in dispute
  • The knowledge base is characterized by large (partly irreducible, largely unquantifiable) uncertainties, multi-causality, knowledge gaps, and imperfect understanding
  • More research ≠ less uncertainty: research often reveals unforeseen complexities!
  • Assessment is dominated by models, scenarios, assumptions, extrapolations
  • Many (hidden) value loadings reside in problem frames, indicators chosen, assumptions made

  4. Model structure uncertainty... 5 consultants, each using a different model, were given the same question: "Which parts of this particular area are most vulnerable to pollution and need to be protected?" (Refsgaard et al., 2006)

  5. Three paradigms of uncertain risks.
  'Deficit view':
  • Uncertainty is provisional
  • Reduce uncertainty, make ever more complex models
  • Tools: quantification, Monte Carlo, Bayesian belief networks
  'Evidence evaluation view':
  • Comparative evaluation of research results
  • Tools: scientific consensus building, multidisciplinary expert panels
  • Focus on robust findings
  'Complex systems view / post-normal view':
  • Uncertainty is intrinsic to complex systems
  • Uncertainty can be a result of the production of knowledge
  • Acknowledge that not all uncertainties can be quantified
  • Openly deal with deeper dimensions of uncertainty (problem framing, indeterminacy, ignorance, assumptions, value loadings, institutional dimensions)
  • Tools: Knowledge Quality Assessment; deliberative, negotiated management of risk
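As a concrete illustration of the 'deficit view' toolkit, a minimal Monte Carlo sketch propagating parameter uncertainty through a toy exposure model; the model and all distributions are illustrative assumptions, not taken from the deck:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
N = 100_000

# Hypothetical exposure model: dose = concentration * intake / bodyweight
concentration = rng.lognormal(mean=0.0, sigma=0.5, size=N)   # mg/L
intake        = rng.normal(loc=2.0, scale=0.3, size=N)       # L/day
bodyweight    = rng.normal(loc=70.0, scale=10.0, size=N)     # kg

dose = concentration * intake / bodyweight                   # mg/kg/day

print(f"median dose  = {np.median(dose):.4f}")
print(f"95% interval = [{np.percentile(dose, 2.5):.4f}, "
      f"{np.percentile(dose, 97.5):.4f}]")
```

Note that this kind of quantification only captures the uncertainties one has chosen to parameterize; the deeper dimensions listed under the post-normal view stay outside the simulation.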

  6. Ravetz, 1971: Scientific Knowledge and its Social Problems. Practical/technical problems: • Practical problems: problems for which the solution consists of the achievement of human purposes • Technical problems: defined in terms of the function to be performed. In modern societies, practical problems are reduced to a set of technical problems.

  7. Uncertainty in the knowledge-based society: the problems. Keepin & Wynne, 1984: "Despite the appearance of analytical rigour, IIASA's widely acclaimed global energy projections are highly unstable and based on informal guesswork. This results from inadequate peer review and quality control, raising questions about political bias in scientific analysis."

  8. Crossing the disciplinary boundaries. Once environmental numbers are thrown over the disciplinary fence, important caveats tend to be ignored, uncertainties compressed, and numbers used at face value. E.g. climate sensitivity (the 1.5-4.5 °C range); see Van der Sluijs, Wynne, Shackley, 1998. Resulting misconception: worst case = 4.5 °C.
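A minimal sketch of why a range endpoint is not a worst case, assuming, purely for illustration, that the 1.5-4.5 °C range is the central 90% interval of a lognormal distribution (the distributional form and the 90% reading are both assumptions, not the historical definition of the range):

```python
import numpy as np
from scipy import stats

# Illustrative assumption: 1.5-4.5 degC is the central 90% interval
# of a lognormal distribution for climate sensitivity S.
lo, hi = np.log(1.5), np.log(4.5)
z = stats.norm.ppf(0.95)                     # ~1.645
mu, sigma = (lo + hi) / 2, (hi - lo) / (2 * z)

S = stats.lognorm(s=sigma, scale=np.exp(mu))
# 5% of the probability mass lies above 4.5 degC by construction,
# so reading 4.5 degC as a hard worst case compresses the tail away.
print(f"P(S > 4.5 degC) = {S.sf(4.5):.2f}")
```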

  9. The certainty trough (MacKenzie, 1990)

  10. Insights on uncertainty
  • More research tends to increase uncertainty: it reveals unforeseen complexities
  • Complex systems exhibit irreducible uncertainty (intrinsically or practically)
  • Omitting uncertainty management can lead to scandals, crises, and loss of trust in science and institutions
  • In many complex problems, unquantifiable uncertainties dominate the quantifiable uncertainty
  • High quality ≠ low uncertainty: quality relates to fitness for function (robustness, precautionary principle)
  • A shift in focus is needed, from reducing uncertainty towards reflective methods to explicitly cope with uncertainty and quality

  11. Clark & Majone, 1985: Critical Appraisal of Scientific Inquiries with Policy Implications. 1. Criticism by whom? Critical roles:
  • Scientist
  • Peer group
  • Program manager or sponsor
  • Policy maker
  • Public interest groups

  12. Clark & Majone, 1985. 2. Criticism of what? Critical modes:
  • Input: data, methods, people, competence, (im)maturity of the field
  • Output: problem solved? hypothesis tested?
  • Process: good scientific practice, procedures for review, documentation, etc.

  13. (Clark & Majone, 1985)

  14. Clark & Majone, 1985. Meta-quality criteria:
  • Adequacy: reliability, reproducibility, uncertainty analysis, etc.
  • Value: internal (how well is the study carried out?); external (fitness for purpose, fitness for function); personal (subjectivity, preferences, choices, assumptions, bias)
  • Effectiveness: does it help to solve practical problems?
  • Legitimacy: numinous (natural authority, independence, credibility, competence); civil (agreed procedures)

  15. KQA tools
  • Quantitative methods: sensitivity analysis / uncertainty analysis (SA/UA), Monte Carlo
  • Uncertainty typology (matrix)
  • Quality assessment: pedigree analysis (NUSAP), assumption analysis
  • Model Quality Checklist
  • MNP Uncertainty Guidance
  • Extended peer review
  • Argumentative Discourse Analysis (ADA); Critical Discourse Analysis (CDA)
  • ...
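As an illustration of the SA/UA entry, a crude global sensitivity analysis on the same toy dose model as in the Monte Carlo sketch above, apportioning output variance via squared correlation coefficients; this is only valid for near-linear models, and all distributions remain illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(seed=2)
N = 50_000

# Toy dose model inputs (assumed distributions)
X = {
    "concentration": rng.lognormal(0.0, 0.5, N),
    "intake":        rng.normal(2.0, 0.3, N),
    "bodyweight":    rng.normal(70.0, 10.0, N),
}
y = X["concentration"] * X["intake"] / X["bodyweight"]

# Squared correlation as a rough share of output variance per input
for name, x in X.items():
    r = np.corrcoef(x, y)[0, 1]
    print(f"{name:>13}: ~{100 * r**2:.0f}% of output variance")
```

Variance-based methods (e.g. Sobol' indices) would be the more rigorous choice for nonlinear models; the correlation shortcut is used here only to keep the sketch short.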

  16. NL Environmental Assessment Agency (RIVM/MNP) Guidance: Systematic reflection on uncertainty & quality in:

  17. Systematic reflection on uncertainty issues in: • Problem framing • Involvement of stakeholders • Selection of indicators • Appraisal of knowledge base • Mapping and assessment of relevant uncertainties • Reporting of uncertainty information

  18. Problem framing and context • Explore rival problem frames • Relevant aspects / system boundary • Typify problem structure • Problem lifecycle / maturity • Role of study in policy process • Uncertainty in socio-political context

  19. Type III error: assessing the wrong problem, by incorrectly accepting the false meta-hypothesis that there is no difference between the boundaries of a problem as defined by the analyst and the actual boundaries of the problem (Dunn, 1997). Context validation (Dunn, 1999): the validity of inferences that we have estimated the proximal range of rival hypotheses. Context validation can be performed through a participatory bottom-up process that elicits from scientists and stakeholders rival hypotheses on the causal relations underlying a problem, as well as rival problem definitions.

  20. What is the role of the assessment in the policy process? • ad hoc policy advice • to evaluate existing policy • to evaluate proposed policy • to foster recognition of new problems • to identify and/or evaluate possible solutions • to provide counter-expertise • other

  21. In different phases of the problem lifecycle, different uncertainties are salient

  22. Different problem types need different uncertainty management strategies

  23. Systematic reflection on uncertainty issues in: • Problem framing • Involvement of stakeholders • Selection of indicators • Appraisal of knowledge base • Mapping and assessment of relevant uncertainties • Reporting of uncertainty information

  24. Involvement of stakeholders • Identify relevant stakeholders • Identify areas of agreement and disagreement among stakeholders on the value dimensions of the problem • Recommend when to involve which stakeholders in the assessment process

  25. Roles of stakeholders • (Co-) definer of the problems to be addressed • What knowledge is relevant? • Source of knowledge • Quality control of the science (for instance: review of assumptions)

  26. Systematic reflection on uncertainty issues in: • Problem framing • Involvement of stakeholders • Selection of indicators • Appraisal of knowledge base • Mapping and assessment of relevant uncertainties • Reporting of uncertainty information

  27. Indicators • How well do the indicators used address key aspects of the problem? • Use of proxies • Alternative indicators? • Limitations of the indicators used? • Scale and aggregation issues • Controversies in science and society about these indicators?

  28. High uncertainty is not the same as low quality. Example: imagine the inference is Y = log(PI1/PI2), the logarithm of the ratio between two pressure-on-decision indices, PI1 and PI2. [Figure: frequency distribution of Y = log(PI1/PI2), split at Y = 0 into a region where incineration is preferred and a region where landfill is preferred.]
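A sketch of this construction, with hypothetical indices and assumed lognormal uncertainties; the sign convention linking Y > 0 to one of the two options is also an assumption:

```python
import numpy as np

rng = np.random.default_rng(seed=3)
N = 100_000

# Hypothetical pressure-on-decision indices with assumed uncertainty
PI1 = rng.lognormal(mean=0.1, sigma=0.4, size=N)
PI2 = rng.lognormal(mean=0.0, sigma=0.4, size=N)

Y = np.log(PI1 / PI2)

# Even with wide spread in Y, the *decision* can be robust if most of
# the probability mass falls on one side of the threshold Y = 0.
print(f"spread of Y (std)                = {Y.std():.2f}")
print(f"fraction of runs with Y > 0      = {np.mean(Y > 0):.2f}")
```

This is the slide's point in miniature: the distribution of Y may be wide (high uncertainty) while the sign of Y, which is what the decision hinges on, is stable (adequate quality for the function at hand).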

  29. High uncertainty is not the same as low quality, but... methodological uncertainty can be dominant (slide borrowed from Andrea Saltelli)

  30. Systematic reflection on uncertainty issues in: • Problem framing • Involvement of stakeholders • Selection of indicators • Appraisal of knowledge base • Mapping and assessment of relevant uncertainties • Reporting of uncertainty information

  31. Adequacy of the available knowledge base?
  • What are the strong and weak points in the knowledge base?
  • Use of proxies, empirical basis, theoretical understanding, methodological rigor, validation
  • NUSAP pedigree analysis
  • What parts of the knowledge are contested (scientific and societal controversies)?
  • Is the assessment feasible in view of the available resources (and what limitations does that imply)?

  32. Dimensions of uncertainty • Technical (inexactness) • Methodological (unreliability) • Epistemological (ignorance) • Societal (limited social robustness)

  33. Reliability intervals in the case of a normal distribution: ±1σ ≈ 68%, ±2σ ≈ 95%, ±3σ ≈ 99.7%
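These coverages can be verified numerically (to one decimal they are 68.3%, 95.4% and 99.7%); a quick check:

```python
from scipy import stats

# Coverage probability of mean +/- k*sigma under a normal distribution
for k in (1, 2, 3):
    p = stats.norm.cdf(k) - stats.norm.cdf(-k)
    print(f"mean +/- {k} sigma covers {100 * p:.1f}%")
# -> 68.3%, 95.4%, 99.7%
```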

  34. 95% confidence interval
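A toy computation of a 95% confidence interval for a sample mean, using synthetic data and the t-distribution (the data and sample size are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=4)
sample = rng.normal(loc=10.0, scale=2.0, size=25)   # toy measurements

m = sample.mean()
# Half-width: t critical value (97.5th percentile) times standard error
half = stats.t.ppf(0.975, df=len(sample) - 1) * stats.sem(sample)
print(f"95% confidence interval: {m:.2f} +/- {half:.2f}")
```

Note that such an interval only expresses technical (statistical) uncertainty; the methodological and epistemological dimensions from slide 32 are not captured by it.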

  35. NUSAP: Qualified Quantities • Numeral • Unit • Spread • Assessment • Pedigree (Funtowicz and Ravetz, 1990)
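One way to carry the five NUSAP qualifiers around together is a small record type; the field encoding below is an illustrative assumption, not a standard data format:

```python
from dataclasses import dataclass, field

@dataclass
class NusapQuantity:
    """A quantity qualified along the five NUSAP dimensions."""
    numeral: float            # N: the number itself
    unit: str                 # U: its unit
    spread: str               # S: e.g. "+/- 30%" or a percentile range
    assessment: str           # A: qualitative judgement of reliability
    pedigree: dict = field(default_factory=dict)  # P: criterion -> score

# Hypothetical example entry (values invented for illustration)
voc = NusapQuantity(
    numeral=25.0, unit="kt/yr", spread="+/- 30%", assessment="fair",
    pedigree={"proxy": 3, "empirical basis": 2,
              "theoretical understanding": 2, "method": 3},
)
```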

  36. NUSAP: Pedigree Evaluates the strength of the number by looking at: • Background history by which the number was produced • Underpinning and scientific status of the number

  37. Example pedigree matrix: parameter strength

  38. Example pedigree results, traffic-light analogy: <1.4 red; 1.4-2.6 amber; >2.6 green. This example is the case of VOC emissions from paint in the Netherlands, calculated from national sales statistics (NS) in 5 sectors (Ship, Building & Steel, Do It Yourself, Car refinishing, and Industry), together with assumptions on additional thinner use (Th%), a lump sum for imported paint, and an assumption for its VOC percentage. See the full research report on www.nusap.net for details.
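A minimal sketch of the traffic-light mapping, using the thresholds from this slide; scoring pedigree criteria on a 0-4 scale is the common NUSAP convention, but aggregating by a plain average is an assumption here:

```python
def pedigree_strength(scores: dict) -> float:
    """Average of pedigree criterion scores (0 = weak ... 4 = strong)."""
    return sum(scores.values()) / len(scores)

def traffic_light(strength: float) -> str:
    # Thresholds as given on the slide: <1.4 red, 1.4-2.6 amber, >2.6 green
    if strength < 1.4:
        return "red"
    return "amber" if strength <= 2.6 else "green"

# Hypothetical criterion scores for one parameter
scores = {"proxy": 3, "empirical basis": 2,
          "theoretical understanding": 2, "method": 3}
s = pedigree_strength(scores)
print(f"strength = {s:.1f} -> {traffic_light(s)}")   # 2.5 -> amber
```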

  39. Example: Air Quality

  40. Similar to a patient information leaflet alerting the patient to the risks and unsuitable uses of a medicine, NUSAP enables the delivery of policy-relevant quantitative information together with the essential warnings about its limitations and pitfalls. It thereby promotes responsible and effective use of science in policy processes.

  41. Systematic reflection on uncertainty issues in: • Problem framing • Involvement of stakeholders • Selection of indicators • Appraisal of knowledge base • Mapping and assessment of relevant uncertainties • Reporting of uncertainty information

  42. Mapping and prioritization of relevant uncertainties • Highlight uncertainties in typology relevant to this problem • Set priorities for uncertainty assessment • Select uncertainty assessment tools from the tool catalogue

  43. Typology of uncertainties
  • Location
  • Level of uncertainty: statistical uncertainty, scenario uncertainty, recognised ignorance
  • Nature of uncertainty: knowledge-related uncertainty, variability-related uncertainty
  • Qualification of knowledge base (pedigree): weak, fair, strong
  • Value-ladenness of choices: small, medium, large
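A sketch of one row of such an uncertainty matrix as a data structure; the encoding is an illustrative choice, but the allowed values follow the typology above and the locations on the next slide:

```python
from dataclasses import dataclass

@dataclass
class UncertaintyItem:
    """One row of the uncertainty matrix (illustrative encoding)."""
    location: str         # e.g. "model parameters", "data", "context"
    level: str            # "statistical" | "scenario" | "recognised ignorance"
    nature: str           # "knowledge-related" | "variability-related"
    pedigree: str         # "weak" | "fair" | "strong"
    value_ladenness: str  # "small" | "medium" | "large"

# Hypothetical entries for a toy assessment
matrix = [
    UncertaintyItem("model parameters", "statistical",
                    "knowledge-related", "fair", "small"),
    UncertaintyItem("context", "recognised ignorance",
                    "knowledge-related", "weak", "large"),
]
```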

  44. Locations of uncertainties:
  • Context: ecological, technological, economic, social and political representation
  • Expert judgement: narratives, storylines, advice
  • Model: model structure, technical model, model parameters, model inputs
  • Data: measurements, monitoring data, survey data
  • Outputs: indicators, statements

  45. Tool catalogue. For each tool:
  • Brief description
  • Goals and use
  • What sorts and locations of uncertainty does this tool address?
  • What resources are required to use it?
  • Strengths and limitations
  • Guidance on application & complementarity
  • Typical pitfalls
  • References to handbooks, example case studies, websites, experts, etc.

  46. Tool catalogue
  • Sensitivity analysis
  • Error propagation equations
  • Monte Carlo analysis
  • Expert elicitation
  • Scenario analysis
  • NUSAP
  • PRIMA
  • Checklist for model quality assistance
  • Assumption analysis
  • ...
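The "error propagation equations" entry refers to first-order (Gaussian) propagation, sigma_y^2 = sum_i (df/dx_i)^2 * sigma_i^2 for independent inputs; a minimal numerical sketch, reusing the illustrative dose model from earlier (derivatives by finite differences):

```python
import numpy as np

def propagate(f, x, sigma, eps=1e-6):
    """First-order error propagation for independent inputs:
    sigma_y^2 = sum_i (df/dx_i)^2 * sigma_i^2."""
    x = np.asarray(x, dtype=float)
    var = 0.0
    for i in range(len(x)):
        dx = np.zeros_like(x)
        dx[i] = eps
        dfdx = (f(x + dx) - f(x - dx)) / (2 * eps)  # central difference
        var += (dfdx * sigma[i]) ** 2
    return np.sqrt(var)

# Toy dose model: y = concentration * intake / bodyweight
f = lambda v: v[0] * v[1] / v[2]
print(propagate(f, x=[1.0, 2.0, 70.0], sigma=[0.5, 0.3, 10.0]))
```

First-order propagation assumes near-linearity around the nominal point; for strongly nonlinear models the Monte Carlo approach sketched earlier is the safer tool.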

  47. Systematic reflection on uncertainty issues in: • Problem framing • Involvement of stakeholders • Selection of indicators • Appraisal of knowledge base • Mapping and assessment of relevant uncertainties • Reporting of uncertainty information
