Implementing Global Climate Change Research: Assessing the Challenge of Defining and Evaluating Decision Support
Rebecca J. Romsdahl, PhD – Earth System Science & Policy Program, UND – rebecca.romsdahl@und.edu
Introduction

Decision support (DS) is the buzzword concept for implementing global climate change research; the aim is to develop better science-based resources to aid decision-making under the uncertain conditions of climate variability and change [1]. Survey responses show there is disagreement on how to define DS, who it should involve, what decision-makers need, and how to evaluate its effectiveness; some responses also suggest that some researchers do not support the concept. This author argues that DS should be broadly defined and evaluated as a collaborative process.

Research Process

To better understand how DS is being defined and evaluated, global climate change experts within the US Climate Change Science Program (CCSP), see Table 1, were invited to participate in an Internet-based survey; 35 valid responses were analyzed.

The Sample Population

Survey participants are senior-level Federal agency representatives to CCSP working groups. As representatives, they bring an overview of their agency's research and development portfolio and some level of budgetary authority.

Table 1: The Sample Population
• NOAA (n=8)
• NASA (n=7)
• Dept. of Agriculture (n=6)
• USGS (n=6)
• EPA (n=4)
• Dept. of Energy (n=3)
• Dept. of Interior (n=3)
• Dept. of Health & Human Services (n=2)
• NSF (n=2)
• Smithsonian Institution (n=2)
• Dept. of Transportation (n=2)
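The survey figures quoted in the Results & Discussion section below are simple shares of the 35 valid responses. As an illustrative cross-check that is not part of the original poster, the short Python sketch below tallies the Table 1 agency counts and re-derives the quoted whole-percent figures from their raw counts; truncating to whole percent reproduces the values as reported.

# Illustrative sketch (not from the original poster): tally the Table 1 agency
# counts and re-derive the whole-percent figures quoted in Results & Discussion
# from their raw counts, using n = 35 valid responses as stated in the poster.

table_1 = {
    "NOAA": 8, "NASA": 7, "Dept. of Agriculture": 6, "USGS": 6, "EPA": 4,
    "Dept. of Energy": 3, "Dept. of Interior": 3,
    "Dept. of Health & Human Services": 2, "NSF": 2,
    "Smithsonian Institution": 2, "Dept. of Transportation": 2,
}
print(f"Table 1 lists {sum(table_1.values())} agency representatives "
      f"across {len(table_1)} agencies.")

N_VALID = 35  # valid survey responses analyzed

quoted_counts = {
    "define DS by referencing 'tools'": 16,
    "include decision-makers in their definition": 4,
    "agree DS is a new label for long-established work": 24,
    "report their program does not always evaluate DS effectiveness": 22,
}
for statement, count in quoted_counts.items():
    pct = count * 100 // N_VALID  # truncation matches the poster's quoted figures
    print(f"{count}/{N_VALID} ({pct}%) {statement}")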
Results & Discussion

When asked how they define DS based on their experiences, 16 respondents (45%) reference 'tools,' only 4 (11%) include decision-makers, and none reference collaboration or uncertainties, see Figure 1. In addition, 24 (68%) agree with the statement: 'Decision support is a new label for a long established line of work.'

"'Pure' science researchers will think it is not their job to worry about what happens to the research activity once they have finished a project…. If this is true, then we need an intermediary 'technology/research' transfer process to move from pure research to decision support." - Survey Respondent

Figure 1a: Decision Support Definition Continuum. The continuum runs from Linear Knowledge Transfer through Climate Extension to Collaborative Process; moving toward the collaborative end, the emphasis shifts from experimentation, data collection, and technology tools toward uncertainty levels, socioeconomics, dialogue, and implementation.

Rather than continue to define and evaluate DS via the predominant linear model of knowledge transfer, see Figure 1, in which products or information are handed down from the expert to the decision-maker, a process-based model of DS, see Figure 2, would recognize the relevant uncertainties in global climate change data and encourage dialogue on how to incorporate uncertainty into the decision process. This type of model would also encourage evaluation and feedback to help assess the effectiveness of DS activities in addressing climate variability and change.

When asked to list examples of 'effective DS activities' related to their program, nearly half of respondents describe projects involving collaboration between researchers and decision-makers. Many of the described projects are similar to the collaborative problem solving design institutionalized in the US Environmental Protection Agency [2], see Table 2.

Table 2: Collaborative Problem Solving Design
• Identify priorities
• Get the right groups to the table
• Provide education / training as needed
• Determine roles of each organization
• Frame a mutually agreeable goal
• Manage the process to be fair, objective, timely
• Decide how to document / evaluate outcomes

When asked if their 'program always evaluates the effectiveness of DS products,' 22 respondents (62%) indicate 'no.' References to ad-hoc evaluation, however, include both formal and informal strategies, see Table 3.

Table 3: Ad-hoc DS Evaluation
• Benchmarking
• Case-by-case analysis
• Reports/publications
• Stakeholder feedback
• Surveys

Conclusions

Survey Results in Summary:
• Collectively, responses highlight continued uncertainty about the DS concept.
• Responses indicate that although DS constitutes a variety of products and activities, ranging from interactive climate-modeling software to extension-like advice and training, a common definition is needed.
• Responses also indicate that evaluation of DS products and activities is essential, but there is disagreement on what should be evaluated (just the products and/or DS implementation), when and how to evaluate effectiveness, and who should conduct the evaluation: researchers, decision-makers, or somebody else.

References

1. Climate Change Science Program. 2003. Strategic Plan, p. 111.
2. Belefski, M. 2006. "Collaboration at the U.S. Environmental Protection Agency: An Interview with Two Senior Managers." Public Administration Review, December, p. 144.
Figure 1 adapted from: Pyke, C. et al. (in press) "Effective Decision Support for Climate Change Impact Assessment and Adaptation."
Figure 2 adapted from: Horsefall, F. and H. Hill. 2004. "NOAA Climate Transition Program: Conceptual Paper." Available at: http://www.climate.noaa.gov/cpo_pa/nctp/nctp.pdf