Use of Impact Evaluation for Organizational Learning and Policy Influence: The Case of International Agricultural Research AfrEA/NONIE/3ie Conference Perspectives on Impact Evaluation March-April 2009
Overview/Introduction • Use and non-use of impact evaluation: the CGIAR case Douglas Horton & Ronald Mackay, Independent evaluation consultants • Towards a broader range of impact evaluation methods for collaborative research: report on a work in progress Patricia Rogers, Royal Melbourne Institute of Technology & Jamie Watts, CGIAR Institutional Learning and Change Initiative • Role of Impact Evaluation in Moving from Research into Use Sheelagh O’Reilly, Team Leader, Impact Evaluation, Research into Use Programme
Programme • Combined presentation • Reaction from Robert Chambers, Discussant • Q&A and Discussion
Use and Non-Use of Impact Evaluation: the CGIAR Case Douglas Horton & Ronald Mackay
Overview • CGIAR has a long history of producing high-quality impact evaluations • However, there has been limited use of findings: • To influence donor / investor decisions & resource allocations • To promote learning & program improvement • Use may be enhanced somewhat through better planning and communication, but there remain some inherent problems with all disciplinary-oriented evaluation approaches • Other ways of evaluating and fostering learning are needed for social / institutional learning and for policy and program improvement
History of IE in the CGIAR • High estimated returns to investment in ag. research were key to establishing the CGIAR • Hundreds of economic impact assessments report high rates of return • CGIAR economists have contributed significantly to improving IA theory & methods
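To make the "rates of return" reported by these economic impact assessments concrete, here is a minimal sketch of how an internal rate of return (IRR) on a research investment can be computed. The cash flows are hypothetical, chosen only for illustration; they are not drawn from any CGIAR study.

```python
# Hedged illustration: computing an internal rate of return (IRR) for a
# research investment, the summary statistic used in many economic impact
# assessments. All figures below are hypothetical.

def npv(rate, cash_flows):
    """Net present value of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=0.0, hi=10.0, tol=1e-8):
    """IRR by bisection on NPV; assumes NPV crosses zero once in [lo, hi]."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid  # NPV still positive: true IRR is higher
        else:
            hi = mid  # NPV negative: true IRR is lower
    return (lo + hi) / 2

# Hypothetical stream: 5 years of research costs, then 10 years of
# adoption benefits (millions of dollars per year).
flows = [-10] * 5 + [15] * 10
rate = irr(flows)
print(f"Estimated IRR: {rate:.1%}")
```

The IRR is the discount rate at which discounted benefits exactly offset discounted costs; "high rates of return" in the impact assessment literature means this rate comfortably exceeds plausible costs of capital.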
From the Studies … “CGI [crop genetic improvement] programmes have been outstanding investments. Few investments can come close to achieving the poverty reduction per dollar expended that the CGI programmes evaluated in this volume have realized… Any reduction in support to agricultural projects, in particular to projects designed to improve productivity, will seriously limit and hamper efforts to reduce mass poverty.” (Evenson & Rosegrant, 2003: 496)
The Emerging Paradox “Concern is growing within the donor community relating to the effectiveness of existing impact assessment research in guiding international agricultural research... donor support for agricultural research is declining, despite the credible assessments showing that investment in this area indeed has had high return.” (Gregersen & Morris, 2003: vii) “There is little apparent relationship between impact assessment findings and the subsequent allocation patterns of donors… those areas of research with the highest levels of assessed benefits often suffer from declining funding, while unproven areas of research and non-research investment receive rising funding shares” (Raitzer & Winkel, 2005: ix)
Figure: Funding to International Agricultural Research, 1961-2001, in millions of 2005 US dollars, showing total and unrestricted funding (Source: ASTI Initiative). [Chart not reproduced.]
What is Going On Here? • Good (impact evaluation) research does not necessarily lead to policy or programme support. • Many factors may affect policy and management decisions more than (evaluation) information. • For any kind of evaluation to have an impact, use needs to be cultivated from the beginning. • One type of IE may not meet all needs.
Some factors influencing use • Engagement of intended users • The 4 “I’s” • Types and levels of use • Attention to use
Engagement of Potential/Intended Users • Donors & development agencies • Policymakers • Center / program managers • Researchers • Peers • Constituents / intended beneficiaries
Why engage users? Engagement fosters both use of findings and "process use," which in turn influence decision making.
Four “Is” • Interests • Ideologies • Institutions • Information (Weiss, 1998)
Types and Levels of Use A matrix crossing type of use with decision level: • Types of use: direct/instrumental, indirect/conceptual, symbolic • Decision levels: strategic, structural, operational
Attention to Communication • Multiple forms of communication • Match format to audience • Long-term involvement • Integrate evaluation into program • Guard against standardization • Involve stakeholders • Create context for dialogue
Suggestions • View and manage IE as "evaluation," not as "research." • Plan and manage evaluations to foster specific uses. • Target specific policy and programme-related issues. • Explain how programmes or projects attain results in their context. • Use mixed methods from various disciplines as needed to respond to evaluation questions. • Judge evaluations by their usefulness, practicality, propriety, and the accuracy of their data and results.
Towards a broader range of impact evaluation methods… Why? Agricultural research has expanded into a broader range of areas • From crop improvement to higher-level development goals The role of the researcher in the agricultural innovation system is changing • From centre of excellence to a collaborative and capacity-building approach • From transfer of technology to demand-driven, locally relevant solutions Traditional evaluation designs may not always be feasible or appropriate
Increasingly diverse portfolio • Well represented in the IA portfolio: genetic improvement of major crops • Somewhat represented: biological control of pests • Under-represented: • crop and integrated pest management • livestock • natural resources management • post-harvest technologies • policy and gender research
Increasingly collaborative research Source: Douthwaite 2004.
Increasing demand to engage intended end-users: • Increase researchers’ understanding of local issues to improve the relevance of research to local conditions • Increase uptake and appropriate adaptation • Incorporate local knowledge into research • Co-production of knowledge by researchers and community members • Develop end-users’ capacity to build and use knowledge for adaptive management
Spectrum of participation • Conventional research: scientists make the decisions alone, without organized participation by end-users. • Contractual: scientists contract with end-users to participate. • Consultative: scientists make decisions, but with organized communication with end-users. • Collaborative: decision-making authority is shared between end-users and scientists. Neither party can revoke or override a joint decision. • Collegial: end-users make decisions collectively, either in a group process or through individual end-users who are in organized communication with scientists. • End-user experimentation: end-users make the decisions without organized communication with scientists. (adapted from Lilja and Ashby) Scope of this work
Conceptualising translational research [Nobel laureate Sydney] Brenner is one of many scientists challenging the idea that translational research is just about carrying results from bench to bedside, arguing that the importance of reversing that polarity has been overlooked. “I’m advocating it go the other way,” Brenner said.
Simple, Complicated, Complex (diagram from Zimmerman 2003) • Simple: Following a Recipe • The recipe is essential • Recipes are tested to assure replicability of later efforts • No particular expertise required; knowing how to cook increases success • Recipes produce standard products • Certainty of the same results every time • Complicated: A Rocket to the Moon • Formulae are critical and necessary • Sending one rocket increases assurance that the next will be OK • High level of expertise in many specialized fields, plus coordination • Rockets are similar in critical ways • High degree of certainty of outcome • Complex: Raising a Child • Formulae have only a limited application • Raising one child gives no assurance of success with the next • Expertise can help but is not sufficient; relationships are key • Every child is unique • Uncertainty of outcome remains
The need for a broader range of methods • Complement existing methods for impact evaluation (raising issues of multidisciplinarity and mixed methods) • Identify, describe, measure, and value impacts • Assess causal inference in collaborative and/or participatory projects • Support the use of impact evaluation for learning and adaptive management
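As one illustration of assessing causal inference outside the classical experimental design, here is a minimal difference-in-differences sketch, a quasi-experimental estimator an impact evaluation might use when randomization is not feasible. All numbers are hypothetical, not drawn from any CGIAR evaluation.

```python
# Hedged sketch: a minimal difference-in-differences (DiD) estimate, one
# quasi-experimental design for causal inference in impact evaluation.
# The outcome values below are hypothetical.

# Mean outcome (e.g. yield, t/ha) before and after an intervention,
# for adopting (treated) and non-adopting (control) villages.
treated_before, treated_after = 2.0, 3.1
control_before, control_after = 2.1, 2.5

# DiD: the change among the treated minus the change among the controls,
# netting out the common trend both groups would have experienced anyway.
did = (treated_after - treated_before) - (control_after - control_before)
print(f"Estimated impact: {did:.2f} t/ha")
```

The design rests on the "parallel trends" assumption: absent the intervention, both groups would have changed by the same amount, so any excess change among adopters is attributed to the intervention.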
Entry Points for learning and change • Knowledge, skills & attitudes: people need to want to learn and to know how to engage partners in co-creation of knowledge • Management systems & practices: leaders learn, value learning, and promote learning in concrete ways; communication channels facilitate easy access to information and knowledge sharing; systems and structures facilitate learning • Organizational culture: supports and rewards reflection and learning, and the application of lessons • External environment: is conducive to reflection and learning from experience
Visualising the connection between laboratory research and practice research (Tabak, 2005, National Institute of Dental and Craniofacial Research, National Institutes of Health)
Capacity for organizational learning • Systematically gathering information • Making sense of information • Sharing knowledge and learning • Drawing conclusions and developing guidelines for action • Implementing action plans • Institutionalizing lessons learned and applying them to new and on-going work
Research Into Use Programme How can innovation-system approaches promote and facilitate greater use of research-based knowledge? • Maximise the poverty-reducing impact of previous research on natural resources • Develop understanding of how innovation-system approaches contribute to reducing poverty whilst ensuring effective and efficient management of natural resources • Challenges to impact evaluation: • the need to identify critical success factors • coherent approaches for spotting "potential winners" among research outputs in the move from research into innovation • mainstreaming the use of new technologies that contribute to poverty reduction and economic growth