Meteo Mali Agrometeorological Program Evaluation: Preliminary Report
Edward R. Carr, Department of Geography, University of South Carolina
Assessment Background • June 2011 meeting in Dakar • Demand-driven assessment • Lessons learned/good practices • Scaling up
Assessment Design Three components • Science assessment (IRI) • Institutional assessment (CCAFS) • Field assessment (CCAFS/IER/University of South Carolina)
Science Assessment Goals • What climate information is provided to farmers? • What is the scientific basis for this information? • What is the translation and dissemination process? • What opportunities are there for improving the quality and relevance of products? • What challenges have been encountered in satisfying specific user needs?
Science Assessment February 2012 • Meetings with Mali Meteo • Consultations with AGRHYMET and ACMAD • Review of methods and documentation • Integration with field assessment findings
Science Assessment Draft Assessment prepared at the end of July 2012 • Example challenges • Difficulty of providing reliable local-scale forecasts (onset of the rainy season, timing of possible dry spells) • Need for monthly forecasts • Lack of verification information [Figure: Mali's network of meteorological stations (raingauge, agroclimate, and synoptic stations)]
Science Assessment Draft Assessment • Example opportunities • Prospects for improved downscaling • Merging satellite and station data • Using Global Producing Centre (GPC) model outputs to strengthen seasonal and monthly forecasts [Figure: Distribution of known GLAM villages]
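As a rough illustration of what "merging satellite and station data" can involve, the sketch below applies a simple mean-field bias correction: a satellite rainfall field is rescaled so that, on average, it agrees with rain-gauge observations. The technique choice, station values, and grid are assumptions for illustration, not Mali Meteo's actual method or data.

```python
# Minimal sketch of one gauge-satellite merging approach (mean-field bias
# correction). All values below are invented for illustration only.
import numpy as np

# Hypothetical dekadal rainfall (mm): satellite estimate on a small grid
satellite = np.array([
    [12.0, 15.0, 9.0],
    [20.0, 18.0, 11.0],
    [25.0, 22.0, 14.0],
])

# Hypothetical rain-gauge observations and the grid cells they fall in
gauge_values = np.array([14.0, 28.0, 18.0])     # mm at three stations
gauge_cells = [(0, 1), (2, 0), (2, 2)]          # (row, col) of each station

# Mean-field bias: ratio of gauge mean to the collocated satellite mean
sat_at_gauges = np.array([satellite[r, c] for r, c in gauge_cells])
bias = gauge_values.mean() / sat_at_gauges.mean()

# Merged field: satellite pattern rescaled to agree with gauges on average
merged = satellite * bias
print(f"bias factor = {bias:.2f}")
print(merged.round(1))
```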
Institutional Assessment Learn what institutional factors contributed to program success • Narrative program history • Identification of product development process • Mapping of the changing flow of information, products, and resources in the program
Institutional Assessment June – November 2012 • Responses from 12 informants • Follow-ups ongoing • Draft document prepared
Institutional Assessment Example draft findings • Coordinating group was highly interdisciplinary but informal • Continuous project funding allowed time to learn • Opportunities: broader focus for information (including livestock and fisheries); formalized frameworks that entrench and support the interdisciplinarity of the program
Field Assessment Goal • Identify current impacts of the program on participants • Explain the causes of these impacts • Both extraordinarily difficult to do post-hoc
Field Assessment January 2012 – March 2012 • 36 villages • 18 GLAM, 18 control • 144 focus groups • 720 interviews • Men and women • Young and old
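For readers checking the arithmetic of the sampling design, here is a back-of-envelope sketch. The per-village breakdown (four focus groups per village, five interviews per focus group) is an assumption that is merely consistent with the slide's totals, not a stated design detail.

```python
# Back-of-envelope check of the field assessment sampling structure.
# The per-group breakdown is assumed, not taken from the slides.
villages = 18 + 18                    # GLAM villages plus control villages
groups_per_village = 2 * 2            # men/women x young/old
focus_groups = villages * groups_per_village
interviews = focus_groups * 5         # assumed 5 interviews per focus group

print(villages, focus_groups, interviews)   # 36, 144, 720 match the slide
```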
Field Assessment Broad assessment • Livelihoods practices • Agricultural activities • Engagement with NGOs • Engagement with the Agromet program
Field Assessment Analysis ongoing • Over 430,000 data points • Validation of controls (no baselines; program duration too long to reconstruct them retrospectively) • Identification of groupings for analysis
Field Assessment Initial findings • Opportunities to build on end-user delivery • Opportunities to better target end-user needs, both current and in preparation for future needs • Opportunities to expand the user base, which is currently heavily focused on younger men
Field Assessment Initial findings • Suggestions of impact • Differential numbers of crops grown and varieties used between those who use agromet data and those who do not • Varies by grouping/agroecological zone • Complex impact by crop
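To make the crop-diversity comparison concrete, the sketch below shows the kind of tabulation implied here: mean number of crops grown, split by agromet use and agroecological zone. Column names, zones, and values are hypothetical, not the actual survey data.

```python
# Illustrative sketch of the comparison behind "differential numbers of
# crops grown". All data below are invented for illustration.
import pandas as pd

survey = pd.DataFrame({
    "zone":         ["Sahelian", "Sahelian", "Sudanian", "Sudanian",
                     "Sahelian", "Sudanian"],
    "uses_agromet": [True, False, True, False, True, False],
    "n_crops":      [4, 3, 6, 5, 5, 4],
})

# Mean number of crops grown, split by zone and by agromet use
summary = survey.groupby(["zone", "uses_agromet"])["n_crops"].mean().unstack()
print(summary)
```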
Field Assessment Limitations • Will have difficulty talking about yield impact • Will have difficulty establishing causality for impact Addressing the Limitations • Re-running the survey in February-March 2013 • Qualitative work in selected villages in May-July 2013
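One way the 2012 and 2013 survey waves and the GLAM/control split could be combined to get closer to causal claims is a difference-in-differences comparison. This is a hedged sketch of that idea, not the assessment team's stated analysis plan, and the numbers are invented.

```python
# Hedged sketch: with two survey waves and GLAM/control villages, a simple
# difference-in-differences estimate is one way to approach causality.
# Not the assessment's stated method; values are invented.
import pandas as pd

waves = pd.DataFrame({
    "village_type": ["GLAM", "GLAM", "control", "control"],
    "wave":         [2012, 2013, 2012, 2013],
    "mean_crops":   [4.2, 4.8, 3.9, 4.1],
})

pivot = waves.pivot(index="village_type", columns="wave", values="mean_crops")
change = pivot[2013] - pivot[2012]              # within-group change over time
did = change["GLAM"] - change["control"]        # difference-in-differences
print(f"DiD estimate: {did:.2f} additional crops")
```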
Coming soon… • Integration of science and field assessments: will look at science constraints and opportunities in the context of end-user demands • Rigorous assessment of impact at the end-user level • Integrated lessons learned and good practices: connecting science, institutional context, and end-user impact