
Better, Faster Results: Supporting Learning for Multiple Audiences

Presentation Transcript


  1. Better, Faster Results: Supporting Learning for Multiple Audiences. THE EVALUATION ROUNDTABLE. The Rockefeller Foundation, New York City, September 27-28, 2017.

  2. Roundtable Theme and Context

  3. The philanthropic sector and evaluation field are talking a lot about learning.

  4. Responsibility for ensuring foundations learn generally falls to evaluation staff. Tasks evaluation staff are responsible for:
     • Providing research or data to inform grantmaking strategy
     • Evaluating foundation initiatives or strategies
     • Refining grantmaking strategy during implementation
     • Developing grantmaking strategy
     • Designing and/or facilitating learning processes or events within the foundation
     • Evaluating individual grants
     • Compiling and/or monitoring metrics to measure foundation performance
     • Designing and/or facilitating learning processes or events with grantees or other external stakeholders
     • Improving grantee capacity for data collection or evaluation
     • Conducting/commissioning satisfaction/perception surveys (of grantees or other stakeholders)
     • Disseminating evaluation findings externally
     Source: Center for Effective Philanthropy and Center for Evaluation Innovation (2016). Benchmarking foundation evaluation practices.

  5. There are many audiences whose learning needs support: board, executive leadership, program staff, grantees, beneficiaries, and external actors. We need to support learning within groups, as well as across them.

  6. Ensuring learning happens across multiple audiences is challenging with existing staffing and resourcing. Evaluation staffing as a percent of program staffing: 1 evaluation staff person for every 10 program staff. Source: Center for Effective Philanthropy and Center for Evaluation Innovation (2016). Benchmarking foundation evaluation practices.

  7. While the sector is talking a lot about learning, we lack clarity on what it means and what it takes, which ultimately risks it meaning nothing. Our goals are to:
     • Get clarity about what high-quality, actionable, sustained learning means and requires.
     • Help position you to be effective leaders and supporters of high-quality learning across multiple audiences.

  8. Benchmarking Foundation Practices on Learning

  9. We wanted to better understand what is happening currently in philanthropy on learning, and how that is affecting your work. So we benchmarked foundation learning practices in 2017. We asked about:
     • priority audiences
     • relationship to evaluation
     • resourcing and support
     • tactics
     • indicators of success

  10. We interviewed evaluation and learning leaders from 46 foundations. (Foundation data based on the most recent available from The Foundation Center; unknown for ELMA Philanthropies.)

  11. Philanthropy is committing more explicitly to learning. Almost three-fourths said supporting learning is a high priority for their role, and “learning” was included in 59% of interviewees’ titles.

  12. This trend is fueled by different motivations:
      • Alternative to Evaluation: positioned as different from formal evaluation that is linked primarily to accountability.
      • Remedy for Evaluation Use: frustrated with a lack of evaluation use, learning helps connect data to action.
      • Philanthropic Best Practice: professional and high-functioning foundations demonstrate that they learn.
      • Adaptation as Key to Strategy: especially for foundations with emergent strategies that are deeply rooted in systems.

  13. In general, we have a shared notion of what learning entails: a cycle of questions, data collection, analysis, and application. Also embodied as: What? So what? Now what?
      For most, learning is NOT just:
      • Knowledge capture
      • Dissemination/information exchange
      • Training and technical assistance

  14. Foundations think differently about the relationship between evaluation and learning.
      • 54% Lead with Learning: the goal is to ensure learning occurs, with evaluation as one possible input.
      • 33% Lead with Evaluation: the goal is to make evaluation useful and ensure people learn from it.
      • 13% Unsure.

  15. As a result, foundations tend to emphasize different approaches for supporting learning.
      Build Capacity for Learning
      • Approach: Integrate learning into the work.
      • Goal: Build skills and know-how that reinforce positive learning habits.
      • Typical activities: Integrate learning into grantmaking processes; coach/train others on learning cycle elements; create incentives for learning.
      Facilitate Application of Data
      • Approach: Ensure evaluative inputs are used.
      • Goal: Ensure the right questions are asked, data collection is robust, and analysis and application occur.
      • Typical activities: Design data/evaluation plans to ensure they lead with, and are fit for, purpose; disseminate data in accessible formats; plan and facilitate learning events or activities.

  16. Leaning too far in either direction has clear risks.
      • Lead with Evaluation: learning happens sporadically, for specific people, requires facilitation, and can be disconnected from day-to-day work.
      • Lead with Learning: learning that is not connected to systematically collected data is subject to biases.
      Balancing both approaches is ideal, but limited resources make balance challenging.

  17. The majority of foundations prioritize support of internal audience learning over external: 86% ranked program staff 1st or 2nd among their priority audiences.

  18. Tactics tend to concentrate on internal foundation requirements and events. We heard very little about efforts to help audiences learn together. What learning capacities and habits are we building through these activities?
      Board
      • Board materials: dashboards, evaluation results
      • Event design: retreats, site visits
      • Education: presentations, guest speakers
      Executive Leadership
      • Culture: norms, incentives, coaching, topics (DEI)
      • Strategy: briefings, reviews
      • Event design: staff meetings, retreats
      • Dissemination: products, presentations
      Program Staff (where foundations spend the most time)
      • Strategy support: theories of change, learning questions, metrics/data, portfolio reviews
      • Learning events: team meetings, BAR/AAR, convenings, retreats, lunches, FailFest
      • MEL support: grant monitoring, external evaluation support
      • Coaching/training: learning champions
      Grantees
      • MEL: capacity building/coaching, evaluation questions
      • Learning events: convenings/retreats

  19. While foundations identified possible signals that learning is happening, few actually monitor them:
      • Courage and candor
      • Buy-in of activities
      • Evidence of application
      • Demonstration of capacity
      • Demand for support
      No one thinks they have learning nailed, especially across multiple audiences.

  20. You have many questions for each other about how to support learning. There are excellent practices in the room to learn from:
      • To support program staff learning about grantees, one foundation developed a Salesforce-based “profiles platform” to track performance and impact, and successfully integrated its use into the grantmaking process.
      • After adopting an equity approach, another transferred all power for defining the learning agenda to grantees.
      • A third uses communications expertise to deeply diagnose the learning needs of external audiences.

  21. We lack clarity about the rationale behind our strategies and tactics for supporting learning, and consensus about what high-quality learning looks like and requires.

  22. To get that clarity, we need to better define what we should be aiming for to ensure learning happens: Inputs (staffing/$, priorities) → Levers/Tactics (tools, processes, programs, events, training, technology, rewards) → Outcomes → Impact (better, faster results).

  23. High-quality, actionable, sustained learning requires better theory building about the capacities and habits that lead to it. We posit that these capacities and habits should include the following.

  24. 1. Thinking that is visible and testable. We use tools to do this: causal model, causal map, impact pathways, intervention theory, intervention logic, investment logic, logic model, outcomes hierarchy, outcomes framework, program logic, program theory, strategy map, theory of action, theory of change. Common failings of these theories:
      • No actual theory, just description
      • Implausible theory, making huge leaps of reason and faith
      • Overly simplistic, often in seeming pursuit of clarity
      • Unclear causal linkages, assumes everything links to everything
      • Visible to only one audience
      We need to do better in order to learn.
      • Hypotheses: a supposition or proposed explanation about the relationship between our activities and their outcomes.
      • Assumptions: beliefs we hold, without proof, that back up our actions and are accepted as true or certain to happen.

  25. 2. Questions that are powerful and meaningful: questions that, if answered, would make a difference in the way we do the work. Powerful questions are:
      • Relevant to the people doing the work.
      • Clear about how the information can be used, and by whom, once answered.
      • Specific enough to provide sufficient direction.
      • Understandable and clear, not so complex they are open to widely different interpretations.
      • Developed in consultation with those involved in answering the question.
      • Likely to stimulate fresh or innovative thinking or approaches.
      • Worth the effort to answer.
      EFFECTIVE: How do we deploy our resources in a way that improves nonprofit adaptiveness and resilience? Test hypotheses about multi-year funding and general operating support.
      LESS EFFECTIVE: Did nonprofit resilience increase among grantees? What are we learning about scale?

  26. 3. Clear approaches or mechanisms for formally or informally gathering high-quality data: monitoring, research, evaluation. High-quality data gathering is:
      • Systematic: follows a plan or established method.
      • Rigorous: differentiates itself from observations, examines context.
      • Avoids cognitive traps: attends to biases in information gathering.

  27. 4. Relentless attention to causal inferences. We don’t get to application and action without attention to our causal inferences. Just paying attention to indicators isn’t enough.

  28. 5. Integrated opportunities for reflective analysis and adaptive decisions. How do you embed and institutionalize analysis and adaptation so everyone owns it? All stakeholders need this orientation: board, leadership, program staff, grantees. Source: William and Flora Hewlett Foundation (2016). A practical guide to outcome-focused philanthropy.

  29. Foundations are making different choices about what to prioritize or where to start:
      1. Thinking that is visible and testable.
      2. Questions that are powerful and meaningful.
      3. Clear approaches or mechanisms for formally or informally gathering high-quality data.
      4. Relentless attention to cause-and-effect.
      5. Integrated opportunities for reflective analysis and adaptive decisions.

  30. We are theory-building together.
      • Which of these practices resonate as critical to high-quality, actionable, sustained learning?
      • What else should be on this list?
      • What signals would you look for to know that these practices are high-quality and leading to learning?

  31. Implications for Foundation Practice

  32. “I’m trying to transfer the power for learning to others. Sometimes I do this using structure, sometimes process, and sometimes without people even knowing it’s happening.” --Ted Chen, Margaret A. Cargill Philanthropies

  33. You have different possible points of intervention:
      • Organization: culture, norms, infrastructure
      • Group: group processes, roles, participation, interactions
      • Individual: know-how, skills, perceptions, behaviors

  34. And different types of tactics or levers:
      • Tools: learning agendas, grants databases
      • Infrastructure: technology, physical space
      • Processes: re-engineering, events
      • Training/Coaching: facilitation/meetings, data collection
      • Resourcing: embedded staff, consultants
      • Rewards: recognition, financial incentives

  35. We also can use techniques to help embed the capacities and habits that lead to high-quality, actionable, sustained learning.
      • Social Norms: Our perception of what others are doing, or what others approve of, can be a strong influence on behavior.
      • Identity: Our choices are impacted by our perception of ourselves and our roles. Primed to consider a specific identity, we often behave in ways that fit with its associated stereotypes.
      • Quick Feedback: Rapid feedback loops that provide people with information about their actions in real time (or close to it) are influential in getting people to change or continue those actions.

  36. Closing: Parting Challenges

  37. To you as influential learning leaders in your foundations:
      • Continue to theory-build on what it takes to get high-quality, actionable, sustained learning.
      • Be explicit about your strategies to support learning. Share your strategies with others and get their input.
      • Evaluate your strategies, feed your own learning with testing and feedback, and share what you find.
      • Experiment with ways of helping audiences learn together.

  38. To you as an influential network of learning leaders: what can you do at the sector level, beyond the organization, group, and individual levels, to support more high-quality learning in philanthropy?
      • Come to some agreement about what foundations should aim for (we have been working on this here).
      • Be transparent with the field about what you are trying and learning.
      • Talk about and spread positive norms around learning.
      • Recognize foundations that excel on learning.

  39. Thank you to our supporters.
