
Finding the Value in Evaluation

This article discusses what Noyce directors should expect from program evaluation, including the importance of a good evaluation plan, tools for evaluation, data collection, and using evaluation results. It also highlights how a PI's perspective on evaluation can change and the value evaluation can bring to a project.

Presentation Transcript


  1. Finding the Value in Evaluation What should a Noyce director expect from program evaluation? July 2011 Susan Tucker, E&D Associates LLC; Davida Fischman, CSU San Bernardino

  2. Who are we? Davida Fischman: • Research mathematician turned mathematics educator • 17 years teaching pre-service (elementary and secondary) teachers; 10 years working with in-service teachers in small and large grants • Co-designer and coordinator of the CSUSB MA in Teaching Mathematics program. Susan Tucker: • 25 years as an educational program evaluator • 20 years teaching educational program evaluation and working in teacher education programs • Experience as a K-12 teacher, principal, associate superintendent, university professor, PI and grant director, and grant writer

  3. Who is in the room? Complete Mobile Survey #1 • How "old" is your project? • How many grant-funded projects have you managed? • What experience do you have in project evaluation? • What are your goals/expectations of this session? • TEXT to: 96625 • Message: E&D1

  4. Agenda • From the PI perspective • What is program evaluation? • Negotiating a good evaluation plan • Tools for evaluation • Data collection • Using evaluation results • Tips for PIs • Resources

  5. From a PI... A Changed Perspective • First thoughts: 10-12%?? What for?? • Then... • Using a new survey of Noyce Scholars and Mentors to modify next year's work • Using formative evaluation from an NSF MSP project to inform program decisions on an ongoing basis • The add-on evaluation tool for Noyce Scholars and Mentors will continue to be used to learn about participants' needs and to make adaptation decisions • Now using surveys for formative assessment in university classes as well • Today… • a much better understanding of the value of evaluation, and of ways it can improve the project.

  6. Define: “Program Evaluation” • Think and jot down notes: • What do you get from your Noyce evaluation today? • What more do you want? • Changing views of evaluation • Prove vs. Improve

  7. Warm up/Introductions/Review • Who are you? What do you do? • What disciplines, connections, experiences do you bring into evaluation? • How do they help you think about evaluation? • What previous backgrounds or experiences do you bring that might assist you in maximizing the value of evaluation? • How do you currently think about the role of evaluation in your Noyce project? • Strengths • Challenges/Frustrations

  8. Agree or Disagree about the characteristics of a good evaluator: • …is part facilitator, part researcher, part manager, and part program specialist • …is external to the program being evaluated • …designs an evaluation to determine if a program is well managed • …negotiates questions of relevance to multiple audiences—need-to-know vs. nice-to-know • …is collaborative in terms of designing and implementing the evaluation plan • …helps a project reflect on the quality of its objectives • …helps a project look at more than just whether its goals have been met • …helps staff develop a logic model that describes how a program’s components relate to each other and to the overall goals and objectives • …selects/aligns the evaluation model to complement the project’s logic model • …develops a plan to determine if a program is meeting its goals & objectives • …is concerned about how useful the evaluation is to project stakeholders

  9. Program Evaluation Defined? “Program evaluation is the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming” (Patton, 2002).

  10. Does your evaluation do this? 1. Prepare for the evaluation 2. Engage stakeholders 3. Identify the purpose of the evaluation 4. Negotiate the right questions 5. Co-design the evaluation 6. Select and adapt instrumentation 7. Collect the data 8. Analyze the data 9. Disseminate and use the results

  11. What Can an Evaluation Tell Us? • What is working • How to improve • Support for evidence-based decision-making • Are we achieving the results that we were hoping for? • Are results being produced as planned at all levels of our logic model? • Relevance • Do stakeholders care about what we are doing? Are we solving the problem? • Cost-Effectiveness • Are we getting value for money?

  12. Many models of evaluation…a few examples popular in education • Scientific-experimental models • Objectives-based research orientation • Management models • Stufflebeam’s CIPP model • Anthropological/Qualitative models • Stake’s responsive model • Looking at intended & unintended outcomes • Participant-oriented models • Fetterman’s empowerment model

  13. Qualities of an evaluator • Formal education (evaluation preferably) • Experience (with program improvement) • Evaluation philosophy complements management team and grant’s principles • Communication skills • Recommendations and resume • Understand culture of target population(s)

  14. Start with the Right Questions… • Include questions of relevance to stakeholders. • Explore what makes questions “relevant” • Determine what will be accepted as evidence in seeking answers to the questions • Examine whose voices are heard in the choice of questions and evidence

  15. Task 1: Are the following “good” evaluation questions? • What are the goals and activities of the teacher certification programs in which the Noyce grant is housed? • What is the “value added” of your Noyce program? • How do stakeholders perceive the Noyce Program and Noyce recipients? • What are the characteristics of the schools in which Noyce recipients teach? • What are the relationships between characteristics of the Noyce Program, types of Noyce recipients, and recipients’ plans to go into/stay in teaching and leadership roles? • What is the impact of Noyce on teacher: • recruitment? • retention? • effectiveness?

  16. Design the Evaluation Plan • Negotiate plan with multiple stakeholders • #1. Brainstorm what are important questions to ask • #2. Probe rationale/values behind each question • Build design appropriate to both evaluation questions and cultural context • Align evaluation questions to project logic model • #3. Worry last about how & what & when to measure • Seek culturally appropriate methods that combine qualitative and quantitative approaches. • Collect data at multiple points in time, extending the time frame of the evaluation as needed.

  17. Identify quality criteria…some examples • Persistence and success on a STEM trajectory from teacher prep to teaching jobs • Retention • Changes in teacher pedagogy and content knowledge • Willingness to teach STEM classes • Pursuit of advanced training in teaching in STEM-related areas

  18. Summative vs. Formative Evaluation

  19. Question Types & Data Techniques

  20. Match the program's logical structure with evaluation criteria [diagram] • Logical structure: goals and purposes; main objective; direct results; activities and means • Evaluation criteria: relevance, coherence, effectiveness, efficiency, impact, sustainability, replicability

  21. Questions about the Questions • What/whose perspectives are represented in the evaluation questions? • What other questions might have been posed? • Whose perspectives are accepted as credible evidence? • Credible to whom? • How well does the time frame in this study match the needs and rhythms of this context?

  22. Mobile survey: #2 • TEXT to: 96625 • Message: E&D1

  23. Agree or Disagree: a good evaluator… • …collects both qualitative and quantitative data • …has a firm grasp of educational research strategies • …collects data that are actionable—answers provide the information needed for decision-making • …is seen but not heard, except at the end of each year to write annual reports • …designs data-collection forms, procedures, and databases to capture and record the data collected • …is culturally competent and responsive to the unique needs of a project • …analyzes data in timely ways to help a project improve as it develops • …clearly distinguishes between descriptions and judgments when presenting findings • …makes recommendations to the program regarding ways to improve • …works with the project staff to disseminate the findings • …asks questions about sustainability and institutionalization early and often

  24. Collect the Data • Be holistic: • collect qualitative & quantitative data • Be responsive to cultural contexts • Tap into internal & external evaluation • Triangulate vs. one-shot data • Usually takes 3-6 months of eval planning & prep before data collection can begin

  25. Analyze the Data • Consider context/inputs and resources as a necessary component of interpretation. • Disaggregate data to examine diversity within groups • Examine outliers, especially successful ones • A “cultural” interpreter may be needed to capture nuances of meaning. • Stakeholder review panels can assist in accurate interpretation • Confirm accuracy of analysis & interpretation before making judgments
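
To make the "disaggregate the data" and "examine outliers" steps above concrete, here is a minimal sketch in Python (pandas). It is an illustration added to this transcript, not part of the original slides; the file name and the columns `subgroup` and `self_efficacy` are hypothetical stand-ins for whatever a project's survey export contains.

```python
# A minimal sketch of disaggregating survey results and flagging outliers.
# The file name and column names (subgroup, self_efficacy) are hypothetical.
import pandas as pd

df = pd.read_csv("scholar_survey.csv")

# Disaggregate: summarize the outcome within each subgroup, not just overall.
print(df.groupby("subgroup")["self_efficacy"].agg(["count", "mean", "std"]))

# Examine outliers (|z| > 2) within each subgroup, especially successful ones,
# as candidates for follow-up interviews or stakeholder review.
z = df.groupby("subgroup")["self_efficacy"].transform(
    lambda s: (s - s.mean()) / s.std(ddof=0)
)
print(df[z.abs() > 2])
```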

  26. Disseminate & Use the Results • Inform a wide range of stakeholders • Cultural sensitivity and responsiveness increase both the truthfulness and utility of the results • Involve/engage a variety of stakeholders • Find and train information users • The personal factor greatly accelerates evaluation use • Make use consistent with the purpose of the evaluation • Situate interpretation of results • Use results to make decisions about program improvement

  27. Task: • What data collection procedures are you considering in designing next year’s Noyce evaluation? • Existing data • New data collection plans

  28. Tools that help: • EX 1: Logic modeling • Planning tool • Flow diagram of your program with defined goals, inputs, outputs, and outcomes connected through causal links • Visual representation of what and how a program produces its outcomes

  29. Program Logic Model [diagram]: Planning → Implementation and Management • Resources → Activities or inputs → Products or outputs → Short-term (immediate) outcomes (knowledge, skills, and abilities; changes in the environment) → Mid-term outcomes (behavior change, application of new skills/tools, impacts on environment) → Long-term outcomes (results or change/improvement in issue or effectiveness)
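
As an illustration only (not from the slides), a logic model can also be captured as a simple data structure so that each evaluation question is tied to a level of the model, echoing "align evaluation questions to project logic model" on slide 16. All entries below are hypothetical placeholders.

```python
# A minimal, hypothetical sketch of a program logic model held as data so that
# evaluation questions can be aligned to a level of the model.
logic_model = {
    "resources":  ["Noyce funding", "STEM faculty time"],
    "activities": ["scholarships", "mentoring", "summer institute"],
    "outputs":    ["scholars recruited", "credentials completed"],
    "short_term": ["gains in content knowledge", "teaching self-efficacy"],
    "mid_term":   ["placement in high-need schools", "changed pedagogy"],
    "long_term":  ["teacher retention", "gains in student achievement"],
}

# Each evaluation question points at the level of the model it is meant to test.
questions = [
    ("How many scholars completed credentials this year?", "outputs"),
    ("Are scholars staying in high-need schools?", "mid_term"),
]
for question, level in questions:
    assert level in logic_model, f"unknown logic-model level: {level}"
    print(f"{level:>10}: {question}")
```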

  30. EX 2: Data Wise: 3 stages, 8 steps (Boudett et al., 2005) • Stage I: PREPARE • 1 Organize for Collaborative Work • 2 Build Assessment Literacy • Stage II: INQUIRE • 3 Create a Data Overview • 4 Dig into Data • 5 Examine Instruction • Stage III: ACT • 6 Develop an Action Plan • 7 Plan to Assess Progress • 8 Act and Assess

  31. EX 3: Some Evaluation Methods

  32. Impacts of Noyce Projects • Number of K-12 science and math teachers trained • Number of students directly impacted • Number of partner schools involved • Gains in teacher content knowledge • Gains in student achievement • Improved teaching strategies • Increased student achievement • Increased funding for science supplies & equipment for the region’s schools • IHE faculty visiting schools on a regular basis • Creating innovative course processes/materials

  33. Design Challenges • Based on faulty logic • Selected strategy or activities cannot make the intended changes • Failure to connect with the target population(s) • Do not reach them • Do not resonate with them • Not understood by them • Failure to be well implemented • Settings inappropriate • Incompatibility between program and delivery setting • Unrealistic (untested) expectations

  34. Challenges: Assessment issues • Measuring problem-solving skill • Statistical Significance • Adequate sample size • Cost of some assessment methods – how much should projects spend? • Details - effective controls in matched comparisons • Longitudinal effects may be important but realized “down the road” • Tests are not always the best measure of student achievement • Hard to “analyze” qualitative impact data • Standard test culture is focused on “factual knowledge”
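
For the "adequate sample size" and "statistical significance" concerns above, here is a minimal power-calculation sketch (added to this transcript as an illustration). The effect size, alpha, and power values are conventional placeholder assumptions, not figures from any Noyce project.

```python
# A minimal sketch of an "adequate sample size" check using statsmodels.
# Effect size, alpha, and power are placeholder assumptions, not project values.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5,        # assumed medium effect (Cohen's d)
    alpha=0.05,             # significance level
    power=0.80,             # desired statistical power
    alternative="two-sided",
)
print(f"Teachers needed per comparison group: {n_per_group:.0f}")  # roughly 64
```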

  35. Tips for Noyce PIs • To get the most out of program evaluation, you need to figure out what questions you want answered. • The goals of each component of the evaluation process need to be clear in your mind; • often these can be negotiated with your evaluator, but certainly they should be laid out clearly and discussed. • PI-Evaluator team – review the evaluation plan annually • Use the information you get! • Even if it seems that you have wonderful rapport with participants, they might look at things differently when a third party asks the questions, and you'll learn more.

  36. References • Boudett, K., City, E., and Murnane, R. J. (Eds.). (2005). Data Wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press. • Gadja, R. Community of Practice Collaboration Rubric: www.asdk12.org/depts/cei/about/communities/CollaborationRubric.pdf • Kirkpatrick, D. L., and Kirkpatrick, J. D. (2006). Evaluating training programs (3rd ed.). San Francisco, CA: Berrett-Koehler Publishers. • Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.), p. 10. Thousand Oaks, CA: Sage Publications. • Patton, M. Q. (2003). Qualitative Evaluation Checklist: www.wmich.edu/evalctr/checklists • Scriven, M. (1967). The methodology of evaluation. In R. E. Stake (Ed.), Curriculum evaluation. American Educational Research Association Monograph Series on Evaluation, No. 1, pp. 39-83. Chicago: Rand McNally. • Wholey, J. S., Hatry, H. P., and Newcomer, K. E. (2010). Handbook of practical program evaluation (3rd ed.), pp. 5-60. San Francisco: Jossey-Bass.

  37. References • W. K. Kellogg Foundation Evaluation Toolkit: http://ww2.wkkf.org/default.aspx?tabid=75&CID=281&NID=61&LanguageID=0 • W. K. Kellogg Foundation Evaluation Handbook (1998): http://www.publichealth.arizona.edu/chwtoolkit/PDFs/Logicmod/chapter1.pdf • The Centers for Disease Control and Prevention provides evaluation resources in a variety of topical areas: http://www.cdc.gov/eval/resources.htm • Program Development and Evaluation (University of Wisconsin-Extension): http://www.uwex.edu/ces/pdande/evaluation/ • Worthen, B. R., Sanders, J. R., and Fitzpatrick, J. L. (1997). Program evaluation: Alternative approaches and practical guidelines (2nd ed.), p. 7. New York: Longman Publishers.
