
Project Evaluation Workshop


Presentation Transcript


1. Massachusetts Board of Higher Education
   Improving Teacher Quality Higher Education Partnership Program
   Project Evaluation Workshop
   Hoagland-Pincus Conference Center
   March 3, 2005

2. Workshop Objectives
Through an interactive process:
• Develop shared understanding of minimal expectations/standards for TQ project evaluations and reporting
• Work towards a common conceptual model of professional development as the basis for a common approach to program evaluation
• Identify next steps for technical assistance with evaluation plans
• Support networking among projects to facilitate sharing of evaluation approaches, measures, and tools

3. Challenges
• Diversity among projects
  • Content areas
  • Target audiences
  • PD delivery mechanisms
  • Duration
  • Current stage of projects (new vs. ongoing)
  • etc., etc.
• Resource limitations

4. Agenda
• Introductions
• BHE’s Evaluation and Reporting Expectations
• State-level Data Collection
• Logic Model of Professional Development
• Planning for Project Evaluation
• Next Steps

5. Who We Are: Background
UMass Donahue Institute:
• Public service and outreach arm of the UMass President’s Office
• Broad range of services to federal, state, local public and non-profit organizations; services include applied social science research and program evaluation
• Experience evaluating large statewide education reform initiatives for the Massachusetts Department of Education and Board of Higher Education
• Experience developing and implementing evaluation plans for professional development and other educational programs and interventions at the local, regional, and state levels

6. Who We Are: TQ Evaluation Team
• Eric Heller, UMass Donahue Institute Director of Research and Evaluation
  eheller@donahue.umassp.edu  413-587-2402
• Christine Lewis, Research Manager, Lead Project Manager for TQ Evaluation
  clewis@donahue.umassp.edu  413-587-2409
• Jean Supel, Research Manager, Co-Project Manager for TQ Evaluation
  jsupel@donahue.umassp.edu  508-856-1210

7. Who We Are: Functions of TQ Evaluation Team
• Coordinate state-level collection of standardized project data and reports on behalf of the Board of Higher Education
• Provide technical assistance to projects in support of quality project evaluation efforts
• Develop a state-level project report through aggregation of project data and meta-analysis of project reports

8. Who You Are: Introductions and Project Overview
• Subject Matter (e.g., Math, Language Arts, Math/Science, etc.)
• Project Length
• New or Existing Project
• Target Population
  • Single or multiple cohort(s)
  • Teachers and/or Paraprofessionals; other
  • # of participants anticipated
• School district(s) Involved
• Grade level(s)
• Type(s) of Professional Development Activities (e.g., Summer Institute, graduate-level course, after-school group, classroom support, etc.)
• Brief Description of Activities

9. Evaluation & Reporting Expectations - 1
Develop and implement a project evaluation that addresses:
• Formative Evaluation Objectives
  • Provide timely feedback on project activities
  • Identify strengths and weaknesses
  • Identify gaps, unmet participant needs
  • Support continuous improvement of content and delivery
• Summative Evaluation Objectives
  • Based on logic model of professional development
  • Document project implementation model (for replication)
  • Measure participation levels
  • Measure short-term participant outcomes (required)
  • Measure longer-term participant outcomes (to the extent feasible)
  • Measure student outcomes (to the extent feasible)

10. Evaluation & Reporting Expectations - 2
Activity Tracking and Reporting
• Target Audience
• # Participants
• Subject
• Grade Level
• Duration (# hours)
• Timespan
• # Credits
Note: ACTIVITY data for each completed PD activity will be collected annually through BHE online system

11. Evaluation & Reporting Expectations - 3
School Tracking and Reporting
• School Name
• District
• School Type (public/private/charter)
• Poverty Level
• # Participants by
  • Role (teacher/paraprofessional/administrator/other)
  • Grade level taught
Note: SCHOOL data for all participants will be collected annually through BHE online system

12. Evaluation & Reporting Expectations - 4
Participant Tracking and Reporting
• Basic standardized demographic and educational descriptive data to be collected from all participants in all activities (see enclosed sample participant survey)
• Form may be used as is, or incorporated into a customized local survey
• Electronic versions of survey (PDF and Word) to be distributed via email following the workshop
• Optional: Use of a Participant Code to assist in participant tracking and linking to other local evaluation data
Note: PARTICIPANT data for every participant will be collected annually through BHE online system (an illustrative sketch of the tracked record types follows this slide)
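Slides 10-12 describe three kinds of records collected annually through the BHE online system: one per completed PD activity, one per participating school, and one per participant. Purely as an illustrative aid for planning local tracking, the sketch below models them as simple Python data classes; the field names, types, and example values are assumptions made for this sketch, not the actual schema of the BHE system or of the sample participant survey.

    # Illustrative sketch only: the annual ACTIVITY / SCHOOL / PARTICIPANT data
    # described in slides 10-12, modeled as simple records. Field names and types
    # are assumptions for illustration, not the BHE online system's actual schema.
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class ActivityRecord:
        target_audience: str            # e.g., "Grade 5-8 mathematics teachers"
        num_participants: int
        subject: str
        grade_level: str
        duration_hours: float           # total contact hours
        timespan: str                   # e.g., "July 2004 - June 2005"
        num_credits: Optional[float] = None   # graduate credits, if any

    @dataclass
    class SchoolRecord:
        school_name: str
        district: str
        school_type: str                # "public", "private", or "charter"
        poverty_level: str
        participants_by_role: dict[str, int] = field(default_factory=dict)   # teacher/paraprofessional/administrator/other
        participants_by_grade: dict[str, int] = field(default_factory=dict)  # grade level taught

    @dataclass
    class ParticipantRecord:
        role: str
        school_name: str
        grade_level_taught: str
        participant_code: Optional[str] = None  # optional code linking to other local evaluation data
        # plus the standardized demographic/educational items from the sample participant survey

Whatever format a project actually uses, the point is simply that each completed PD activity, each participating school, and each participant maps to one such record for the annual submission.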

13. Evaluation & Reporting Expectations - 5
TQ Program Reporting Requirements
• Annual report (multi-year projects) describing status, progress, and milestones of project activities; includes evaluation progress report with interim findings
• Final report (all projects) summarizing project activities/milestones and final evaluation report
• Evaluation reports (interim and final) to include project objectives, evaluation questions, methodology, and results
• General template for organizing evaluation report will be provided to facilitate state-level meta-analysis
• Reports to be submitted electronically

14. Standard Logic Model of Professional Development
PD and Related Support Activities → Growth in Participant Skills/Knowledge → Improved Instruction → Improved Student Outcomes

15. Planning for Project Evaluation - 1
Logic Model Step 1 – PD and Related Support Activities
Sample Evaluation Questions:
• To what extent have project activities been implemented as planned? What implementation challenges were encountered and how were they addressed?
• Who participated and to what extent? In workshops? In follow-up support activities?
• To what extent were participant expectations/needs met?
• How did participants perceive the quality of the activities?
Sample Data Sources:
• Activity and participant tracking system (see the illustrative sketch after this slide)
• Staff interviews
• Participant feedback – surveys, interviews, focus groups
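The "who participated and to what extent" questions above are typically answered straight from the activity and participant tracking system. The sketch below is a hypothetical illustration of such a tally; the attendance-log format, participant codes, and activity names are assumptions for this sketch, not a required tracking format.

    # Hypothetical sketch: tallying PD contact hours per participant from a simple
    # attendance log of (participant_code, activity, hours) entries. The log format
    # is an assumption for illustration, not a prescribed tracking format.
    from collections import defaultdict

    attendance_log = [
        ("T001", "Summer Institute", 30.0),
        ("T001", "After-school study group", 6.0),
        ("T002", "Summer Institute", 30.0),
    ]

    hours_by_participant: dict[str, float] = defaultdict(float)
    for participant_code, activity, hours in attendance_log:
        hours_by_participant[participant_code] += hours

    for code, total in sorted(hours_by_participant.items()):
        print(f"{code}: {total:.1f} contact hours")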

16. Planning for Project Evaluation - 2
Logic Model Step 2 – Growth in Participant Knowledge/Skills
Sample Evaluation Questions:
• To what extent do participants achieve the stated learning objectives of PD activities?
• To what extent do participants retain or deepen their understanding of concepts learned following PD? Do follow-up/support activities lead to enhanced understanding of concepts?
• Do participants experience other benefits/outcomes (e.g., self-confidence)?
Sample Data Sources:
• Pre-test/post-test of PD content (short-term outcomes) – see the sketch after this slide
• Follow-up administration of test (longer-term outcomes)
• Other indicators of mastery – e.g., course assessments
• Pre/post survey of attitudes, beliefs
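One straightforward way to summarize the pre-test/post-test data listed above is a paired gain score per participant. The sketch below is illustrative only; the scores, score scale, and participant codes are assumptions, not data from any TQ project.

    # Illustrative sketch: paired pre/post gain scores on a PD content test.
    # Scores, scale, and participant codes are assumptions for illustration only.
    pre_scores  = {"T001": 12, "T002": 15, "T003": 9}    # pre-test, e.g., out of 20 items
    post_scores = {"T001": 17, "T002": 18, "T003": 14}   # post-test on the same instrument

    # Gain for each participant with both a pre- and a post-test score
    gains = {code: post_scores[code] - pre_scores[code]
             for code in pre_scores if code in post_scores}

    mean_gain = sum(gains.values()) / len(gains)
    print(f"Mean pre/post gain: {mean_gain:.1f} points across {len(gains)} participants")

The same calculation can be repeated with a delayed follow-up administration of the test to look at longer-term retention.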

17. Planning for Project Evaluation - 3
Logic Model Step 3 – Improved Instruction
Sample Evaluation Questions:
• To what extent do participants’ instructional practices change as a result of participation?
• To what extent do participants incorporate material gained through PD into their curriculum?
• What challenges do participants encounter as they attempt to implement new skills, content, and approaches? How are challenges addressed?
Sample Data Sources:
• Interviews, focus groups
• Participant survey of self-reported changes
• Classroom observation

18. Planning for Project Evaluation - 4
Logic Model Step 4 – Improved Student Outcomes
Sample Evaluation Questions:
• To what extent do students of participants achieve improved outcomes related to observed changes in instructional practices? Affective (attitudinal)? Cognitive?
Sample Data Sources:
• Standardized assessments
• Classroom assessments
• Teacher perceptions (survey)

19. Moving Forward
Our role is to work with you and your evaluator (internal or external) to implement an evaluation plan that:
• Includes both formative and summative evaluation questions
• Tracks activity, school, and participant data required for annual reporting
• Has data collection and analysis organized around a basic logic model of professional development
• Includes appropriate outcomes that measure targeted phases of the logic model

20. Next Steps
• Projects complete evaluation plan summaries and submit them to the TQ Evaluation Team (today, or via email within one week)
• TQ Evaluation Team reviews evaluation plans
• We will contact you with feedback, questions, or possible suggestions – by phone, email, or on-site work session
• Send us updated evaluation plan summaries as you further develop and/or revise your plan (email attachment)
• Contact us with any questions or concerns
