
Planning with Data




  1. Planning with Data May 25, 2011 - Louisville, KY Jenell Holstead, Megan Weikel, Danielle Neukam

  2. Workshop Objectives: • Consider aspects of change and ways to stimulate a successful change initiative. • Read and interpret program quality data. • Create an effective improvement plan for your program, based on data. • Develop strategies for taking the plan back to your organization.

  3. Youth Program Quality Intervention (YPQI) ASSESS → PLAN → IMPROVE: Conduct SA (Local Self-Assessment Team) → Create improvement plan (Local Self-Assessment Team) → Carry out improvement plan (Program Director, site managers, program staff) → Conduct EA (External Assessor) → repeat. Center Profile (CEEP)

  4. Workshop Agenda: 9:15-10:00 The change process and leading change 10:00-10:15 BREAK 10:15-11:30 Review aggregate and site-level data 11:30-1:00 LUNCH 1:00-2:00 Developing Program Improvement Plans 2:00-2:30 Implementation of Plans / Reporting 2:30 Closing/Evaluations

  5. Activity: Leading Change

  6. Stages of Change: Where are you? Precontemplation → Contemplation → Preparation → Action → Maintenance → Termination

  7. Where does change happen? What does this have to do with Program Quality? Policy Context (Kentucky Department of Education) → Organizational Setting (District Coordinators) → Instructional Setting (Program Managers/Staff) → Youth

  8. Break

  9. Reading the Data 1. Self Assessments 2. External Assessments 3. Site Visits (CEEP) 4. Center Profiles

  10. 1. Self Assessment Report • All sites that entered YPQA results into Scores Reporter will have self-assessment reports. If you completed both a pink YPQA and a purple School-Age PQA, you will have both sets of results.

  11. YPQA: The Pyramid of Program Quality • Engagement: Plan, Make choices, Reflect • Interaction: Lead and mentor, Be in small groups, Partner with adults, Experience belonging • Supportive Environment: Encouragement, Reframing conflict, Skill building, Session flow, Active engagement, Welcoming atmosphere • Safe Environment: Psychological and emotional safety, Program space and furniture, Emergency procedures, Healthy food and drinks, Physically safe environment • Cross-cutting: Youth Voice and Governance, Professional Learning Community

  12. Self Assessment Data • Each site has self-assessment results from data entered into Scores Reporter. Keep in mind… • Observation scores represent a snapshot – this has limitations and value. • These are aggregate scores from multiple observations. • The overall story is more important than the individual numbers. • What you do with the data matters most.

  13. Self Assessment: Aggregate Data

  14. 2. External Assessment Report • Cycle Seven sites that were part of the YPQA process in 2010 will have External Assessment results (which appear alongside self-assessment results).

  15. External Assessment Data • External assessments were conducted for 17 sites during the spring 2011 site visits. Keep in mind… • External assessment scores are typically lower than self-assessment scores. • Observation scores represent a snapshot – this has limitations and value. • The overall story is more important than the individual numbers. • What you do with the data matters most.

  16. 3. Site Visit Reports (CEEP) 37 visits to KY 21st CCLC programs between February 16 and April 13. Site Visit Activities Included: • Site coordinator interview • School day teacher interview • Standardized observation protocol for academic and enrichment activities Rating System: • 12 items, each rated on a scale of 1 to 4 • 1 = Must Address and Improve • 2 = Some Progress Made • 3 = Satisfactory • 4 = Excellent • 48 possible points

  17. Site Visits (CEEP) Purpose of 2011 Site Visits. High School Programs: • Activities promote academic growth, remediation, and development • Links to the regular school day • Participants contribute ideas, make choices, and have positive experiences • Establish partnerships and employ successful recruitment strategies. Elementary/Middle School Programs: • Activities geared toward rigorous academic enrichment • Links to the regular school day • Individual support and opportunities for positive interactions for youth • Relationships with schools, parents, and other community constituents

  18. Elementary and Middle School Site Visit Results • Focus Area 1: Activities Geared Toward Rigorous Academic Achievement • Focus Area 2: Links to the School Day

  19. Elementary and Middle School Site Visit Results • Focus Area 3: Opportunities for Individual Support and Positive Interactions • Focus Area 4: Relationships with Schools, Parents, and Community Organizations

  20. High School Site Visit Results Types of Activities Offered

  21. High School Site Visit Results • Focus Area 1: Activities Promote Academic Growth, Remediation, and Development • Focus Area 2: Links to the School Day

  22. High School Site Visit Results • Focus Area 3: Participants Contribute Ideas, Make Choices, and Have Positive Experiences • Focus Area 4: Establishes Partnerships and Maintains Successful Recruitment Strategies

  23. 4. Center Profiles • Each site that offered programming during the 2009-2010 school year has a Center Profile • Data elements included in analyses: • Student participation • Outcomes for regular attendees • Grades • Teacher Surveys • Outcomes for regular attendees who struggle academically • Grades

  24. 2009-2010 Center Profile Data: Student Participation

  25. 2009-2010 Center Profile Data: Academic Outcomes

  26. 2009-2010 Center Profile Data: Academic Outcomes for Struggling Students

  27. We have plenty of data… what do we do with them? Step 1: Identify Program Successes. Step 2: Identify Areas for Improvement. Consider: • Program activities/operations? • Safe environment? • Links to the school day? • Attendance patterns? • Classroom behavior? • Program behavior? • Grade changes? Reading or math? • Relationships among youth? • Relationships between youth and adults? • Engagement with the program?

  28. Identifying Successes and Challenges Using the worksheet provided in your folder, determine your program’s strengths and weaknesses based on the data provided.

  29. Reviewing Program Successes and Challenges • Were you able to identify program successes? • Were you able to identify program challenges? • Were data consistent across data elements?

  30. Putting it all together: Create the story of your data… • What is the message or story of your data? What do the numbers tell you? • What’s missing from the data? What important things about program quality do not come through? • Where are the gaps between what you want to provide and what the data says you’re providing?

  31. Anybody hungry?? Break for Lunch: 11:30 am – 1:00 pm

  32. Completing the Program Improvement Plan Two copies of the Program Improvement Plan template have been included in your folder. Additional copies are available if you need them.

  33. Completing the Improvement Plan • Step 1: Enter District Name and Program Site • Step 2: Using the data from your folder and worksheet activity, develop one goal • Justify this goal by listing the related data element(s) that identified this area as needing to be strengthened.

  34. Completing the Improvement Plan: GOALS • When developing goals, remember: • Goals should be broad statements…but not too broad! • Examples of good goals: • Purposefully connect the afterschool program to the school day. • Provide opportunities for youth to reflect on their experiences in the afterschool program. • Provide activities geared toward improving reading skills. • Examples of not-so-good goals: • Improve academic performance • Increase student engagement

  35. Completing the Improvement Plan: OBJECTIVES • Step 3: Develop objectives for the first goal

  36. Program Improvement Plan objectives should each be SMART: • Specific • Measurable • Attainable • Relevant • Timelined

  37. SMART Objectives…are SPECIFIC Not very specific… • Youth will have more opportunities to enhance skills in reading and math. Getting better… • Youth will engage in literacy-based activities each week. Even better… • Literacy-based activities will be provided to students on a minimum of three days per week. Ideally, you have this level of specificity… • By November 2011, literacy-based activities will be provided to students on a minimum of three days per week.

  38. SMART Objectives…are Measurable Which of the following objectives is most measurable? • Staff will use student achievement data to plan topics for tutoring sessions. • By May 2012, staff will hold at least three quarterly review sessions with teachers to discuss student achievement data and plan tutoring topics. • At least half of tutoring sessions provided will be based on individual student needs. • Staff members will increase their awareness of students’ individual academic needs.

  39. SMART Objectives…are Attainable Which of the following objectives is most likely to be attained? • All program activities will be planned and implemented with student input. • By May 2012, all program activities will involve opportunities for students to share their work with other participants. • All students will have opportunities for reflection during program activities on a daily basis. • By May 2012, students will have opportunities for reflection during at least two program activities per week.

  40. SMART Objectives…are Relevant Which of the following objectives are most relevant to the following goal? GOAL: Increase the proportion of program activities provided that are directly aligned with academic standards. OBJECTIVES: • By May 2012, all program activities will involve opportunities for students to share their work with other participants. • By November 2011, literacy-based activities will be provided to students on a minimum of three days per week. • By May 2012, students will participate in at least four activities per week (outside of homework help) that are intentionally linked to Kentucky state standards.

  41. SMART Objectives…are Timelined Which timeline seems most useful? • By spring 2012, all program activities will involve opportunities for students to share their work with other participants. • Next year, staff will hold quarterly review sessions with teachers to discuss student achievement data and plan tutoring topics. • Beginning in October 2011, staff will hold quarterly review sessions with teachers to discuss student achievement data and plan tutoring topics. • From December 2011 through April 2012, students will have weekly opportunities to reflect on program activities.

  42. Completing the Improvement Plan: OBJECTIVES • Step 4: Double-check each objective to ensure it meets the criteria for SMART objectives.

  43. Completing the Improvement Plan: MEASURING PROGRESS • Step 5: Indicate when progress will be measured and what will be done to measure progress

  44. Completing the Improvement Plan: ACTIVITIES • Step 6: Copy each objective to the chart on page 2 • Step 7: List 3 activities that will be conducted in order to meet the objectives • Activities should be specific and include timelines

  45. Completing the Improvement Plan • Step 8: Rinse and Repeat! • Follow the same steps for Goal 2 on pages 3 and 4 • Step 9: Type the Program Improvement Plan using the electronic copy we’ll email you. Email it to Megan Weikel at mmweikel@indiana.edu by June 10, 2011. • Step 10: Implement the plan!!

  46. Complete Program Improvement Plan: Using the program improvement plan templates provided, brainstorm, write two goals with related objectives, and determine needed activities.

  47. Taking it back • What’s your plan for taking back your plan? • How will you get to the POS (point of service)? • How does your plan address your team needs?

  48. Accountability Behaviors • Resistance - You don’t understand me or the kids I work with. You can’t make me do it! • Survival - This is just the latest fad and it too will pass. If I keep to myself, no one will call me out. • Compliance - We will do the bare minimum to get through licensing, then it’s business as usual. • Engagement - I see the value of this change, and I’m committed to taking advantage of this opportunity.

  49. Next Steps… • 6/10: All Self Assessment Teams complete a Program Improvement Plan and send it to CEEP • 6/15: CEEP will send out the 2011 YPQA Self Assessment Evaluation Survey and Program Improvement Plan form to all Self Assessment Teams • 6/30: 2011 YPQA Self Assessment Evaluation Survey must be completed by all Self Assessment Teams • December 2011: CEEP will send out Progress Reports for each program site to complete – based on implementation of the Program Improvement Plan.

  50. Evaluations
