
Presentation Transcript


  1. Coaching At The Building Level Terri Metcalf, MiBLSi Technical Assistance Partner Carrie Peter, MiBLSi Technical Assistance Partner Stephanie Dyer, MiBLSi Content Specialist Coaches’ Conference 2012

  2. Acknowledgements The material for this training day was developed with the efforts of… • Terri Metcalf • Stephanie Dyer • Carrie Peters • Melissa Nantais Content was based on the work of… • Amanda VanDerHeyden • Matt Burns • Dave Tilly • Anita Archer • Charles Hughes • Heartland AEA 11, Johnston, Iowa • Douglas D. Dexter • Michael Fullan • Stephen Covey • Fixsen and Blase • Karl Schoemer • Shirley Hord et al.

  3. BRIDGE Uncomfortable with Analyzing Data

  4. Session Purpose and Outcomes This session will assist coaches in moving school leadership teams into a deeper understanding of data trends, drawing conclusions to assist in problem solving and facilitating tough conversations around data. Coaches will: • Examine common data sources to draw conclusions around trends in data & assist teams with problem solving • Gain new ideas and tools for facilitating tough conversations with teams

  5. Agenda • Bridge Building Strategy 1 • Becoming more proficient with data and troubleshooting • Bridge Building Strategy 2 • Understanding change and tailoring support • Bridge Building Strategy 3 • Control the controllables

  6. Responder cards • When you select your response to the question (next slide): • Does the green light flash at the top of your response card? • YES: Perfect. Your response has been captured. • NO: Your card is probably set to the wrong channel.

  7. Responder Cards: PRACTICE QUESTION What did you do this Thanksgiving? A: Travel in Michigan B: Travel in the USA C: Travel outside of the USA D: Catch up on work (what Thanksgiving?)

  8. If the Green Light Did Not Flash • Click on the channel button. • Type: 44 • Click on channel again.

  9. TOOL: TROUBLESHOOTING GUIDE BRIDGE-BUILDING STRATEGY #1: DECISION-MAKING WITH DATA: BECOMING MORE PROFICIENT

  10. Examples of effective RtI use and decision making: Part 1 – Overview. Amanda VanDerHeyden, Ph.D., http://www.rtinetwork.org/essential/assessment/data-based/examples-of-effective-rti-use-and-decision-making-part-1-overview “Successful RtI implementation occurs when the right data are collected, those data are correctly interpreted and acted upon, and solutions are integrated with resource allocation decisions at the system level.”

  11. Question: What group? 1: I most identify with Group 1 2: I most identify with Group 2

  12. MTSS Decision Making • Applies at the System and Student Levels • Process of Asking Questions and Collecting Data to Answer Them • What is the problem? • Why is the problem occurring? • What to do about the problem? • Did that work?

  13. MTSS Decision Making Some research-based criteria: • Screening, progress monitoring, intervention effectiveness Get comfortable with: • Guidelines • Indicators • Flow charts • Checklists

  14. “A major source of potential and likely error is the misinterpretation or misuse of collected data. Even if the data have been correctly collected, multiple decision points represent opportunities for errors that will compromise the adequacy of the final RtI decision.” (VanDerHeyden & Burns, 2010, p. 61)

  15. Building Decision Points Screen: Who is experiencing a problem? Instruction: Core instruction… materials, pacing, active engagement… is it working? Supplemental Intervention: Plan, manage, and evaluate adjustments to instruction Evaluate: Did the adjustments work? Are we doing what we said we would do?

  16. Screen: Who is experiencing a problem? Research criteria: Universal screening tools are: • Brief and efficient to administer • Predict whether students are on track for meeting future performance outcomes (e.g., reading, math) • Valid and reliable • Can be used to identify individual students as well as school-wide need for support

  17. Screen: Who is experiencing a problem? Guideline: Do you have close to 80% or more of your students reaching your screening benchmark goals? If not, focus on Tier 1 or Core Instruction for all. • You cannot intervene your way out of a Tier 1 problem!
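As a minimal sketch (not part of the original training materials), the 80% guideline above could be checked in Python; the scores, benchmark value, and function name here are all illustrative assumptions:

```python
# Sketch of the 80% core-instruction guideline (hypothetical data).
# If fewer than roughly 80% of students meet the screening benchmark,
# the priority is Tier 1 / core instruction, not individual intervention.

def core_instruction_check(scores, benchmark, threshold=0.80):
    """Return the proportion at/above benchmark and a Tier 1 flag."""
    at_benchmark = sum(1 for s in scores if s >= benchmark)
    proportion = at_benchmark / len(scores)
    return proportion, proportion < threshold  # True -> focus on Tier 1

# Example: one grade's winter screening scores (illustrative numbers)
scores = [92, 105, 88, 60, 110, 71, 99, 84, 55, 120]
proportion, tier1_focus = core_instruction_check(scores, benchmark=87)
print(f"{proportion:.0%} at benchmark; Tier 1 focus needed: {tier1_focus}")
```

With only 60% of these hypothetical students at benchmark, the flag signals a core-instruction problem rather than a candidate list for intervention.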

  18. Common error: We neglect to ensure the accuracy of our screening data • Accuracy • Fidelity checks! • Coach tip: • In reading, watch for standardized administration of directions! We are better at giving the measures than giving standardized directions. • In SWIS, watch for inconsistencies in majors versus minors, and in correctly identifying student motivation. • Data-entry checks • Training • Coordination for maximum efficiency (materials, schedule, data management, fidelity checks)

  19. Question:Accuracy Checks How systematically (scheduled, multiple points) do you check for accuracy in your screening data? 1 Always 2 Occasionally 3 When we randomly uncover mistakes 4 Almost never

  20. But wait! I have a student who really needs more support . . . True or False: Teacher referral (for supplemental support) is an empirically supported universal screening measure.

  21. Instruction • Clearly defined skills • Materials, scope and sequence, consistent instructional routines • Time • Is there enough? • Is it being used efficiently? Is there a calendar (sequence)? • Instructional delivery • Practice, guided feedback, concise interactions, pacing matched to student need • Active engagement strategies • Are students actively engaged? • Do they have frequent opportunities to respond?

  22. 3rd grade example: Two buildings, same district

  23. Impact on coaching Data doesn’t always answer all the questions, but it should help you ask better questions. • What questions do you have? • What trends do you see? • What coaching support would you provide these teams in the fall?

  24. Supplemental Intervention Decision points: • Intervention Selection • Time • Group Size • Intervention Management • Progress Monitoring • Exit Criteria • Evaluation

  25. How do you match students to intervention? • Use sorting tools to triage screening data • Teacher validation (performance observation, other assessments) • Look for error patterns in student screening protocols • Intervention pre-tests (e.g. Rewards, Corrective Reading) • Layer on diagnostic assessments, as needed

  26. Coach Tip: Watch for these judgment errors! Kahneman et al. list: • Availability • Anchoring • Insufficient adjustment “Professionals may find themselves offering interventions that have been reinforcing to them because of past success.”

  27. Example 1

  28. “It turns out that professional adults often do not do what they have agreed to do.”

  29. Question: Intervention Management We monitor the fidelity of our interventions . . . 1 Weekly 2 Monthly 3 When we add a new one or randomly when there is a problem 4 Oops

  30. How do you monitor the fidelity of interventions? Heartland AEA 11, Johnston Iowa http://www.aea11.k12.ia.us/educators/idm/checkists.html

  31. Other checklist example

  32. How long do you intervene? • When students are not responding,first look at the integrity of the intervention implementation • Can’t do or a won’t do? • Frequent opportunities to respond? • Attendance, time, group size, pacing

  33. Guideline • Consensus seems to be 8-12 weeks • Weekly or biweekly progress monitoring, graphed! • Discussion in literature over best method • Benchmark goals • Dual discrepancy • Slope • Median split • Mastery tests

  34. Common practice: Aim line

  35. Progress Monitoring: Decision Rules • If 2 or 3 of the last data points fall above the aim line, consider raising the goal or fading the intervention. • If 2 or 3 of the last data points are at or near the aim line, continue the intervention. • If 2 or 3 of the last data points are below the aim line, modify the intervention.
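The decision rules above can be sketched in code. This is a hypothetical Python illustration, not a tool from the training: it assumes a three-point window and compares each recent score to its aim-line target; all names and numbers are invented for the example.

```python
# Minimal sketch of the aim-line decision rules (hypothetical data).
# Assumes the team looks at the last 3 data points and that the
# aim-line target for each week is already known.

def aimline_decision(recent_points, aimline_values, n=3):
    """Apply a simple 2-or-3-of-the-last-3 rule to (score, target) pairs."""
    last = list(zip(recent_points, aimline_values))[-n:]
    above = sum(score > aim for score, aim in last)
    below = sum(score < aim for score, aim in last)
    if above >= n - 1:   # 2 or 3 of the last 3 above the aim line
        return "raise goal or fade intervention"
    if below >= n - 1:   # 2 or 3 of the last 3 below the aim line
        return "modify intervention"
    return "continue intervention"

# Example: weekly scores vs. aim-line targets (illustrative numbers)
scores = [41, 44, 43, 42]
targets = [40, 42, 44, 46]
print(aimline_decision(scores, targets))  # -> modify intervention
```

Here two of the last three points fall below their targets, so the rule recommends modifying the intervention.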

  36. Progress Monitoring: Decision Rules If you are monitoring progress monthly, does this mean that you wait 2-3 months before making a change? NO! If a student’s progress is being monitored monthly and a data point falls below the aim line, switch to weekly progress monitoring so that you have enough data to make a sound instructional decision in a reasonable amount of time.

  37. Example: Using rate of increase (4th grade April Progress Monitoring Meeting)

  38. How are you doing? Intervention failure should be minimal . . . If you have large numbers of students not making significant progress in your Tier 2 interventions, check intervention integrity!

  39. What if the intervention isn’t working? • Have evidence of fidelity • Have evidence of adequate time • Have students in attendance When was the last time you stopped running an intervention? How did your team make that decision?

  40. Example 2

  41. (VanDerHeyden & Tilly, 2010, p. 41) “Sites can select perfect screening measures, identify ideal interventions, administer assessments, and conduct interventions without error, but still fail to correctly carry out RtI if correct decisions are not made along the way about who needs intervention, what type of intervention to provide, and when intervention has been successful or not.”

  42. ACTIVITY: Troubleshooting Guide (additional tool for building decisions)

  43. BRIDGE-BUILDING STRATEGY #2: UNDERSTANDING CHANGE AND TAILORING SUPPORT

  44. Question: What group? 1: I most identify with Group 1 2: I most identify with Group 2

  45. Review: Stages of Implementation Should we do it? Work to do it right! Work to do it better!

  46. Review: Stages of Concern 0 Awareness, 1 Informational, 2 Personal, 3 Management, 4 Consequence, 5 Collaboration, 6 Refocusing

  47. Review: Implementation Dip

  48. That’s great, but what does this mean for me as a coach? • Understand that emotions during change are normal and transient; this helps in supporting yourself and others during change. • Use the stages to help people maintain perspective, minimize personalization of the process, and reduce the intensity of negative emotions. • Facilitate the planning of positive actions to accelerate progress through the stages. • Understanding the stages allows a coach to tailor responses and supports.
