
Data-Based Decision Making: Using Data to Improve Implementation Fidelity & Outcomes


Presentation Transcript


  1. Data-Based Decision Making: Using Data to Improve Implementation Fidelity & Outcomes

  2. Using Data to Tell Your Story • Dr. John W. Hodge & a brief statement on using data: http://www.youtube.com/watch?v=N6w-EM-R13I

  3. Major portions of the following material were developed by: George Sugai and Rob Horner OSEP Funded Technical Assistance Center www.pbis.org In conjunction with The Iowa Department of Education

  4. Designing School-Wide Systems for Student Success: the same three-tier triangle applies to both academic and behavioral systems • Intensive, Individual Interventions (1-5% of students): individual students; assessment-based; high intensity (academic) / intense, durable procedures (behavioral) • Targeted Group Interventions (5-10%): some students (at-risk); high efficiency; rapid response • Universal Interventions (80-90%): all students, all settings; preventive, proactive. In a school of 500 students, this model anticipates roughly 400-450 students succeeding with universal supports alone, 25-50 needing targeted group support, and 5-25 needing intensive individual support.

  5. State of Iowa: 2011-12

  6. 7 Basic Evaluation Questions • What does “it” look like now? • Are we satisfied with how “it” looks? • What would we like “it” to look like? • What would we need to do to make “it” look like that? • How would we know if we’ve been successful with “it”? • What can we do to keep “it” like that? • What can we do to make “it” more efficient & durable?

  7. School Data – A Comedy • Dirty Data - a light-hearted view for the purpose of using data to guide development of a school’s PBIS system: http://www.youtube.com/watch?v=XBv95uMFudE

  8. Action Planning with Your PBIS Data • Review TIC charts, SAS results, and SET scores from pbisassessment.org • Identify items not fully in place at your school • Identify the items that will make the biggest impact • Define a task analysis of activities to achieve them • Allocate tasks to people, timelines, and reporting events • Use ODR data to strengthen supports, acknowledgment, and the consequence system, and to determine what additional teaching is needed
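
As a rough illustration of this filtering-and-prioritizing step, here is a minimal Python sketch; the item names, status values, and impact ratings are invented for the example, not the actual pbisassessment.org export format:

```python
# Hypothetical assessment items with a status and an estimated impact
# rating (1-5); field names are illustrative only.
items = [
    {"item": "Expectations posted in all settings", "status": "partial", "impact": 4},
    {"item": "Acknowledgment system in use", "status": "not in place", "impact": 5},
    {"item": "Team meets monthly", "status": "in place", "impact": 3},
]

# Keep only items not fully in place, highest estimated impact first.
action_plan = sorted(
    (i for i in items if i["status"] != "in place"),
    key=lambda i: i["impact"],
    reverse=True,
)

# Each remaining task still needs an owner, timeline, and reporting event.
for task in action_plan:
    print(f"{task['item']} ({task['status']}, impact {task['impact']})")
```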

  9. State of Iowa: Annual Schedule for PBIS Evaluation Tools

  10. Universal Fidelity Measurements

  11. Universal Fidelity Measurements

  12. Installation = 1st Year of Training

  13. Universal: Initial Implementation = 2nd Year of Training

  14. Universal: Full Implementation = 3rd Year of Training or beyond

  15. PBIS Implementation Evaluation Tools: Fidelity Implementation Assessments per Tier (another look)

  16. TIC: Team Implementation Checklist • Completed 3 times per school year (as of 2012-13) • Rate each item as in place, partially in place, or not in place • Identify the items that will make the biggest impact • Define a task analysis of activities to achieve them • Allocate tasks to people, timelines, and reporting events • Every item rated as partially in place or not in place needs a corresponding item on the Action Plan • Goal: 80% Full Implementation
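
A minimal sketch of the scoring logic above, assuming a 0/1/2 rating convention (2 = in place, 1 = partial, 0 = not in place) and made-up item names:

```python
# Hypothetical TIC ratings: 2 = in place, 1 = partial, 0 = not in place.
ratings = {
    "Team established with administrator": 2,
    "Behavioral expectations defined": 2,
    "Expectations taught to all students": 1,
    "Reward system operating": 0,
    "Data system for ODRs in use": 2,
}

# Percent of items fully in place, checked against the 80% goal.
in_place = sum(1 for r in ratings.values() if r == 2)
pct = 100 * in_place / len(ratings)
print(f"Fully in place: {pct:.0f}% (goal: 80%)")

# Every partially or not-in-place item needs an Action Plan entry.
for item, r in ratings.items():
    if r < 2:
        print(f"Action Plan entry needed: {item}")
```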

  17. SAS: Self-Assessment Survey • Self-assessment tool to determine to what extent PBIS practices and systems are in place within a school • Four Systems • School-Wide • Non-classroom settings • Classroom • Individual Student • Goal • 80% In Place School-Wide

  18. SAS: What Information Does It Give Us? • For each of the 4 systems • Current Status • % in place, % partially in place, % not in place • Priority for Improvement • % of High, % of Medium, % of Low • Two main questions • What behavior supports are in place? • What behavior support features are most in need of improvement?
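
As a rough sketch of how those percentages are tallied, assuming hypothetical per-item responses (the real report comes from pbisassessment.org; the field names here are invented):

```python
from collections import Counter

# Hypothetical responses for one system (e.g., School-Wide): each item
# gets a current status and a priority-for-improvement rating.
responses = [
    {"status": "in place", "priority": "low"},
    {"status": "partial", "priority": "high"},
    {"status": "not in place", "priority": "high"},
    {"status": "in place", "priority": "medium"},
    {"status": "partial", "priority": "medium"},
]

# Percent breakdown of current status and of improvement priority.
total = len(responses)
for field in ("status", "priority"):
    counts = Counter(r[field] for r in responses)
    summary = ", ".join(f"{k}: {100 * n / total:.0f}%" for k, n in counts.items())
    print(f"{field}: {summary}")
```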

  19. SAS • Completed… • by as many staff members (certified & non-certified) as possible • one time per year during the Feb-May reporting period • on-line at www.pbisassessment.org • Purpose: • To determine if PBIS practices and systems are in place • To determine which behavioral systems need to be addressed (action planning)

  20. SET: School-Wide Evaluation Tool • Designed to assess and evaluate the critical features of effective school-wide behavioral support • Research-validated instrument for SW-PBIS • outcomes tested for both reliability and validity • Summary score and seven core feature scores (Horner et al., 2004)

  21. SET: 7 Core Features • Behavioral expectations defined • Behavioral expectations taught • Behavioral expectations rewarded • Systematic response to rule violations • Information gathered to monitor student behavior • Local management support for SW-PBIS procedures • District level support of SW-PBIS procedures

  22. Completing the SET • Completed by an outside person visiting the school: • During Feb-May reporting period • Approximately 3-4 hours • Interview teaching staff • Interview students • Observe school environments • Review school documents • Interview school administrators • Goal • 80% for expectations taught • 80% overall score
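
A small sketch of checking the two 80% goals, assuming hypothetical feature scores and treating the overall score as the mean of the seven feature scores:

```python
# Hypothetical SET feature scores (percent of possible points each).
features = {
    "expectations defined": 100,
    "expectations taught": 85,
    "reward system": 75,
    "violation system": 80,
    "monitoring & evaluation": 70,
    "management": 90,
    "district support": 100,
}

# Overall score here is a simple mean of the seven feature scores.
overall = sum(features.values()) / len(features)
meets_80_80 = features["expectations taught"] >= 80 and overall >= 80
print(f"Overall: {overall:.1f}%, expectations taught: {features['expectations taught']}%")
print("Meets 80/80 goal" if meets_80_80 else "Below 80/80 goal")
```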

  23. ODR Data: Office Discipline Referral Data • The Universal Team • reviews ODR data (the Big 5 reports in particular) in SWIS during each PBIS Leadership Team meeting (at least monthly) • analyzes the data to make appropriate system decisions, increasing supports and teaching where needed
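
As a rough illustration of the Big 5-style summaries (SWIS produces these reports directly; the records and field names below are invented):

```python
from collections import Counter

# Invented ODR records with the fields behind SWIS's Big 5 reports:
# referrals per day per month, plus counts by behavior, location,
# time of day, and student.
referrals = [
    {"date": "2012-10-03", "behavior": "disruption", "location": "classroom", "time": "10:15", "student": "S1"},
    {"date": "2012-10-03", "behavior": "defiance", "location": "hallway", "time": "12:40", "student": "S2"},
    {"date": "2012-10-17", "behavior": "disruption", "location": "playground", "time": "12:05", "student": "S1"},
    {"date": "2012-11-05", "behavior": "disruption", "location": "classroom", "time": "09:50", "student": "S3"},
]

# Top categories by behavior, location, and student.
for field in ("behavior", "location", "student"):
    print(field, Counter(r[field] for r in referrals).most_common(3))

# Referrals per school day per month (assuming ~20 school days a month).
for month, n in sorted(Counter(r["date"][:7] for r in referrals).items()):
    print(f"{month}: {n / 20:.2f} per school day")
```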

  24. Using Data to Build Solutions: Items to Remember • Prevention: How can we avoid the problem context? • Who, When, Where • Schedule change, curriculum change, etc. • Teaching: How can we define, teach, & monitor what we want? • Teach appropriate behavior • Use problem behavior as negative example • Recognition: How can we build in systematic acknowledgment for desired behavior?

  25. Using Data to Build Solutions: Items to Remember • Extinction: How can we prevent problem behavior from being rewarded? • Consequences: What are efficient, consistent consequences for problem behavior? • How will we collect and use data to evaluate: • 1) implementation fidelity? • 2) impact on student outcomes?

  26. Cross walking fidelity data & ODR data

  27. Let’s Practice: Small Group Activity • What you will NEED: • Data Review Worksheet & 7 Questions Analysis Guide • Location: in your individual packets • Data Packet File • Location: center of each table, shared in small groups • Steps: • Overview of the Data Review Worksheet • Overview of the data included in the packet file • Work in groups of 3 or 4 • Study the data & complete the worksheet

  28. Thank you for your efforts to use data to make decisions!
