Data-Based Decision Making: Using Data to Improve Implementation Fidelity & Outcomes
Using Data to Tell Your Story • Dr. John W. Hodge offers a brief statement on using data: http://www.youtube.com/watch?v=N6w-EM-R13I
Major portions of the following material were developed by George Sugai and Rob Horner, OSEP-funded Technical Assistance Center (www.pbis.org), in conjunction with the Iowa Department of Education.
Designing School-Wide Systems for Student Success (three-tier model for both Academic and Behavioral Systems) • Intensive, Individual Interventions (1-5% of students): individual students; assessment-based; high-intensity, durable procedures • Targeted Group Interventions (5-10% of students): some students (at-risk); high efficiency; rapid response • Universal Interventions (80-90% of students): all students, all settings; preventive, proactive
7 Basic Evaluation Questions • What does “it” look like now? • Are we satisfied with how “it” looks? • What would we like “it” to look like? • What would we need to do to make “it” look like that? • How would we know if we’ve been successful with “it”? • What can we do to keep “it” like that? • What can we do to make “it” more efficient & durable?
School Data – A Comedy • Dirty Data - a light-hearted look at using data to guide the development of a school's PBIS system: http://www.youtube.com/watch?v=XBv95uMFudE
Action Planning with Your PBIS Data • Review TIC charts, SAS results, and SET scores from pbisassessment.org • Identify items not fully in place at your school • Identify items that will make the biggest impact • Define a task analysis of the activities needed to achieve each item • Allocate tasks to people, timelines, and reporting events • Use ODR data to strengthen the supports, acknowledgment, and consequence systems, and to determine where additional teaching is needed
Universal: Full Implementation = 3rd Year of Training or beyond
PBIS Implementation Evaluation Tools: Fidelity Implementation Assessments per Tier (another look)
TIC: Team Implementation Checklist • Completed 3 times per school year (as of 2012-13) • Rate each item as in place, partially in place, or not in place • Identify the items that will make the biggest impact • Define a task analysis of the activities needed to achieve each item • Allocate tasks to people, timelines, and reporting events • Every item on the TIC rated as partially implemented or not implemented needs a corresponding item on the Action Plan • Goal • 80% Full Implementation
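For teams that track TIC results in a spreadsheet or script, the sketch below shows how the 80% goal can be computed from item ratings. It assumes the common 2/1/0 scoring convention (fully / partially / not in place) and uses made-up item names; check your TIC version for the exact rubric.

```python
# Minimal sketch: compute a TIC-style implementation percentage.
# Assumes the common 2 / 1 / 0 scoring convention (fully / partially / not in place);
# item names and ratings below are hypothetical.

ratings = {
    "Team has administrative support": 2,
    "Expectations defined": 2,
    "Expectations taught": 1,
    "Reward system in place": 1,
    "Data system used for decision making": 0,
}

points_earned = sum(ratings.values())
points_possible = 2 * len(ratings)
score = 100 * points_earned / points_possible
print(f"Implementation score: {score:.0f}% (goal: 80%)")

# Anything not fully in place feeds the Action Plan.
action_items = [item for item, rating in ratings.items() if rating < 2]
print("Add to Action Plan:", action_items)
```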
SAS: Self-Assessment Survey • Self-assessment tool to determine to what extent PBIS practices and systems are in place within a school • Four Systems • School-Wide • Non-classroom settings • Classroom • Individual Student • Goal • 80% In Place School-Wide
SAS: What Information Does It Give Us? • For each of the 4 systems • Current Status • % in place, % partially in place, % not in place • Priority for Improvement • % of High, % of Medium, % of Low • Two main questions • What behavior supports are in place? • What behavior support features are most in need of improvement?
SAS • Completed… • by as many staff members (certified & non-certified) as possible • one time per year during the Feb-May reporting period • on-line at www.pbisassessment.org • Purpose: • To determine if PBIS practices and systems are in place • To determine which behavioral systems need to be addressed (action planning)
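As a rough illustration of how the SAS "Current Status" percentages roll up across respondents, here is a minimal sketch with hypothetical items and responses; the actual report is generated at www.pbisassessment.org.

```python
# Minimal sketch: summarize SAS-style "Current Status" responses for one system.
# Item names and responses are hypothetical; the real report comes from pbisassessment.org.
from collections import Counter

responses = {
    "Expectations defined": ["in place", "in place", "partial"],
    "Expectations taught": ["in place", "partial", "partial"],
    "Violations handled consistently": ["partial", "not in place", "not in place"],
}

for item, answers in responses.items():
    counts = Counter(answers)
    pct_in_place = 100 * counts["in place"] / len(answers)
    print(f"{item}: {pct_in_place:.0f}% of staff rate it in place", dict(counts))
```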
SET: School-Wide Evaluation Tool • Designed to assess and evaluate the critical features of effective school-wide behavioral support • Research-validated instrument for SW-PBIS • outcomes tested for both reliability and validity • Summary score and seven core feature scores (Horner et al., 2004)
SET: 7 Core Features • Behavioral expectations defined • Behavioral expectations taught • Behavioral expectations rewarded • Systematic response to rule violations • Information gathered to monitor student behavior • Local management support for SW-PBIS procedures • District level support of SW-PBIS procedures
Completing the SET • Completed by an outside person visiting the school: • During Feb-May reporting period • Approximately 3-4 hours • Interview teaching staff • Interview students • Observe school environments • Review school documents • Interview school administrators • Goal • 80% for expectations taught • 80% overall score
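Once the seven feature scores are in hand, the 80/80 goal (80% on expectations taught and 80% overall) can be checked as sketched below; the scores shown are hypothetical, and actual scoring follows the SET protocol.

```python
# Minimal sketch: check the SET 80/80 criterion from seven feature scores (percentages).
# Scores are hypothetical; follow the SET scoring guide for real data.
feature_scores = {
    "Expectations defined": 88,
    "Expectations taught": 80,
    "Reward system": 75,
    "Violation system": 90,
    "Monitoring & decision making": 85,
    "Management": 70,
    "District support": 100,
}

overall = sum(feature_scores.values()) / len(feature_scores)
meets_80_80 = overall >= 80 and feature_scores["Expectations taught"] >= 80
print(f"Overall: {overall:.0f}% | Meets 80/80 goal: {meets_80_80}")
```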
ODR Data: Office Discipline Referral Data • Universal Team • reviews ODR data (BIG 5 in particular) in SWIS during each PBIS Leadership Team meeting (at least monthly) • Analyzes the data to make appropriate system decisions to increase supports and teaching where needed
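To give a sense of the kind of "Big 5" summaries SWIS produces, here is a minimal sketch that tallies referrals by problem behavior and by location from a hypothetical ODR log; the field names are assumptions, not the SWIS export schema.

```python
# Minimal sketch: tally two "Big 5"-style ODR summaries (by problem behavior, by location)
# from a hypothetical referral log. Field names are assumptions, not the SWIS schema.
from collections import Counter

odr_log = [
    {"behavior": "disruption", "location": "classroom", "time": "10:15"},
    {"behavior": "disrespect", "location": "hallway",   "time": "12:40"},
    {"behavior": "disruption", "location": "cafeteria", "time": "12:45"},
    {"behavior": "disruption", "location": "classroom", "time": "13:30"},
]

print("Referrals by problem behavior:", dict(Counter(r["behavior"] for r in odr_log)))
print("Referrals by location:", dict(Counter(r["location"] for r in odr_log)))
```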
Using Data to Build Solutions: Items to Remember • Prevention: How can we avoid the problem context? • Who, When, Where • Schedule change, curriculum change, etc. • Teaching: How can we define, teach, & monitor what we want? • Teach appropriate behavior • Use problem behavior as negative example • Recognition: How can we build in systematic acknowledgment for desired behavior?
Using Data to Build Solutions: Items to Remember • Extinction: How can we prevent problem behavior from being rewarded? • Consequences: What are efficient, consistent consequences for problem behavior? • How will we collect and use data to evaluate: • 1) implementation fidelity? • 2) impact on student outcomes?
Let’s PracticeSmall Group Activity • What you will NEED: • Data Review Worksheet & 7 ?s Analysis Guide • Location: in your individual packets • Data Packet File • Location: center of each table to share in small groups • Steps: • Overview of Data Review Worksheet • Overview of Data included in packet file • Work in groups of 3 or 4 • Study the data & complete the worksheet