Data-Based Decision-Making: Evaluating the Impact of School-Wide PBIS Day 2, Section 3 August 18, 2016
Data-Based Decision-Making: Evaluating the Impact of School-Wide Positive Behavior Instructional Supports Adapted from: George Sugai, Rob Horner, Anne Todd, and Teri Lewis-Palmer, University of Oregon OSEP Funded Technical Assistance Center www.pbis.org
Purpose • Examine the extent to which the logic of School-wide Positive Behavioral Instructional Supports (PBIS) fits your real experience in schools • Define the outcomes for School-wide PBIS • Is SWPBIS related to reduction in problem behavior? • Is SWPBIS related to improved school safety? • Is SWPBIS related to improved academic performance?
Purpose continued • Define tools for measuring SWPBIS outcomes • Examine a problem-solving approach for using office discipline referral (ODR) data for decision-making • Provide strategies for using data for decision-making and action planning
To Improve Schools for Children • Use evidence-based practices • Always look for data on effectiveness • Never stop doing what is working • Implement the smallest change that will result in the largest improvement
Designing School-Wide Systems for Student Success [three-tier triangle with an Academic Systems side and a Behavioral Systems side] • Tier 3/Tertiary Interventions (1-5% of students): individual students, assessment-based; high-intensity academic supports and intense, durable behavioral procedures • Tier 2/Secondary Interventions (5-15% of students): some students (at-risk), high efficiency, rapid response, small-group interventions, some individualizing • Tier 1/Universal Interventions (80-90% of students): all students and all settings, preventive, proactive
School-Wide Positive Behavior Support [four interlocking elements] • OUTCOMES: social competence, academic achievement, and safety • SYSTEMS: supporting staff behavior • DATA: supporting decision-making • PRACTICES: supporting student behavior
Improving Decision-Making • From: Problem → Solution • To: Problem → Problem-solving with Information → Solution
Problem-Solving Steps • Define the problem(s) • Analyze the data • Define the outcomes and data sources for measuring the outcomes • Consider 2-3 options that might work • Evaluate each option • Is it safe? • Is it doable? • Will it work? • Choose an option to try • Determine the timeframe to evaluate effectiveness • Evaluate effectiveness by using the data • Is it worth continuing? • Try a different option? • Re-define the problem?
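The cycle above can be read as a simple decision loop. The sketch below is illustrative only and is not part of the original training materials; the option fields (safe, doable, likely_to_work), the measure_outcome callback, and the six-week review window are all assumptions.

```python
# Illustrative sketch only: the problem-solving steps above expressed as a
# simple decision loop. All names and data structures are assumptions.

def option_is_viable(option):
    """Evaluate one option: is it safe, doable, and likely to work?"""
    return option["safe"] and option["doable"] and option["likely_to_work"]

def problem_solving_cycle(problem, options, measure_outcome, review_weeks=6):
    """Choose an option, try it for the agreed timeframe, then check the data."""
    viable = [o for o in options if option_is_viable(o)]
    if not viable:
        return "re-define the problem"
    chosen = viable[0]                                   # choose an option to try
    improved = measure_outcome(problem, chosen, after_weeks=review_weeks)
    return "worth continuing" if improved else "try a different option"
```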
Key Features of Effective Data Systems • Data are accurate • Data are easy to collect • Data are used for decision-making • Data are available when decisions need to be made • Data collectors must see the information used for decision-making
Guiding Considerations • Use accessible data • Handle data as few times as possible • Build data collection into daily routines • Establish and use data collection as a conditioned positive reinforcer • Share data summaries with those who collect them
Types of Questions • Initial Assessment Questions • What type or which program do we need? • Where should we focus our efforts? • Ongoing Evaluation Questions • Is the program working? • If YES • Do we need this program anymore? • What do we need to do to sustain success? • If NO • Can it be changed? • Should we end the program?
What Data Should Be Collected? • Always start with the questions you want to answer • Collect data that will answer your questions • Easy, available, reliable • Balance reliability against accessibility • Systems approach • Consider logistics • Who? When? Where? How? • Two levels • What is readily accessible? • What requires extra resources?
When Should Data Decisions Be Made? • Natural cycles, meeting times • Weekly, monthly, quarterly, annually • Level of system addressed • Individual: daily, weekly • School-wide: monthly, quarterly • District/Region • State-level
Basic Evaluation Questions by School or Program • What does IT look like now? • How would we know if we are successful? • Are we satisfied with how IT looks? • Yes • Celebrate • No • What do we want IT to look like? • What do we need to do to make IT look like that? • What can we do to keep IT like that?
Team Check-In: 5-Minute Activity
School-Wide PBIS: Using Office Discipline Referral Data. Is School-wide PBIS having a positive influence on school culture?
Office Discipline Referrals and THE BIG 5! • Examine office discipline referral rates and patterns • Major problem events • Minor problem events • Ask the BIG 5 questions • How often are problem behavior events occurring? • Where are they happening? • What types of problem behaviors? • When are the problems occurring? • Who is contributing?
THE BIG 5! What Where When How often Who
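As an aside not found in the original materials, the Big 5 questions map directly onto simple frequency counts over an ODR log. The minimal sketch below assumes a hypothetical CSV export named odr_log.csv with columns date, time, location, behavior, and student_id; these names are assumptions, not an official SWIS export format.

```python
import pandas as pd

# Hypothetical ODR export; column names are assumptions, not a SWIS format.
odrs = pd.read_csv("odr_log.csv", parse_dates=["date"])

# How often are problem behavior events occurring?
print("Average referrals per school day:", round(odrs.groupby("date").size().mean(), 2))

print(odrs["location"].value_counts())     # Where are they happening?
print(odrs["behavior"].value_counts())     # What types of problem behavior?
print(odrs["time"].str.slice(0, 2).value_counts().sort_index())  # When (by hour)?
print(odrs["student_id"].value_counts().head(10))  # Who is contributing most often?
```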
Office Discipline Referral Cautions • Apply these cautions after looking at settings • Data reflect three factors: • Students • Staff members • Office personnel • Data capture only overt rule violators • Data are useful when implementation is consistent • Do staff and administration agree on office-managed problem behavior versus classroom-managed behavior?
Data Collection • Get Out of Jail Free Card • If you build it, they will come • Be clear about the data you want to collect (ODR vs. Problem-solving) • Make sure EVERYONE is clear on the process
Consequences Chart
SWIS™ Compatibility Checklist for Documenting Office Discipline Referrals (sample) • Next review date: _______________ • Redesign your form until the answers to all questions are "Yes." • Readiness requirements 4 and 5 are complete when you have all "Yes" responses.
Sample Office Referral Form
Priorities and Rationale • Graphs – know your data • Rate – interpret it correctly • National Trends – comparison
More Data: National Data from SWIS™
Interpreting Office Referral Data: Is there a problem? • Absolute level (depending on size of school) • Middle, High Schools (> 1 ODR per day per 100 students) • Elementary Schools (> 1 ODR per day per 300 students) • Trends • Peaks before breaks? • Gradual increasing trend across the year? • Compare levels to last year • Improvement?
Application Activity: Absolute Value — Is there a problem? Compare with the national average: 625 students / 100 = 6.25; 6.25 x .92 (the national average ODRs per 100 students per day) = 5.75 expected ODRs per day
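A worked version of this check, sketched in code purely for illustration: the enrollment (625) and the 0.92 national average come from the activity above, while the school's own referral totals (1,200 ODRs over 90 school days) are invented example inputs.

```python
def odrs_per_day_per_100(total_odrs, school_days, enrollment):
    """Average office discipline referrals per school day per 100 students."""
    return (total_odrs / school_days) / (enrollment / 100)

# Numbers from the activity: 625 students; 0.92 is read here as the national
# average of ODRs per 100 students per day (an assumption based on the slide).
enrollment = 625
national_avg_per_100 = 0.92
expected_per_day = (enrollment / 100) * national_avg_per_100   # 6.25 * 0.92 = 5.75

# Hypothetical school totals, for illustration only.
school_rate = odrs_per_day_per_100(total_odrs=1200, school_days=90, enrollment=enrollment)

print(f"Expected ODRs/day at the national average: {expected_per_day:.2f}")
print(f"This school: {school_rate:.2f} ODRs per day per 100 students")
# Middle/high school guideline from the slides: flag a possible problem at > 1.
print("Above the > 1 per day per 100 guideline" if school_rate > 1 else "Within the guideline")
```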
Trevor Test Middle School 565 Students Grades 6, 7, and 8
[Big 5 graphs for Trevor Test Middle School: referrals by time of day (e.g., 12:00), by problem behavior (language, defiance, disrespect, harassment, skipping), and by location (cafeteria, classroom, commons, hallway)]
South Whidbey Elementary School 515 Students Kindergarten – Grade 5
SWIS™ Data • South Whidbey Elementary • Determine the type of data • Collection method • Purpose of the data – to identify problems • Office Discipline Referral / PBIS forms • Problem-solving data • Consequences / Interventions • Documentation
What does a reduction of 850 ODRs and 25 suspensions mean? Kennedy Middle School Savings in Administrative Time • ODR = 15 minutes/event • Suspension = 45 minutes/event • 13,875 minutes • 231 hours • 29 eight-hour days Savings in Student Instructional Time • ODR = 45 minutes/event • Suspension = 216 minutes/event • 43,650 minutes • 728 hours • 121 six-hour school days
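The minute, hour, and day totals above follow directly from the per-event estimates on the slide; a short sketch reproducing that arithmetic (all figures come from the slide itself).

```python
# Reproducing the Kennedy Middle School arithmetic from the slide.
odrs_avoided, suspensions_avoided = 850, 25

# Administrative time: 15 minutes per ODR, 45 minutes per suspension.
admin_minutes = odrs_avoided * 15 + suspensions_avoided * 45        # 13,875 minutes
print(admin_minutes, "minutes =", admin_minutes / 60, "hours =",
      admin_minutes / 60 / 8, "eight-hour days")                    # ~231 hours, ~29 days

# Student instructional time: 45 minutes per ODR, 216 minutes per suspension.
student_minutes = odrs_avoided * 45 + suspensions_avoided * 216     # 43,650 minutes
print(student_minutes, "minutes =", student_minutes / 60, "hours =",
      student_minutes / 60 / 6, "six-hour school days")             # ~728 hours, ~121 days
```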
Basic School-Wide PBIS Evaluation Questions by School/District/Region Are our efforts making a difference? • How many schools have adopted SWPBIS? • Are schools adopting SWPBIS to criterion? • Are schools that are implementing SWPBIS perceived as safe? • Are teachers delivering instructional lessons with fidelity, as planned? • Is SWPBIS improving student outcomes?