Returning Team Training July 17, 2008
AGENDA • Introductions and Celebrations • Team Check-up • Creative ways to use data: A toolkit for schools • Check-in Check-out: Behavior Education Program • Action Planning
Introductions & Celebrations
Goals • Define the use of data-driven decision making to reach full implementation of school-wide PBS • Team Implementation Checklist (TIC) • School-wide Evaluation Tool (SET) • Assess implementation level • Determine need for strategies to assist students in the “Yellow Zone”
Assumptions • School teams will be successful if: • They start with sufficient resources and commitment • They focus on the smallest changes that will result in the biggest difference • They have a clear action plan • They use on-going self-assessment to determine if they are achieving their plan • They have access to an external coach who is supportive, knowledgeable and persistent.
Team Implementation Checklist • Self-assessment tool for monitoring implementation of School-wide PBS • Start-Up Elements • Establish Commitment • Establish and Maintain Team • Self-assessment • Establish school-wide expectations • Establish consequences for behavioral errors • Recognize appropriate behavior • Establish information system • Establish capacity for function-based support
Use of the Team Checklist • Who completes the Team Checklist? • The school-team (individually or together) • When is Team Checklist completed? • At least quarterly, best if done monthly • Who looks at the data? • Team • Coach • Coordinator • Action Planning
What does the SET measure? • Measures the level of implementation of SWPBS (not intended to measure everything!) • The Critical Features: • Expectations Defined • Expectations Taught • System for Encouraging Expected Behaviors • System for Discouraging Problem Behaviors • Monitoring and Decision Making • PBS Team Management • District Level Support
Why use it? • The results help PBS teams: • Assess the features of PBS in place • Determine annual goals for school-wide positive behavior support • Evaluate on-going efforts toward school-wide behavior support • Design and revise procedures as needed • Compare efforts toward school-wide effective behavior support from year to year
Data Review Worksheet • Review office referrals • Review TIC results • Review SET results • Complete Action Planning Form
Creative ways to use data: A toolkit for schools Susan Barrett sbarrett@pbismaryland.org
Objectives • Review why and how to use discipline data • Provide examples of how CCPS schools use various forms of data to monitor the effectiveness of PBIS • Highlight and demonstrate templates utilized to share information with staff and PBS teams • Determine what barriers to learning we have • Complete an activity to help plan for data-based decision making
Data IS NOT: • A scary or “four letter” word • Something that should intimidate us • Just numbers Data IS: • Powerful when used to discuss discipline • Empowering when used by school teams • Most useful when reviewed frequently to determine areas of strength and weakness
Scenarios • You work at an elementary school with 400 students. Upon reviewing data at the end of the year you find that your school had 20 suspensions. • You work at a high school with 1000 students. You have a total of 100 days of suspension during the school year.
Scenarios • You work in a middle school of 650 students. Last school year there were 100 referrals. • You work at an elementary school of 450 students. Last year there were 800 referrals.
What impact does it have? • Think about each of the scenarios
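One way to size up scenarios like these is to normalize the raw counts by enrollment and by school days; 100 referrals mean very different things in a school of 450 than in a school of 1,000. A minimal sketch in Python, using the scenario counts above and an assumed 180-day school year:

```python
# Normalize raw discipline counts so schools of different sizes can be compared.
# Enrollment and counts come from the scenarios above; 180 school days is an assumption.
SCHOOL_DAYS = 180

scenarios = [
    ("Elementary, suspensions",          400,  20),
    ("High school, days of suspension", 1000, 100),
    ("Middle school, referrals",         650, 100),
    ("Elementary, referrals",            450, 800),
]

for label, enrollment, count in scenarios:
    per_100_students = count / enrollment * 100
    per_day = count / SCHOOL_DAYS
    print(f"{label}: {per_100_students:.1f} per 100 students, {per_day:.2f} per school day")
```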
Impact • Administrators • Teachers • Staff • Students • Parents • School Climate • Interventions • Support Services needed • Academic Achievement
Improving Decision-Making • From: Problem → Solution (problem solving) • To: Problem → Information → Solution
Why Collect Discipline Data? • Decision making • What decisions do you make? • What data do you need to make these decisions? • Professional Accountability • Decisions made with data (information) are more likely to be (a) implemented, and (b) effective
From primary to precise • Primary statements are vague and leave us with more questions than answers • Precise statements include information about 5 “Wh” questions: • What is the problem and how often is it happening? • Where is it happening? • Who is engaging in the behavior? • When is the problem most likely to occur? • Why is the problem sustaining?
From primary to precise: An example • Primary statement: “There is too much fighting at our school.” • Precise statement: “There were 30 more ODRs for aggression on the playground than last year. These are most likely to occur from 12:00-12:30 during fifth grade’s recess, because there is a large number of students and the aggression is related to getting access to the new playground equipment.”
From primary to precise: An example • Primary statement: “ODRs during December were higher than in any other month.” • Precise statement: “Minor disrespect and disruption are increasing and are most likely to occur during the last 15 minutes of our classes, when students are engaged in independent seat work. This pattern is most common in 7th and 8th grades, involves many students, and appears to be maintained by work avoidance/escape. Attention may also be a function of the behavior; we’re not sure.”
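A precise statement like these usually comes from filtering the referral log on the “Wh” fields. The sketch below assumes a hypothetical list of referral records with behavior, location, grade, and time fields; the records and field names are illustrative only, not a required format:

```python
from collections import Counter
from datetime import time

# Hypothetical referral records; field names and values are illustrative only.
referrals = [
    {"behavior": "aggression", "location": "playground", "grade": 5, "time": time(12, 10)},
    {"behavior": "disruption", "location": "classroom",  "grade": 7, "time": time(13, 50)},
    {"behavior": "aggression", "location": "playground", "grade": 5, "time": time(12, 20)},
]

# Drill down: how many aggression ODRs happened on the playground, who, and when?
playground_aggression = [r for r in referrals
                         if r["behavior"] == "aggression" and r["location"] == "playground"]
print("Count:", len(playground_aggression))
print("By grade:", Counter(r["grade"] for r in playground_aggression))
print("During 12:00-12:30:",
      sum(time(12, 0) <= r["time"] <= time(12, 30) for r in playground_aggression))
```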
Effective Data Systems • The data are accurate and valid • The data are very easy to collect (1% of staff time) • Data are presented in picture (graph) format • Data are current (no more than 48 hours old) • Data are used for decision-making • The data must be available when decisions need to be made (weekly?) • Difference between data needs at a school building versus data needs for a district • The people who collect the data must see the information used for decision-making.
Data Collection • The “Big 5” • Average referrals per day per month • Location • Problem behavior • Student • Time
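The “Big 5” can be pulled from the same referral log by counting each field and averaging referrals per school day within each month. A rough sketch, again assuming hypothetical field names; the 20 school days per month is an assumed figure:

```python
from collections import Counter
from datetime import date

# Hypothetical referral log; each record carries the fields behind the "Big 5".
referrals = [
    {"date": date(2008, 9, 3),  "location": "cafeteria", "behavior": "disruption", "student": "A", "hour": 12},
    {"date": date(2008, 9, 3),  "location": "hallway",   "behavior": "disrespect", "student": "B", "hour": 8},
    {"date": date(2008, 10, 7), "location": "hallway",   "behavior": "disruption", "student": "A", "hour": 8},
]

# 1. Average referrals per day, by month (school days per month is an assumption).
SCHOOL_DAYS_PER_MONTH = 20
by_month = Counter(r["date"].strftime("%Y-%m") for r in referrals)
for month, count in sorted(by_month.items()):
    print(month, "average per day:", round(count / SCHOOL_DAYS_PER_MONTH, 2))

# 2-5. Location, problem behavior, student, and time of day.
for field in ("location", "behavior", "student", "hour"):
    print(field, Counter(r[field] for r in referrals).most_common(3))
```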
Summarize the “Big 5” • Is there a problem? • If no, what will we do to sustain our efforts? • If yes, is problem definable or do we need more information? • Next steps • How will we know if it’s working? • Where will we review the data?
Steps to Problem-Solving • Define the problem(s) • Analyze the data • Define the outcomes and data sources for measuring the outcomes • Consider 2-3 options that might work • Evaluate each option • Is it safe? • Is it doable? • Will it work? • Which option will give us the smallest change for the biggest outcome? • Choose an option to try • Determine the timeframe to evaluate effectiveness • Evaluate effectiveness by using the data • Is it worth continuing? • Try a different option? • Re-define the problem?
Interpreting Office Referral Data: Is there a problem? • Absolute level (depending on size of school) • Middle, High Schools (> 1 per day per 100) • Elementary Schools (> 1 per day per 250) • Trends • Peaks before breaks? • Gradual increasing trend across year? • Compare levels to last year • Improvement?
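These benchmarks can be turned into a quick check: average referrals per day, scaled to per 100 students (middle/high) or per 250 students (elementary). A hedged sketch; the enrollment, referral count, and 180-day year below are made-up inputs:

```python
# Compare average referrals per day against the rule-of-thumb benchmarks above.
# Enrollment, referral count, and the 180-day year are illustrative assumptions.
def exceeds_benchmark(referrals, enrollment, school_days, level):
    per_day = referrals / school_days
    if level == "elementary":
        rate = per_day / (enrollment / 250)   # benchmark: > 1 per day per 250 students
    else:
        rate = per_day / (enrollment / 100)   # benchmark: > 1 per day per 100 students
    return rate, rate > 1.0

rate, flagged = exceeds_benchmark(referrals=800, enrollment=450,
                                  school_days=180, level="elementary")
print(f"Rate: {rate:.2f} per day per 250 students (benchmark exceeded: {flagged})")
```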
What systems are problematic? • Referrals by problem behavior? • What problem behaviors are most common? • Referrals by location? • Are there specific problem locations? • Referrals by student? • Are there many students receiving referrals or only a small number of students with many referrals? • Referrals by time of day? • Are there specific times when problems occur?
Designing Solutions • If many students are making the same mistake, it is typically the system that needs to change, not the students. • Teach, monitor and reward before relying on punishment. • An example (hallways)
5:1 Ratio of tickets to referrals • Our data tells us that we should be giving 5 positives to each corrective response • How is that measured? • Number of coupons versus number of referrals.
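Measuring the 5:1 target is a simple ratio of acknowledgments given to referrals written over the same period. A small sketch; the counts below are placeholders:

```python
# Check the acknowledgment-to-correction ratio against the 5:1 target.
# Coupon and referral counts are placeholder numbers.
coupons_given = 1200
referrals_written = 300

ratio = coupons_given / referrals_written
print(f"Ratio: {ratio:.1f} to 1 (target is 5 to 1; met: {ratio >= 5})")
```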
Triangle of Student Referrals • Intensive, Individual Interventions (1-5% of students; 6+ referrals): individual students, assessment-based, high intensity • Targeted Group Interventions (5-10% of students; 2-5 referrals): some students (at-risk), high efficiency, rapid response • Universal Interventions (80-90% of students; 0-1 referral): all students, preventive, proactive
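Sorting students into the three tiers of the triangle is a matter of counting referrals per student and bucketing by the 0-1 / 2-5 / 6+ cut points. A minimal sketch assuming a hypothetical per-student tally and a made-up enrollment figure:

```python
from collections import Counter

# Hypothetical referral counts per student ID; students with no referrals are
# implied by the (made-up) enrollment figure.
referral_counts = Counter({"S01": 8, "S02": 3, "S03": 1, "S04": 2, "S05": 6})
enrollment = 450

tiers = {"universal (0-1)": 0, "targeted (2-5)": 0, "intensive (6+)": 0}
for count in referral_counts.values():
    if count >= 6:
        tiers["intensive (6+)"] += 1
    elif count >= 2:
        tiers["targeted (2-5)"] += 1
    else:
        tiers["universal (0-1)"] += 1
tiers["universal (0-1)"] += enrollment - len(referral_counts)  # zero-referral students

for tier, n in tiers.items():
    print(f"{tier}: {n} students ({n / enrollment:.0%})")
```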
Other data to consider • Is our attendance rate improving? • Is our achievement data improving? • How many students are on the honor roll? • Are state tests scores improving? • What is our graduation rate? • How many students are taking AP courses?
What else does the data tell you? • Is there a problem on • Bus • Cafeteria • Hallways • If you have been implementing for many years, are you still seeing the same results? • Are older students still motivated by the same incentives?
Next Steps • Comparing academic and behavior data • 1-5% of students: Basic on the state-wide assessment, below grade level classroom performance, 6+ referrals • 5-10% of students: Borderline on the state-wide assessment, approaching grade level, 2-5 referrals • 80-90% of students: Proficient or Advanced on the state-wide assessment, on or above grade level, 0-1 referral
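Comparing academic and behavior data usually means joining the two sources on a student identifier and cross-tabulating assessment band against referral tier. A rough sketch with invented records; the band names follow the slide above:

```python
from collections import Counter

# Invented records joining assessment band and referral count per student.
students = [
    {"id": "S01", "assessment": "Basic",      "referrals": 7},
    {"id": "S02", "assessment": "Proficient", "referrals": 0},
    {"id": "S03", "assessment": "Borderline", "referrals": 3},
    {"id": "S04", "assessment": "Proficient", "referrals": 1},
]

def tier(referrals):
    return "6+" if referrals >= 6 else "2-5" if referrals >= 2 else "0-1"

crosstab = Counter((s["assessment"], tier(s["referrals"])) for s in students)
for (band, referral_tier), n in sorted(crosstab.items()):
    print(f"{band:<11} x {referral_tier} referrals: {n}")
```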
What is the academic/behavior connection in your school? • What information do you need to answer this question? • What types of data do you currently use? • How often? Is it working? • What would make it better? • What are your goals when you leave to return to your building?
Templates • Excel data template • Cost-Benefit Analysis Worksheet
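The Cost-Benefit Analysis Worksheet typically converts referral counts into staff and student time. The sketch below is not the worksheet itself; the per-referral minutes are illustrative assumptions you would replace with your own figures:

```python
# Rough cost-benefit estimate of a drop in office referrals, in recovered time.
# The per-referral minutes are illustrative assumptions, not the worksheet's values.
ADMIN_MINUTES_PER_REFERRAL = 15
STUDENT_MINUTES_PER_REFERRAL = 20

referrals_last_year = 800
referrals_this_year = 550

saved = referrals_last_year - referrals_this_year
admin_hours = saved * ADMIN_MINUTES_PER_REFERRAL / 60
student_hours = saved * STUDENT_MINUTES_PER_REFERRAL / 60
print(f"{saved} fewer referrals = about {admin_hours:.0f} administrator hours "
      f"and {student_hours:.0f} instructional hours recovered")
```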
Discipline Data: Essential Questions • Discipline data are collected to answer questions • How do you collect data? What data do you use? What do we do with the data? • When do you know you have a problem? How often do you look at your data? How often is discipline data shared with staff? • Staff have questions regarding effective discipline strategies • What information do you already have? (Attendance, suspension, office referrals, achievement scores, tardies, timeout/support room referrals) • What are the critical discipline issues in your building? (Who, What, How Often, When, Where?)
Discipline Data: Essential Questions • Design the intervention to target the concern: • How do you know what intervention is needed? • How many students contribute to your referrals? • Are referrals coming from one grade, classroom, or area? • Measure success: • What do we measure? How do we measure “it”? How often do we measure “it”? • How do we know when we have success? How do we know when we need to make changes? • Who do we share it with? How do we share it?
Resources • www.pbis.org • www.swis.org • www.pbssurveys.org • www.pbismaryland.org • “Without data, you’re just another person with an opinion” (Unknown)