Developing a Comprehensive State-wide Evaluation for PBS Heather Peshak George, Ph.D. Donald K. Kincaid, Ed.D.
Objectives
• Describe Florida’s evaluation system for state, district, and school levels
• Identify the critical questions that Texas needs to answer
• Describe a comprehensive model for evaluating Tier 1 PBS
• Build a scalable and sustainable system
• Review data collection procedures, tools, analysis methods, and training
Purpose of Evaluation
• To examine the extent to which teams are accurately selecting and implementing PBS systems and practices
• To allow teams to determine the extent to which target student outcomes are being achieved and/or are likely to be achieved
• To determine if teams are accurately and consistently implementing activities and practices as specified in their individualized action plans
(PBIS Blueprint, 2005)
Factors to Consider in Developing Comprehensive Evaluation Systems
• Systems Preparation: readiness activities
• Service Provision: training and technical assistance
• Evaluation Process: timelines
• Evaluation Data: implementation fidelity, impact on students, attrition, client satisfaction
• Products and Dissemination: reports, materials, presentations, etc.
(Childs, Kincaid & George, in press)
(1) Systems Preparation
• Readiness activities
• District Readiness Checklist
• District Action Plan
• School Readiness Checklist
• New School Profile
• Baseline data: office discipline referrals (ODR), in-school suspensions (ISS), out-of-school suspensions (OSS), academic
(2) Service Provision
• Training and ongoing technical assistance
(3) Evaluation Process
• Timelines for Evaluation Reports:
• Mid-Year I (due 10/31): School Profile; PBS Implementation Checklist (PIC)
• Mid-Year II (due 2/28): PBS Implementation Checklist (PIC)
• End-Year (due 6/15): Benchmarks of Quality (BoQ); Benchmarks for Advanced Tiers (BAT); Outcome Data Summary; School-wide Implementation Factors (SWIF)
(4) Evaluation Data
• Implementation Fidelity: PBS Implementation Checklist (PIC); BoQ; BAT; School Demographic Data; SWIF; Team Process Survey
• Impact on Students: outcome data (ODR, ISS, OSS); FCAT (state test); school climate surveys; referrals to ESE (Exceptional Student Education); screening ID; response to intervention
• Attrition: Attrition Survey
• Client Satisfaction: SWIF
(a) Implementation Fidelity
• Are schools trained in Universal PBS implementing with fidelity? At Tiers 2 and 3? Across years? Across school types? (BoQ, BAT, School Demographic Data)
• What factors are related to implementing with fidelity? (SWIF survey, BoQ, BAT)
• Do teams that work well together implement with greater fidelity? (Team Process Evaluation, BoQ)
School-Wide Implementation Factors (SWIF): comparison of higher-implementing schools (70+ on BoQ) with lower-implementing schools (below 70 on BoQ)
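To make the grouping concrete, here is a minimal sketch in Python of how schools might be split into the two SWIF comparison groups; only the 70-point BoQ cutoff comes from the slide, and the records and field names are hypothetical.

```python
# Minimal sketch: split schools into the SWIF comparison groups using the
# 70-point Benchmarks of Quality (BoQ) cutoff cited on the slide.
BOQ_CUTOFF = 70

schools = [  # made-up example records
    {"name": "School A", "boq": 84},
    {"name": "School B", "boq": 62},
    {"name": "School C", "boq": 71},
]

higher_implementing = [s for s in schools if s["boq"] >= BOQ_CUTOFF]
lower_implementing = [s for s in schools if s["boq"] < BOQ_CUTOFF]

print([s["name"] for s in higher_implementing])  # ['School A', 'School C']
print([s["name"] for s in lower_implementing])   # ['School B']
```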
Descriptive Data: Teams
• Team functioning did not effectively differentiate between school teams implementing with high versus low fidelity, or between teams with better versus worse outcomes
• Teams implementing Tier 1 PBS with fidelity saw substantially different effects on all four outcome measures
(b) Impact on Student Behavior
• Do schools implementing SWPBS decrease ODRs, days of ISS, and days of OSS? (ODRs, ISS, OSS)
• Do schools implementing SWPBS realize an increase in academic achievement? (FCAT scores)
• Is there a difference in outcomes across school types? (ODRs, ISS, OSS, FCAT scores, school demographic data)
• Do schools implementing with high fidelity have greater outcomes than implementers with low fidelity? (BoQ, ODRs, ISS, OSS)
• Do teams that work well together have greater outcomes than those that don’t work as well together? (Team Process Evaluation, ODRs, ISS, OSS)
Percent change in ODR, ISS and OSS rates per 100 students before and after PBS implementation
Percent decrease in ODR, ISS, OSS rates per 100 students after 1 year of implementation (by school type)
ODRs by implementation level across three years of implementation
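The metrics behind the three preceding charts come down to simple arithmetic: a discipline count is normalized to a rate per 100 students, and the pre- and post-implementation rates are compared as a percent change. A minimal sketch in Python, with made-up example numbers (the function names are ours, not from the FLPBS project):

```python
def rate_per_100(count: int, enrollment: int) -> float:
    """Incidents (e.g., ODRs) per 100 enrolled students."""
    return 100.0 * count / enrollment

def percent_change(before: float, after: float) -> float:
    """Percent change from pre- to post-implementation; negative = decrease."""
    return 100.0 * (after - before) / before

# Made-up example: 412 ODRs before PBS, 300 after, enrollment of 550.
pre = rate_per_100(412, 550)   # ~74.9 ODRs per 100 students
post = rate_per_100(300, 550)  # ~54.5 ODRs per 100 students
print(f"{percent_change(pre, post):+.1f}%")  # -27.2%
```

When enrollment is stable, the percent change in the rate equals the percent change in the raw count; normalizing per 100 students matters when comparing schools or years with different enrollments.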
(c) Attrition
• Why do schools discontinue implementation of SWPBS? (Attrition Survey)
(d) Client Satisfaction
• Are our consumers satisfied with the training, technical assistance, products, and support received? (SWIF survey; District Coordinators survey; training evaluation)
(5) Products and Dissemination
• Annual Reports
• Revisions to Training
• Revisions to Technical Assistance process
• Dissemination activities at the national, state, district, and school levels
• Revisions to Website
• On-line Training Modules
Improvements Made
• Increased emphasis on BoQ results for school- and district-level action planning
• Increased training for District Coordinators and Coaches; technical assistance targeted areas of deficiency identified in the data
• Team Process Evaluation no longer used
• Academic data used to increase visibility and political support
• Specialized training for high schools
• Identified critical team variables that can be impacted through training and T.A. activities
• Revised Tier 1 PBS training to include classroom strategies and the problem-solving process within an RtI framework
• Enhanced monthly T.A. activities
Florida’s Service Delivery and Evaluation Model (Childs, Kincaid & George, in press)
• Systems Preparation: District Readiness Checklist; District Action Plan; School Readiness Checklist; New School Profile (includes ODR, ISS, OSS)
• Service Provision: training and on-going technical assistance, delivered FLPBS → Districts → Coaches → Schools
• Evaluation Process: Mid-Year Reports; End-of-Year Reports
• Evaluation Data: Implementation Fidelity (Benchmarks of Quality, BAT, School Demographic Data, School-wide Implementation Factors, Team Process Survey); Impact on Students (outcome data: ODR, ISS, OSS; Florida Comprehensive Assessment Test; School Demographic Data; Team Process Survey); Attrition (Attrition Survey); Client Satisfaction (School-Wide Implementation Factors)
• Products and Dissemination: Annual Reports; revisions to training and technical assistance process; national, state, district, and school dissemination activities; Website; on-line training modules
In Summary…
• Know what you want to know
• Comparing fidelity of implementation with outcomes presents a strong case for implementing PBS with fidelity
• Additional data sources can help a state determine not only whether the PBS process (Tiers 1-3) is working, but also why it is or is not working
• Address state, district, and school systems issues that may impact implementation success
Resources
• Childs, K., Kincaid, D., & George, H.P. (in press). A model for statewide evaluation of a universal positive behavior support initiative. Journal of Positive Behavior Interventions.
• George, H.P., & Kincaid, D. (2008). Building district-wide capacity for positive behavior support. Journal of Positive Behavior Interventions, 10(1), 20-32.
• Cohen, R., Kincaid, D., & Childs, K. (2007). Measuring school-wide positive behavior support implementation: Development and validation of the Benchmarks of Quality (BoQ). Journal of Positive Behavior Interventions, 9(4), 203-213.
Contact Heather Peshak George, Ph.D. • Co-PI, Co-Director & PBIS Research Partner Phone: (813) 974-6440 • Fax: (813) 974-6115 • Email: flpbs@fmhi.usf.edu • Website: http://flpbs.fmhi.usf.edu