
Evaluating the Technical Adequacy of FBAs and BIPs: How are Schools Doing?


Presentation Transcript


  1. Evaluating the Technical Adequacy of FBAs and BIPs: How are Schools Doing? Rose Iovannone, Ph.D., BCBA-D iovannone@usf.edu

  2. Objectives
  • Participants will:
    • Describe the purpose of the Technical Adequacy evaluation tool
    • Apply a scoring rubric to case examples
    • Discuss further use of the evaluation in their settings

  3. Advance Organizer
  • Essential features of Tier 3 behavior supports (FBA/BIPs)
  • Review of the Technical Adequacy Evaluation Tool and rubric
  • Practice scoring
  • Discussion of how to use the tool in the future

  4. Context for FBAs/BIPs
  • FBA/BIP has a substantial evidence base and has been the behavioral "gold standard" for nearly 20 years
  • Systemic and skill issues impede implementation
  • The literature provides the evidence base, BUT it does not address the contextual fit of FBA within school culture (Scott & Kamps, 2007)
    • Educators' willingness and ability to engage in the process
    • Level and intensity of FBA necessary to produce improvements
  • Conceptually, FBA is seen as a tool for use within a multi-tiered system of supports rather than a separate process
    • If part of that process, it may change the traditional definition of what an FBA involves and who is involved

  5. Examples of the Problem
  • Forms vs. skills
    • "Let's create new forms" is a common solution
  • Paperwork vs. implementation
  • General vs. individualized
  • Training vs. coaching
  • Expert vs. collaborative team model
  • Separate silos vs. integrated, consistent process
  • Legalities vs. problem-solving

  6. The Top Twelve List of Things Needed at Tier 3/Individualized Behavior Supports (Iovannone & Kincaid, in prep.)
  • Multiple levels of Tier 3
  • Consistent, fluent process within a problem-solving framework
  • Collaborative teaming
  • Problem identification
  • Simplified data collection
  • Linking the hypothesis to the FBA
  • Linking the BIP to the hypothesis
  • Multi-component behavior intervention plan matched to the classroom context
  • Task-analyzed strategies
  • Teacher and classroom coaching/support
  • Array of outcome measures (child-specific, teacher fidelity, social validity, alliance, fidelity of process, technical adequacy of products)
  • Maintenance (beyond the "warranty")

  7. 1. Multiple Levels of Tier 3 FBA
  • Three levels of Tier 3: match the level of FBA support to the student's level of need
    • Level 1: Classroom consultation (facilitator and teacher)
      • Brief PTR
      • ERASE (Terry Scott)
      • Guess and Check (Cindy Anderson)
    • Level 2: Comprehensive support (e.g., PTR; team-based process)
    • Level 3: Wraparound with person-centered planning
  • Tier 3 is most effective if Tiers 1 and 2 are implemented with fidelity

  8. 2. Consistent Tier 3 Process
  • Standardized process for ALL students requiring FBAs/BIPs
  • Incorporates the following features:
    • Identifying students needing Tier 3
    • Determining the level of FBA support necessary to answer the referral concern
    • Decision points
    • Timelines between FBA, BIP, support, and follow-up
    • Data tracking system
    • Coaching and fidelity
    • Flowchart

  9. 2. Consistent Tier 3 Process: The Problem-Solving Process
  • DEFINE THE PROBLEM: What is the behavior of concern? What do we want to see less of? What do we want the student to do more of?
  • PROBLEM ANALYSIS: Functional Behavior Assessment; hypothesis
  • DEVELOP AND IMPLEMENT PLAN: Behavior strategies linked to the hypothesis; coaching/support
  • EVALUATE: Is the plan effective? What are the next steps?

  10. 3. Collaborative Teaming
  • Discontinue the expert model; a proficient facilitator is needed to guide the team
  • Three levels of knowledge represented on teams:
    • Knowledge of the student
    • Knowledge of ABA principles
    • Knowledge of district/campus context
  • Consensus process established

  11. 4. Problem Identification
  • The primary problem with many ineffective FBA/BIPs is that the problem is not clearly identified:
    • Too general
    • Not defined
    • Baseline data confirming the problem are absent
    • Often several behaviors are listed, and it is unclear which behavior was the focus of the FBA
    • Not uncommon to see the behaviors of concern "change" throughout a single FBA/BIP
  • Identify both the replacement behavior to increase and the problem behavior to decrease; consider broad categories including academic, social, and behavioral

  12. 5. Simplify Data Collection
  • Progress monitoring must be:
    • Feasible
    • Reliable
    • Sensitive to change
    • Flexible enough to match the individual
    • Standardized (comparable across schools/students/districts)
  • Direct Behavior Ratings (DBRs) offer a solution; research supports their effectiveness (see Chafouleas & Riley-Tillman)
    • LEAP (Phil Strain)
    • Individualized Behavior Rating Scale (IBRST) used in PTR (Iovannone et al., in press); see the sketch below
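  The following is a minimal Python sketch of how daily DBR/IBRST-style ratings could be stored and retrieved for progress monitoring. It assumes a 1-5 anchor scale; the student name, behavior labels, goal values, and all class and function names are invented for illustration and are not the published PTR/IBRST materials.

    from dataclasses import dataclass, field

    @dataclass
    class BehaviorRating:
        date: str      # e.g., "2015-01-15"
        behavior: str  # e.g., "Screaming" or "On-task"
        rating: int    # teacher's end-of-day perceptual rating on a 1-5 scale

    @dataclass
    class RatingScale:
        student: str
        goals: dict = field(default_factory=dict)    # behavior -> target rating
        ratings: list = field(default_factory=list)  # BehaviorRating entries

        def add(self, date, behavior, rating):
            self.ratings.append(BehaviorRating(date, behavior, rating))

        def series(self, behavior):
            """All ratings recorded for one behavior, in chronological order."""
            return [r.rating for r in self.ratings if r.behavior == behavior]

    # Usage: three days of ratings for one target behavior.
    ibrst = RatingScale(student="Mike", goals={"Screaming": 2})
    ibrst.add("2015-01-15", "Screaming", 4)
    ibrst.add("2015-01-16", "Screaming", 3)
    ibrst.add("2015-01-17", "Screaming", 2)
    print(ibrst.series("Screaming"))  # [4, 3, 2]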

  13. Case Study – Mike: Behavior Rating Scale (01/15)

  14. BRS Psychometrics (Iovannone, Greenbaum, Wang, Kincaid, & Dunlap, in press)
  • Kappa coefficients (a worked computation is sketched below):
    • Problem Behavior 1 (n = 105): .82
    • Problem Behavior 2 (n = 90): .77
    • Appropriate Behavior 1 (n = 103): .65
    • Appropriate Behavior 2 (n = 56): .76
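  For reference, the sketch below shows how an unweighted Cohen's kappa of the kind reported above can be computed; the two rating series are made-up illustrations, not the study data.

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Unweighted Cohen's kappa for two equal-length categorical series."""
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(freq_a) | set(freq_b))
        return (observed - expected) / (1 - expected)

    # Invented example: daily 1-5 ratings from the teacher vs. an independent observer.
    teacher  = [4, 3, 3, 5, 2, 4, 3, 2, 1, 3]
    observer = [4, 3, 2, 5, 2, 4, 3, 2, 2, 3]
    print(round(cohens_kappa(teacher, observer), 2))  # 0.73 on these made-up data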

  15. Other Uses of the BRS
  • Systemic data tracking method for Tier 3
  • Sample system created by:
    • Cindy Anderson
    • A school district in Florida

  16. 6. Linking the Hypothesis to the FBA
  • This is the primary reason the FBA is conducted
  • The hypothesis should be multi-component:
    • When these contextual/environmental events (antecedents) are present...
    • ...it is highly predicted that the behavior identified as the problem and focus of the FBA happens...
    • ...and as a result, the student:
      • Gets out of or away from activities, people, tangibles, sensory input, or pain; or
      • Gets activities, people, tangibles, sensory input, or pain attenuation
    • Confirmed by the consequences (what others do in response to the behavior) that typically occur
  • Methods of organizing the information:
    • Competing behavior pathway
    • PTR assessment organization

  17. Step 3: Case Study – Mike: Assessment Summary Table of Problem Behavior (Screaming, Hitting)

  18. Step 3: Case Study – Mike: Assessment of Appropriate Behavior (Prosocial)

  19. Mike's Hypotheses: Inappropriate and Appropriate Behavior

  20. 7. Linking the Hypothesis to the BIP
  • The other primary purpose of conducting an FBA
  • STOP generating lists of general strategies
  • Each component of the hypothesis generates an intervention (see the sketch below):
    • Antecedents are modified and made irrelevant
    • A replacement behavior is taught so that the problem behavior is ineffective
    • A functionally equivalent reinforcer is provided so that the problem behavior is inefficient
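  A minimal sketch of that one-to-one linkage is shown below. The data classes, the generated strategy text, and Mike's example hypothesis are invented for illustration; they are not the PTR forms.

    from dataclasses import dataclass

    @dataclass
    class Hypothesis:
        antecedent: str  # when these events are present...
        behavior: str    # ...this problem behavior is highly predicted...
        function: str    # ...and as a result the student gets/escapes this outcome

    @dataclass
    class LinkedPlan:
        prevention: str     # makes the antecedent irrelevant
        teaching: str       # replacement behavior that makes the problem behavior ineffective
        reinforcement: str  # functionally equivalent reinforcer that makes it inefficient

    def link(h: Hypothesis) -> LinkedPlan:
        """Each hypothesis component drives exactly one intervention component."""
        return LinkedPlan(
            prevention=f"Modify the antecedent: {h.antecedent}",
            teaching=f"Teach a replacement for: {h.behavior}",
            reinforcement=f"Deliver the same outcome ({h.function}) for the replacement behavior",
        )

    mike = Hypothesis(
        antecedent="non-preferred independent writing tasks",
        behavior="screaming and hitting",
        function="escapes the task",
    )
    print(link(mike))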

  21. 8. Multi-Component Interventions Matched to Classroom Context
  • Multi-component interventions include prevention, teaching, and reinforcement strategies
  • The team/teacher(s) select strategies that are:
    • Feasible
    • Effective
    • Likely to be implemented

  22. 9. Task-Analyzed Strategies
  • A forgotten art
  • It is not enough to say "give choices," "reinforce appropriate behavior," or "student will comply"
  • Breaking the interventions down into a sequence of steps:
    • Allows teaching with precision
    • Allows assessment of teacher capacity
    • Provides the foundation for training and for fidelity

  23. Paris—Step 4: PTR Intervention

  24. Case Study Jeff: PTR Intervention Plan

  25. Jeff—Intervention Plan

  26. Jeff—Intervention Plan

  27. Jeff—Intervention Plan

  28. Jeff—Intervention Plan

  29. 10. Teacher and Classroom Coaching/Support
  • Do not assume the teacher/team knows how to implement the plan
  • Schedule 30 minutes to review the plan and go over the steps
  • Problem-solve if the teacher has difficulties:
    • Modify the plan
    • Choose a different intervention
  • Teach the student the plan

  30. Case Study: Sample Coaching Checklist for Mike

  31. 11. Array of Outcome Measures (child-specific, teacher fidelity, social validity, alliance, fidelity of process, technical adequacy of products)
  • Individualized Behavior Rating Scale
  • Fidelity scores
  • Social validity: Did the teacher like the process? Are they likely to use the strategies? Would they do it again?
  • Alliance: Did they like you? Did they feel you respected their input? Did you do a competent job as a consultant?

  32. PTR Plan Self-Assessment Example for Mike

  33. 12. Maintenance (beyond the warranty)
  • A dynamic process, not static
  • Decision-making process based on data
  • Determine the levels of support needed: fading, shaping, generalizing, extending, etc.

  34. Steps for Evaluating Outcomes
  • Make sure you have both fidelity measures (self and/or observation scores) AND student outcomes (Behavior Rating Scale measures)
  • Decision rules:
    • What constitutes adequate fidelity? 80%, 70%, something else?
    • What constitutes adequate student progress? (e.g., 3 or more consecutive ratings at or above the goal line)

  35. Primary Decisions
  • If fidelity scores are inadequate, determine the reasons (intervention too difficult, not feasible, not described adequately...):
    • Retrain/coach the teacher/implementer
    • Modify the interventions so that they are feasible and simpler
    • Select different interventions that match the hypothesis
  • Student outcomes (decision contingent upon the outcome trend; see the sketch below):
    • Maintain the intervention
    • Intensify the intervention
    • Modify the intervention
    • Fade intervention components
    • Shape behavior outcomes toward closer approximations of the desired behavior
    • Expand the intervention (additional people, settings, or routines)
    • Conduct another FBA if the hypothesis is suspect, the team has new data, or the context has changed
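  The sketch below turns the example decision rules from the previous two slides (80% average fidelity; three or more consecutive ratings at or above the goal line) into code. The thresholds mirror the slides' examples, but the function names and the returned next-step labels are hypothetical, not a prescribed PTR algorithm.

    def fidelity_adequate(fidelity_scores, threshold=0.80):
        """Average implementation fidelity across checks meets the team's threshold."""
        return sum(fidelity_scores) / len(fidelity_scores) >= threshold

    def progress_adequate(ratings, goal, consecutive=3):
        """The last `consecutive` behavior ratings are all at or above the goal line."""
        recent = ratings[-consecutive:]
        return len(recent) == consecutive and all(r >= goal for r in recent)

    def next_step(fidelity_scores, ratings, goal):
        if not fidelity_adequate(fidelity_scores):
            # The plan was not implemented as written: address fidelity before judging the plan.
            return "retrain/coach, simplify, or select different hypothesis-matched interventions"
        if progress_adequate(ratings, goal):
            return "maintain, fade, shape, or expand the intervention"
        return "intensify or modify the intervention; conduct another FBA if the hypothesis is suspect"

    # Invented example: adequate fidelity, student not yet at goal.
    print(next_step([0.90, 0.85, 0.80], ratings=[2, 3, 3, 2], goal=4))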

  36. Evaluating the Technical Adequacy of FBAs and BIPs
