
Presentation Transcript


  1. Classroom Management: Identifying & Teaching Student Replacement Behaviors Jim Wright www.interventioncentral.org

  2. Workshop Agenda • RIOT/ICEL Framework to Organize Student Data • 5-Step Process for Defining Problem Behaviors • Common Pitfalls in Behavior Interventions • Internet Resources

  3. Resources from this workshop can be downloaded from: • http://www.interventioncentral.org/RCSD.php

  4. “This workshop will build on the material presented this summer, with the focus on positive behavioral intervention strategies and the teaching of replacement behaviors. Your team will bring a BIP for one student, along with the assessment data collected for that BIP.” Source: Linda Blankenhorn, Executive Director of Specialized Services, Rochester City School District, Principals’ Letter, 12 Nov 2009.

  5. RTI: Listening to the ‘Teacher’s Voice’…

  6. What is the Logic of the Functional Behavior Assessment (FBA) That Can Help to Solve Student Problem Behaviors?

  7. Essential Elements of the Functional Behavioral Assessment (FBA) “Functional assessment is a collection of methods for obtaining information about antecedents…, behaviors…, and consequences… The purpose is to identify the reason for the behavior and to use that information to develop strategies that will support positive student performance while reducing the behaviors that interfere with the child’s successful functioning.” Source: Witt, J. C., Daly, E. M., & Noell, G. (2000). Functional assessments: A step-by-step guide to solving academic and behavior problems. Longmont, CO: Sopris West, pp. 3-4.

  8. Essential Elements of the Functional Behavioral Assessment (FBA)(Cont.) “From this definition, several things are clear. First, functional assessment is not a single test or observation. It is a collection of methods involving a variety of assessment techniques, including observations, interviews, and review of records, that are conducted to acquire an understanding of a child’s behavior. Second, the definition clarifies exactly what is assessed—that is, the child’s behavior as well as what happens just before the behavior occurs and what happens as a result of the behavior. Third, the definition states clearly the goal of functional assessment, which is to identify strategies and interventions to help the child.” Source: Witt, J. C., Daly, E. M., & Noell, G. (2000). Functional assessments: A step-by-step guide to solving academic and behavior problems. Longmont, CO: Sopris West, pp. 3-4.

  9. Factors Influencing the Decision to Classify as BD (Gresham, 1992) Four factors strongly influence the likelihood that a student will be classified as Behaviorally Disordered: • Severity: Frequency and intensity of the problem behavior(s). • Chronicity: Length of time that the problem behavior(s) have been displayed. • Generalization: Degree to which the student displays the problem behavior(s) across settings or situations. • Tolerance: Degree to which the student’s problem behavior(s) are accepted in that student’s current social setting. Source: Gresham, F. M. (1992). Conceptualizing behavior disorders in terms of resistance to intervention. School Psychology Review, 20, 23-37.

  10. ‘Big Ideas’ in Student Behavior Management

  11. Big Ideas: Similar Behaviors May Stem from Very Different ‘Root’ Causes (Kratochwill, Elliott, & Carrington Rotto, 1990) • Behavior is not random but follows purposeful patterns. • Students who present with the same apparent ‘surface’ behaviors may have very different ‘drivers’ (underlying reasons) that explain why those behaviors occur. • A student’s problem behaviors must be carefully identified and analyzed to determine the drivers that support them. Source: Kratochwill, T. R., Elliott, S. N., & Carrington Rotto, P. (1990). Best practices in behavioral consultation. In A. Thomas and J. Grimes (Eds.), Best practices in school psychology-II (pp. 147-169). Silver Spring, MD: National Association of School Psychologists.

  12. Common ‘Root Causes’ or ‘Drivers’ for Behaviors Include… • Power/Control • Protection/Escape/Avoidance • Attention • Acceptance/Affiliation • Expression of Self • Gratification • Justice/Revenge Source: Witt, J. C., Daly, E. M., & Noell, G. (2000). Functional assessments: A step-by-step guide to solving academic and behavior problems. Longmont, CO: Sopris West, pp. 3-4.

  13. Teacher Referral Example… “Showed disrespect towards me when she yelled inappropriately regarding an instruction sheet. I then asked her to leave the room. She also showed disrespect when I called her twice earlier in the class to see her report card grade.”

  14. Big Ideas: Attend to the Triggers and Consequences of Problem Behaviors (Martens & Meller, 1990) • Intervening before a student misbehaves or when the misbehavior has not yet escalated increases the likelihood of keeping the student on task and engaged in learning. • Consequences of behaviors that are reinforcing to the student will increase the occurrence of that behavior. [Figure: ABC (Antecedent-Behavior-Consequence) Timeline] Source: Martens, B. K., & Meller, P. J. (1990). The application of behavioral principles to educational settings. In T. B. Gutkin & C. R. Reynolds (Eds.), The handbook of school psychology (2nd ed.) (pp. 612-634). New York: John Wiley & Sons.

  15. Big Ideas: Behavior is a Continuous ‘Stream’ (Schoenfeld & Farmer, 1970) • Individuals are always performing SOME type of behavior: watching the instructor, sleeping, talking to a neighbor, completing a worksheet (‘behavior stream’). • When students are fully engaged in academic behaviors, they are less likely to get off-task and display problem behaviors. • Academic tasks that are clearly understood, elicit student interest, provide a high rate of student success, and include teacher encouragement and feedback are most likely to effectively ‘capture’ the student’s ‘behavior stream’. Source: Schoenfeld, W. N., & Farmer, J. (1970). Reinforcement schedules and the ‘‘behavior stream.’’ In W. N. Schoenfeld (Ed.), The theory of reinforcement schedules (pp. 215–245). New York: Appleton-Century-Crofts.

  16. Big Ideas: Academic Delays Can Be a Potent Cause of Behavior Problems (Witt, Daly, & Noell, 2000) Student academic problems cause many school behavior problems. “Whether [a student’s] problem is a behavior problem or an academic one, we recommend starting with a functional academic assessment, since often behavior problems occur when students cannot or will not do required academic work.” Source: Witt, J. C., Daly, E. M., & Noell, G. (2000). Functional assessments: A step-by-step guide to solving academic and behavior problems. Longmont, CO: Sopris West, p. 13.

  17. Direct Instruction & Behaviors

  18. Applying ‘RTI Logic’ to Social Behavior Support (Fairbanks, Sugai, Guardino, & Lathrop, 2007) Tier I (‘Universal System’) for behavioral support: • Is implemented schoolwide for all students • Requires that the school "identify and explicitly teach” schoolwide expectations • Includes a system to "acknowledge expectation-compliant behavior" • Defines inappropriate behaviors and applies consequences for those behaviors with consistency • Reviews group progress toward schoolwide goals (data collection and feedback) Source: Fairbanks, S., Sugai, G., Guardino, S., & Lathrop, M. (2007). Response to intervention: Examining classroom behavior support in second grade. Exceptional Children, 73, p. 289.

  19. Using a Direct Instruction Approach to Teaching Replacement Behaviors • Describe to the student the expected replacement behavior that the student is to engage in. • Provide a series of examples of the replacement behavior. • Provide immediate positive feedback to the student for appropriate demonstration of the replacement behavior. • Ensure that the instructional environment supports and rewards expected behaviors.

  20. Common Reasons Why Behavior Plans Fail • Student problems are defined in vague rather than specific terms, making it more difficult to select the right intervention(s) to support the student. • The problem behavior is viewed as residing primarily within the student, causing schools to overlook the important positive impact that they can have on students by changing instruction, work (curriculum) demands, and the learning environment. • The school selects an incorrect hypothesis about what is supporting the student’s problem behavior, so the strategies to promote the positive, replacement behavior don’t work.

  21. Common Reasons Why Behavior Plans Fail • The student’s problem behavior continues, even after the replacement behavior has been taught-- because antecedents (triggers) and / or consequences that support the problem behavior still remain in place. • The student’s problem behavior continues, even after the replacement behavior has been taught-- because the new, desired behavior is not being adequately reinforced. • Educators working with the student are inconsistent in supporting the new replacement behaviors.

  22. Data Collection: Defining Terms • Measurement. “the process of applying numbers to the characteristics of objects or people in a systematic way” (Hosp, 2008; p. 364). Example: Frequency counts can be used to measure the rate of student behaviors that are brief in duration and have a clear onset and end point. • Assessment. “the process of collecting information about the characteristics of persons or objects by measuring them” (Hosp, 2008; p. 364). Example: The construct ‘complying with teacher requests’ can be assessed using various measurements, including direct observation, teacher and student interview, teacher behavior log, Daily Behavior Report Card, etc. • Evaluation. “the process of using information collected through assessment to make decisions or reach conclusions” (Hosp, 2008; p. 364). Example: A student can be evaluated for ability to ‘comply with teacher requests’ by collecting information from various sources (e.g., direct observation, teacher and student interview, teacher behavior log, Daily Behavior Report Card, etc.), comparing those results to peer norms or developmental expectations, and making a decision about whether the student’s current performance is acceptable.
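The frequency-count example above reduces to simple arithmetic: dividing a tally by the length of the observation session yields a rate that can be compared across sessions of different lengths. A minimal sketch; the `behavior_rate` helper and the numbers are illustrative assumptions, not workshop material:

```python
# Hypothetical sketch: converting a frequency count of a discrete
# behavior (one with a clear onset and end point, such as call-outs)
# into a rate per minute, so tallies from observation sessions of
# different lengths can be compared.

def behavior_rate(count: int, observation_minutes: float) -> float:
    """Return the behavior rate (occurrences per minute) for one session."""
    if observation_minutes <= 0:
        raise ValueError("observation length must be positive")
    return count / observation_minutes

# Example: 12 call-outs tallied during a 30-minute class period.
print(behavior_rate(12, 30))  # 0.4 call-outs per minute
```

Expressing counts as rates is what lets an observer compare, say, a 15-minute small-group observation with a 30-minute whole-class one.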

  23. RIOT/ICEL Framework: Organizing Information to Better Identify Student Behavioral & Academic Problems

  24. RIOT/ICEL Framework Sources of Information • Review (of records) • Interview • Observation • Test Focus of Assessment • Instruction • Curriculum • Environment • Learner

  25. RIOT/ICEL Definition • The RIOT/ICEL matrix is an assessment guide to help schools decide efficiently what relevant information to collect on student academic performance and behavior, and how to organize that information to identify probable reasons why the student is not experiencing academic or behavioral success. • The RIOT/ICEL matrix is not itself a data collection instrument. Instead, it is an organizing framework, or heuristic, that increases schools’ confidence both in the quality of the data that they collect and in the findings that emerge from those data.

  26. RIOT: Sources of Information • Select Multiple Sources of Information: RIOT (Review, Interview, Observation, Test). The top horizontal row of the RIOT/ICEL table includes four potential sources of student information: Review, Interview, Observation, and Test (RIOT). Schools should attempt to collect information from a range of sources to control for potential bias from any one source.

  27. Select Multiple Sources of Information: RIOT (Review, Interview, Observation, Test) • Review. This category consists of past or present records collected on the student. Obvious examples include report cards, office disciplinary referral data, state test results, and attendance records. Less obvious examples include student work samples, physical products of teacher interventions (e.g., a sticker chart used to reward positive student behaviors), and emails sent by a teacher to a parent detailing concerns about a student’s study and organizational skills.

  28. Select Multiple Sources of Information: RIOT (Review, Interview, Observation, Test) • Interview. Interviews can be conducted face-to-face, via telephone, or even through email correspondence. Interviews can also be structured (that is, using a pre-determined series of questions) or follow an open-ended format, with questions guided by information supplied by the respondent. Interview targets can include those teachers, paraprofessionals, administrators, and support staff in the school setting who have worked with or had interactions with the student in the present or past. Prospective interview candidates can also consist of parents and other relatives of the student as well as the student himself or herself.

  29. Select Multiple Sources of Information: RIOT (Review, Interview, Observation, Test) • Observation. Direct observation of the student’s academic skills, study and organizational strategies, degree of attentional focus, and general conduct can be a useful channel of information. Observations can be more structured (e.g., tallying the frequency of call-outs or calculating the percentage of on-task intervals during a class period) or less structured (e.g., observing a student and writing a running narrative of the observed events).

  30. Select Multiple Sources of Information: RIOT (Review, Interview, Observation, Test) • Test. Testing can be thought of as a structured and standardized observation of the student that is intended to test certain hypotheses about why the student might be struggling and what school supports would logically benefit the student (Christ, 2008). An example of testing may be a student being administered a math computation CBM probe or an Early Math Fluency probe.

  31. Formal Tests: Only One Source of Student Assessment Information “Tests are often overused and misunderstood in and out of the field of school psychology. When necessary, analog [i.e., test] observations can be used to test relevant hypotheses within controlled conditions. Testing is a highly standardized form of observation. … The only reason to administer a test is to answer well-specified questions and examine well-specified hypotheses. It is best practice to identify and make explicit the most relevant questions before assessment begins. … The process of assessment should follow these questions. The questions should not follow assessment.” p. 170. Source: Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176). Bethesda, MD: National Association of School Psychologists.

  32. ICEL: Factors Impacting Student Learning • Investigate Multiple Factors Affecting Student Learning: ICEL (Instruction, Curriculum, Environment, Learner). The leftmost vertical column of the RIOT/ICEL table includes four key domains of learning to be assessed: Instruction, Curriculum, Environment, and Learner (ICEL). A common mistake that schools often make is to assume that student learning problems exist primarily in the learner and to underestimate the degree to which teacher instructional strategies, curriculum demands, and environmental influences impact the learner’s academic performance. The ICEL elements ensure that a full range of relevant explanations for student problems is examined.

  33. Investigate Multiple Factors Affecting Student Learning: ICEL (Instruction, Curriculum, Environment, Learner) • Instruction. The purpose of investigating the ‘instruction’ domain is to uncover any instructional practices that either help the student to learn more effectively or interfere with that student’s learning. More obvious instructional questions to investigate would be whether specific teaching strategies for activating prior knowledge better prepare the student to master new information or whether a student benefits optimally from the large-group lecture format that is often used in a classroom. A less obvious example of an instructional question would be whether a particular student learns better through teacher-delivered or self-directed, computer-administered instruction.

  34. Investigate Multiple Factors Affecting Student Learning: ICEL (Instruction, Curriculum, Environment, Learner) • Curriculum. ‘Curriculum’ represents the full set of academic skills that a student is expected to have mastered in a specific academic area at a given point in time. To adequately evaluate a student’s acquisition of academic skills, of course, the educator must (1) know the school’s curriculum (and related state academic performance standards), (2) be able to inventory the specific academic skills that the student currently possesses, and then (3) identify gaps between curriculum expectations and actual student skills. (This process of uncovering student academic skill gaps is sometimes referred to as ‘instructional’ or ‘analytic’ assessment.)

  35. Investigate Multiple Factors Affecting Student Learning: ICEL (Instruction, Curriculum, Environment, Learner) • Environment. The ‘environment’ includes any factors in the student’s school, community, or home surroundings that can directly enable their academic success or hinder that success. Obvious questions about environmental factors that impact learning include whether a student’s educational performance is better or worse in the presence of certain peers and whether having additional adult supervision during a study hall results in higher student work productivity. Less obvious questions about the learning environment include whether a student has a setting at home that is conducive to completing homework or whether chaotic hallway conditions are delaying that student’s transitioning between classes and therefore reducing available learning time.

  36. Investigate Multiple Factors Affecting Student Learning: ICEL (Instruction, Curriculum, Environment, Learner) • Learner. While the student is at the center of any questions of instruction, curriculum, and [learning] environment, the ‘learner’ domain includes those qualities of the student that represent their unique capacities and traits. More obvious examples of questions that relate to the learner include investigating whether a student has stable and high rates of inattention across different classrooms or evaluating the efficiency of a student’s study habits and test-taking skills. A less obvious example of a question that relates to the learner is whether a student harbors a low sense of self-efficacy in mathematics that is interfering with that learner’s willingness to put appropriate effort into math courses.

  37. The teacher collects several student math computation worksheet samples to document work completion and accuracy. • Data Source: Review • Focus Areas: Curriculum

  38. The student’s parent tells the teacher that her son’s reading grades and attitude toward reading dropped suddenly in Gr 4. • Data Source: Interview • Focus: Curriculum, Learner

  39. An observer monitors the student’s attention on an independent writing assignment—and later analyzes the work’s quality and completeness. • Data Sources: Observation, Review • Focus Areas: Curriculum, Environment, Learner

  40. A student is given a timed math worksheet to complete. She is then given another timed worksheet & offered a reward if she improves. • Data Sources: Review, Test • Focus Areas: Curriculum, Learner

  41. Comments from several past report cards describe the student as preferring to socialize rather than work during small-group activities. • Data Source: Review • Focus Areas: Environment

  42. The teacher tallies the number of redirects for an off-task student during discussion. She then designs a high-interest lesson and continues to track off-task behavior. • Data Sources: Observation, Test • Focus Areas: Instruction

  43. Activity: Use the RIOT/ICEL Framework • Review the RIOT/ICEL matrix. Take the student data that you brought to the workshop and organize it using the matrix. • Identify any areas in the matrix that have only limited information and should be investigated more fully.
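The bookkeeping in the activity above (sorting each data item into a cell of the 4 × 4 RIOT/ICEL matrix, then spotting cells with little or no information) can be sketched in code. The item descriptions below echo the earlier slide examples but are hypothetical, as is the `empty_cells` helper; this is an organizing illustration, not part of the workshop materials:

```python
# Hedged sketch: tally collected data items into RIOT x ICEL cells and
# flag the cells that have no supporting data yet.

RIOT = ["Review", "Interview", "Observation", "Test"]
ICEL = ["Instruction", "Curriculum", "Environment", "Learner"]

def empty_cells(items):
    """items: list of (description, sources, focuses) tuples.
    Returns the (source, focus) cells with no data item."""
    filled = set()
    for _desc, sources, focuses in items:
        for source in sources:
            for focus in focuses:
                filled.add((source, focus))
    return [(s, f) for s in RIOT for f in ICEL if (s, f) not in filled]

items = [
    ("Math worksheet samples", ["Review"], ["Curriculum"]),
    ("Parent interview on reading attitude", ["Interview"],
     ["Curriculum", "Learner"]),
    ("On-task observation of writing assignment",
     ["Observation", "Review"], ["Curriculum", "Environment", "Learner"]),
]
for cell in empty_cells(items):
    print(cell)  # cells that may need further investigation
```

With these three items, half of the sixteen cells remain empty (for example, nothing yet draws on the Test source), which is exactly the gap analysis the activity asks teams to perform by hand.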

  44. Defining Student Problem Behaviors: A Key to Identifying Effective Interventions Jim Wright www.interventioncentral.org

  45. Defining Problem Student Behaviors… • Define the problem behavior in clear, observable, measurable terms (Batsche et al., 2008; Upah, 2008). Write a clear description of the problem behavior. Avoid vague problem-identification statements such as “The student is disruptive.” A well-written problem definition should include three parts: • Conditions. The condition(s) under which the problem is likely to occur. • Problem Description. A specific description of the problem behavior. • Contextual Information. Information about the frequency, intensity, duration, or other dimension(s) of the behavior that provides a context for estimating the degree to which the behavior presents a problem in the setting(s) in which it occurs.
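As a minimal illustration of the three-part format above, the parts can be assembled into a single observable, measurable statement. The `define_problem` helper and the sample wording are hypothetical, invented for illustration rather than drawn from the workshop:

```python
# Hypothetical sketch: assembling a three-part problem definition
# (conditions, problem description, contextual information) into one
# statement. The example wording is invented.

def define_problem(conditions: str, description: str, context: str) -> str:
    """Join the three parts of a problem-behavior definition."""
    return f"{conditions}, {description}, {context}"

statement = define_problem(
    "During independent seatwork in math class",
    "the student leaves his seat without permission",
    "on average 4 times per 30-minute period",
)
print(statement)
```

Note how the result avoids the vague "The student is disruptive" pattern: it names the setting, the specific behavior, and a measurable frequency.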

  46. Defining Student Problem Behaviors: Team Activity • Five Steps in Understanding & Addressing Problem Behaviors: • Define the problem behavior in clear, observable, measurable terms. • Develop examples and non-examples of the problem behavior. • Write a behavior hypothesis statement. • Select a replacement behavior. • Write a prediction statement. • As a team, using the data that you brought on your student: • Step 1: Define the problem behavior in clear, observable, measurable terms.

  47. Defining Problem Student Behaviors… • Develop examples and non-examples of the problem behavior (Upah, 2008). Writing both examples and non-examples of the problem behavior helps to resolve uncertainty about when the student’s conduct should be classified as a problem behavior. Examples should include the most frequent or typical instances of the student problem behavior. Non-examples should include any behaviors that are acceptable conduct but might possibly be confused with the problem behavior.
