Assessing Response to Intervention
A collaborative project between the Florida Department of Education and the University of South Florida
FloridaRtI.usf.edu
Advance Organizer
• Team Activity: Review Day Four Intervention Plan
• Why Monitor Progress?
• Graphing Conventions
• Goal Setting
• Interpreting Graphs
• Decision Making
• Review of Problem-Solving Steps
Team Activity: Review Day 4 Intervention Plan
1. How did the intervention plan your team wrote on Day 4 differ from intervention plans typically written at your school? How was the plan similar?
2. For which components of the intervention plan did your team provide the most descriptive and specific details?
3. For which components of the intervention plan could your team have provided additional detail to clarify what needed to happen for implementation to occur?
4. How has the writing of intervention plans changed since your team attended the Day 4 training? How have the plans remained the same?
Beliefs Survey
Your project ID is:
• Last 4 digits of SS#
• Last 2 digits of year of birth
Response to Intervention in Context
Problem-solving cycle along the timeline: Identify the Problem → Analyze the Problem → Select/Design Intervention → Implement Intervention → Monitor Progress → Evaluate Intervention Effectiveness
Why Monitor Progress?
• Unless we monitor progress, we cannot determine the rate at which the gap is closing.
• Continuous feedback improves instructional planning (formative assessment).
• Allows earlier decisions about what to do: increase time, decrease time, revisit problem solving, etc.
• Provides a measure of intervention effects: getting better, staying the same, or getting worse.
• We don't know the effectiveness of an intervention until we implement it and monitor progress.
Randy Allison, 2004
Why Monitor Progress?
• Provides a clear idea of expectations for performance over time.
• Student outcomes improve when performance is assessed regularly.
• Allows visual comparison to a standard.
• Data collection provides an objective database for decision making.
• Reveals whether learning is being enabled, and if so, under what conditions.
• Continuous feedback on performance enhances motivation for many students.
Randy Allison, 2004
Outcome of Monitoring: Diagnose Conditions that Enable Learning
• When you identify the conditions under which a student's desired learning and behavior accelerate, you have determined what enables learning.
• If conditions are configured like X, Y, or Z, then what benefits result for the student?
[Chart: performance (low to high) over time under conditions X, Y, and Z]
Randy Allison, 2004
Why Not Just Use Pre-/Post-Test Comparisons?
• We must measure and demonstrate more than improvement; we must show the rate necessary to attain benchmarks within a time frame.
• May be unreliable because of the small amount of data collected.
• There may be a significant time lag between pre-test and post-test.
• Not sensitive to small changes in the direction of performance in a timely manner.
• Makes it more difficult to analyze patterns of performance.
Randy Allison, 2004
Basics of CBM - Scientifically Based Progress Monitoring • Research Efforts Led By Stanley Deno, Beginning in 1971 with Federal Funding in 1978 to Provide Viable Progress Monitoring toward IEP Goals • Almost 30 Years of Continuous Research • Mid 1990s Witnessed Move to Standard, High Quality Assessment Materials • A Number of Members of the CBM “Family” including DIBELS M. Shinn
Why CBM…or any Other Measure? • Quick and inexpensive • Linked to instruction and curriculum • Frequently repeatable • Sensitive to small increments of growth • Reliable and valid • Can be used for multiple purposes and to answer different questions Jenkins, Deno, and Mirkin
Academic Measures (CBM). From M. Shinn
Behavioral Measures • Office discipline reports • Behavioral incidents • Suspension/Detention records • Observations • Self-assessments • Surveys • Attendance data • Teacher checklists • Screening instruments • Rating scales
[Graph: Sam's digits correct per minute (y-axis, 1-12) across Days 1-8]
Why use graphs?
• Teachers are able to make sound decisions about the instruction being delivered to students based upon data, not guesswork.
• Parents are kept well informed about their child's progress, with specific information about how their child is responding to instruction. Parents may assist in making suggestions for instructional adjustments.
• Students know what is expected of them. They receive specific feedback about their performance along the way rather than only at the end of the marking period. Goal setting and progress monitoring are some of the most effective strategies to improve academic engaged time.
General Outcomes / Specific Skills
• The general outcome expectations for students comprise many requisite subskills. Both general outcomes and specific skills may be measured.
• What subskills would be necessary for:
• Long division?
• Fluent reading?
• Compliance with teacher direction?
Graph Components
• Baseline
• Intervention (group or individual)
• Instructional change line
• Aim line
• Trend line
• Goal
• y-axis: Skill, in equal increments
• x-axis: Time, in equal increments
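The trend line among these components is fit to the plotted data points. As a minimal sketch (not a tool from this training), an ordinary least-squares slope over equally spaced progress-monitoring scores can be computed like this; the function name is illustrative:

```python
def trend_slope(scores):
    """Least-squares slope of equally spaced progress-monitoring scores.

    scores: one measurement per time interval (e.g., per week).
    Returns the trend line's change in score per interval.
    """
    n = len(scores)
    xs = range(1, n + 1)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Four weekly probes improving by 2 digits correct each week:
print(trend_slope([10, 12, 14, 16]))  # 2.0
```

The slope of this fitted line is what gets compared against the aim line when interpreting the graph.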
Keep increments consistent!
[Graphs: the same 4 weeks of data plotted with dissimilar vs. similar x-axis increments]
• In both conditions, the rate of skill acquisition is 1 per week.
• Dissimilar x-axis increments give the impression that the learning rate increased during the second time interval.
• Similar x-axis increments reveal true progress over time.
Goal Setting
• Set goals that are ambitious but reasonable: the rate required to reach the goal is 25-50% above the typical student rate.
• A goal has two components: the level of performance desired, and the time within which that level will be attained.
Goal Setting • Measure difference between desired and current performance • Divide by number of weeks • Compare to standard to determine reasonable & ambitious growth rate (an increase of 25-50% of typical rate)
Goal Setting
rate = (Desired - Current) / Number of Weeks
Example: (110 wcm - 60 wcm) / 20 weeks = 50 wcm / 20 weeks = 2.5 wcm/week
2.0 wcm/week is typical. Compare to the rate for typical peers: is 2.5 wcm/week in the 25-50% range above typical?
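The worked example above can be expressed as a small helper; this is a sketch, and the function name is illustrative:

```python
def goal_rate(desired, current, weeks):
    """Weekly growth rate needed to move from current to desired performance."""
    return (desired - current) / weeks

# The slide's example: 110 wcm desired, 60 wcm current, 20 weeks.
rate = goal_rate(110, 60, 20)
print(rate)  # 2.5 (wcm/week), vs. a typical peer rate of 2.0 wcm/week
```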
Goal Setting
To determine an increase of 25-50%:
• For the lower end of the range, multiply the typical rate by 1.25.
• For the upper end of the range, multiply the typical rate by 1.50.
Examples:
• If the typical rate is 2.00 words correct per minute/week, the ambitious range is 2.50-3.00 words correct per minute/week.
• If the typical rate is 3.00 digits correct per minute/week, the ambitious range is 3.75-4.50 digits correct per minute/week.
Goal Setting
If:
• 2.0 letter sounds correct per minute/week is typical, what is the ambitious range?
• The goal is .75 digits correct per minute/week and .5 digits correct per minute/week is typical: is this in the ambitious range?
• The goal is 3.5 words correct per minute/week and 2.0 words correct per minute/week is typical: is this in the ambitious range?
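Using the 25-50% multipliers (1.25 and 1.50 times the typical rate), these practice items can be checked mechanically. This is a sketch; the function names are illustrative:

```python
def ambitious_range(typical_rate):
    """Ambitious goal range: 25-50% above the typical weekly growth rate."""
    return (typical_rate * 1.25, typical_rate * 1.50)

def in_ambitious_range(goal, typical_rate):
    """True if the goal falls within the ambitious range for this typical rate."""
    low, high = ambitious_range(typical_rate)
    return low <= goal <= high

print(ambitious_range(2.0))           # (2.5, 3.0)
print(in_ambitious_range(0.75, 0.5))  # True: 0.75 sits at the top of 0.625-0.75
print(in_ambitious_range(3.5, 2.0))   # False: 3.5 exceeds the 2.5-3.0 range
```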
Example Growth Rates (Fuchs & Fuchs, 1993)
Grade   Realistic (words/week)   Ambitious (words/week)
1       2.00                     3.00
2       1.50                     2.00
3       1.00                     1.50
4       0.90                     1.10
5       0.50                     0.80
6       0.30                     0.65
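For use in goal-setting calculations, the growth-rate table can be held as a simple lookup; this is an illustrative data structure, not part of the training materials:

```python
# (realistic, ambitious) weekly word-growth rates by grade (Fuchs & Fuchs, 1993)
GROWTH_RATES = {
    1: (2.00, 3.00),
    2: (1.50, 2.00),
    3: (1.00, 1.50),
    4: (0.90, 1.10),
    5: (0.50, 0.80),
    6: (0.30, 0.65),
}

realistic, ambitious = GROWTH_RATES[3]
print(realistic, ambitious)  # 1.0 1.5
```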
Attend to: • Level • Slope / Rate • Variability
Decision Rules: What Is a "Good" Response to Intervention?
• Positive response: the gap is closing; you can extrapolate the point at which the target student(s) will "come in range" of the target, even if this is long range.
• Questionable response: the rate at which the gap is widening slows considerably, but the gap is still widening; or the gap stops widening but closure does not occur.
• Poor response: the gap continues to widen with no change in rate.
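These decision rules are qualitative judgments about a graph. Purely as an illustration of the logic, one possible (unvalidated) operationalization compares how fast the gap between expected and observed performance changes early versus late in the intervention:

```python
def classify_response(gaps):
    """Rough classifier for gap measurements (expected minus observed) over time.

    A sketch of the decision rules only, not a validated procedure; assumes
    at least three equally spaced measurements:
    - positive: the gap is shrinking late in the series
    - questionable: still widening, but more slowly (or it has stalled)
    - poor: widening at an unchanged (or faster) rate
    """
    mid = len(gaps) // 2
    early, late = gaps[: mid + 1], gaps[mid:]
    early_rate = (early[-1] - early[0]) / (len(early) - 1)
    late_rate = (late[-1] - late[0]) / (len(late) - 1)
    if late_rate < 0:
        return "positive"
    if late_rate < early_rate or late_rate == 0:
        return "questionable"
    return "poor"

print(classify_response([10, 9, 8, 7]))     # positive: gap closing
print(classify_response([10, 12, 14, 15]))  # questionable: widening slowed
print(classify_response([10, 12, 14, 16]))  # poor: widening unchanged
```

In practice the team would inspect the graph directly; the code only makes the three categories concrete.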
Positive Response to Intervention
[Graph: observed performance trajectory rising toward the expected trajectory from Fall to Spring]
Questionable Response to Intervention
[Graph: observed performance trajectory rising more slowly than the expected trajectory from Fall to Spring, so the gap persists]
Poor Response to Intervention
[Graph: observed performance trajectory continuing to diverge from the expected trajectory from Fall to Spring]
Response to Intervention: Positive, Questionable, Poor
[Graph: positive, questionable, and poor observed trajectories compared with the expected trajectory over time]
Decisions: What to Do if RtI Is Positive
• Continue intervention with the current goal.
• Continue intervention with an increased goal.
• Fade intervention to determine whether student(s) have acquired functional independence.
Decisions: What to Do if RtI Is Questionable
Was the intervention implemented as intended?
• If no: employ strategies to increase implementation integrity.
• If yes: increase the intensity of the current intervention for a short period of time and assess impact. If the rate improves, continue. If the rate does not improve, return to problem solving.
Decisions: What to Do if RtI Is Poor
Was the intervention implemented as intended?
• If no: employ strategies to increase implementation integrity.
• If yes:
• Is the intervention aligned with the verified hypothesis? (Intervention Design)
• Are there other hypotheses to consider? (Problem Analysis)
• Was the problem identified correctly? (Problem Identification)
Intervention Integrity Decisions
An evidence-based intervention linked to the verified hypothesis is planned → implemented → student outcomes (SO) and treatment integrity (TI) are assessed → data-based decisions:
• +SO, +TI: continue intervention
• -SO, -TI: implement strategies to promote treatment integrity
• -SO, +TI: modify/change intervention
From Lisa Hagermoser Sanetti, 2008 NASP Convention
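The integrity decision matrix reduces to a lookup on the two assessment results. A sketch with illustrative labels; note the slide covers only three of the four possible combinations:

```python
# Keys: (student outcomes positive?, treatment integrity adequate?)
# Only the three combinations shown on the slide are mapped.
DECISIONS = {
    (True, True): "Continue intervention",
    (False, False): "Implement strategies to promote treatment integrity",
    (False, True): "Modify/change intervention",
}

# Outcomes poor despite adequate integrity -> the intervention itself changes:
print(DECISIONS[(False, True)])  # Modify/change intervention
```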
Response to Intervention in Context
Problem-solving cycle along the timeline: Identify the Problem → Analyze the Problem → Select/Design Intervention → Implement Intervention → Monitor Progress → Evaluate Intervention Effectiveness
Progress Monitoring Resources
• Interventioncentral.org
• Studentprogress.org
• Dibels.uoregon.edu
School-Level Data Review Worksheet
Your project ID is:
• Last 4 digits of SS#
• Last 2 digits of year of birth
Activity:
• Read the case study
• Answer six questions using the data provided
Review: Consensus, Infrastructure, Implementation