Nuts and Bolts of Progress Monitoring


  1. Nuts and Bolts of Progress Monitoring SST3 May, 2014

  2. Overview • Progress Monitoring Assessment: • What? • Why? • Sample PM tools • Graphing Data • Data-Based Decisions • Setting Goals • Performance Level • Rate of Learning • Linking cases to RTI processes

  3. Assessment in an RTI model Benchmarking • To screen and identify students who are at risk and in need of interventions • All students • Three times a year • All areas • At grade level Progress Monitoring • To monitor progress of individual students and determine rate of improvement and need for adaptation of intervention • Students who are not achieving benchmarks (PLP, IEP) • Weekly, biweekly, or monthly assessments • In area of need • At instructional level

  4. What assessments do you use?

  5. Progress Monitoring Benefits of Progress Monitoring • Parents and students know what is expected • Teachers know, based on data, what is and is not working in their instruction • Easy-to-understand way to show parents progress • Teams have comprehensive data on student performance for decision making

  6. Curriculum Based Measures - CBM • Are assessments to monitor progress • Are designed to serve as “indicators” of general achievement. CBM doesn’t measure everything, but measures the important things. • Are standardized tests to be given, scored, and interpreted in a standard way. • Are researched with respect to psychometric properties to ensure accurate measures of learning. • Are sensitive to improvement in short periods of time. • Are designed as brief measures that can be administered frequently. • Are linked to decision making for promoting positive achievement and Problem-Solving

  7. Tools National Center on Student Progress Monitoring www.studentprogress.org

  8. Samples of CBMs • Reading • Math • Writing • Spelling

  9. MAZE - CBM AIMSweb Reading Comprehension Measure www.aimsweb.com

  10. DIBELS Phoneme Segmentation Fluency https://dibels.uoregon.edu

  11. MATH COMPUTATION Taken from Fuchs, L. S., Hamlett, C. A., & Fuchs, D. (1998). Monitoring Basic Skills Progress: Basic Math Computation (2nd ed.) [computer program]. Austin, TX: ProEd. Available from http://www.proedinc.com

  12. Concepts and Applications Sample page from a three-page test for Grade 2 Math Concepts and Applications • From Monitoring Basic Skills Progress

  13. CBM - Writing Total Words Written Correct Word Sequences Words Correctly Spelled www.interventioncentral.org

  14. Spelling AIMSweb Spelling Probes

  15. Graphing • Graphing is an essential part of PM • Without graphic displays, the decision making process is difficult • Teacher graphing vs. Student graphing

  16. Graphing Data How to Develop Graphs: VW Beetle vs. SUV vs. Race Car • Hand Graphing • Excel and Chart Dog • Web-based data systems

  17. Hand Graphing • Establish Baseline (Median score) • Set up graph • Set Goal • Draw Aimline • Measure Student Progress • Plot Student Performance • Connect Indicators of Student Performance • Analyze Student Performance • Make Instructional Changes • Continue to Measure and Monitor Student Performance

  18. Hand Graphing

  19. Hand Graphing [Graph: Number of Words Read Correctly (y-axis, approximately 30-50) plotted across Testing Sessions (x-axis: Baseline, Sessions 1-8)]
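
A minimal scripted alternative to hand graphing, for teams that prefer automation. This is a sketch assuming Python with matplotlib installed; the baseline probes, weekly scores, and goal below are illustrative, not taken from the slide.

```python
# Illustrative progress-monitoring graph: baseline median, aimline, weekly scores.
import statistics
import matplotlib.pyplot as plt

baseline = [40, 42, 41]                 # three baseline probes (words read correctly)
scores = [43, 45, 44, 47, 49, 50]       # weekly progress-monitoring scores
goal = 60                               # end-of-period goal (WRC)
weeks = len(scores)

baseline_median = statistics.median(baseline)

# Aimline: a straight line from the baseline median to the goal.
plt.plot(range(1, weeks + 1), scores, marker="o", label="Student scores")
plt.plot([0, weeks], [baseline_median, goal], linestyle="--", label="Aimline")
plt.xlabel("Testing session (week)")
plt.ylabel("Words read correctly")
plt.title("Progress monitoring graph")
plt.legend()
plt.show()
```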

  20. Hand Graphing Advantages • Easy to do • No technology required • Students can easily maintain their own graphs • Can be done immediately • Free Disadvantages • Added paper • Organization required • No long-term storage • Not automatic

  21. Excel Templates Graphing made easy: Practical tools for school psychologists http://www.oswego.edu/~mcdougal/web_site_4_11_2005/index.html • Academic Monitoring • Behavior Monitoring

  22. Chart Dog www.interventioncentral.org

  23. Excel Templates and Chart Dog Advantages • Automatic • Storage capability built in • Easy to do • Clear displays of data • Free if you already have Excel; Chart Dog is free Disadvantages • Requires technology • Time to enter data • Students may not be able to do data entry themselves • Requires some understanding of Excel or Chart Dog

  24. Web Based System

  25. Web-Based System Advantages • Web-based data entry from anywhere • Storage capability built in • Trend line drawn automatically • Can annotate graphs with interventions/goals • Norms – benchmarks and rate of improvement • Lots of flexibility • Email graphs Disadvantages • Requires technology • Cost • Students may not be able to do data entry themselves • Requires some training

  26. To do this will take new learning for everyone

  27. Data-Based Decisions • Performance Level • Gaps in Performance • Below Grade Level • Special Education Significant Discrepancy • Rate of Learning • Trend in performance (slope) • Response to Instruction • General Direction, Rate of Change

  28. Performance Level: Gap/ Discrepancy • Be objective. • Does it refer to an observable/measurable characteristic of behavior? • Use numbers to define the discrepancy. • Percentile rank • Discrepancy Ratios • Cut scores • Norms

  29. Norms… What to use? Local, National • Local norms can be helpful to determine local performance levels and rates of progress • Time consuming and costly to develop • National norms and research norms are available. BUT….

  30. Percentile Ranks 1. Performance Level • Requires a larger normative data base, preferably benchmark data • Below the 25th percentile: at risk; consider problem solving at the group level • Below the 10th percentile: potential severe problem; consider individual problem solving
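
The two cut points above amount to a simple decision rule. A minimal sketch assuming Python; the function name and example percentile ranks are illustrative.

```python
def percentile_decision(percentile_rank: float) -> str:
    """Map a benchmark percentile rank to the decision categories on slide 30."""
    if percentile_rank < 10:
        return "Potential severe problem: consider individual problem solving"
    if percentile_rank < 25:
        return "At risk: consider problem solving at the group level"
    return "Performance level not a concern at this time"

print(percentile_decision(8))   # individual problem solving
print(percentile_decision(18))  # group-level problem solving
```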

  31. Discrepancy Ratios Performance Level • Sample 5-7 students or the whole class/grade • Find the median and graph it • Divide the median by 2 and graph that line • Students whose performance falls below the line may need problem solving

  32. Can Compute… Discrepancy = Peer Median ÷ Target Student Median = 145 ÷ 40 = 3.6x discrepant
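
The same computation can be scripted. A minimal sketch assuming Python; the peer sample is illustrative and chosen so its median matches the 145 on the slide.

```python
import statistics

peer_scores = [132, 140, 145, 150, 158]   # sample of 5 peers (illustrative)
target_score = 40                          # target student's median score

peer_median = statistics.median(peer_scores)      # 145
cut_line = peer_median / 2                        # "divide by 2" line from slide 31
discrepancy = peer_median / target_score          # 145 / 40

print(f"Peer median: {peer_median}, cut line: {cut_line}")
print(f"Discrepancy: {discrepancy:.1f}x")         # 3.6x
```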

  33. Performance Level Cut Scores • A number which represents the point at which scores can be divided into different groups for decision-making purposes. (E.g. does not meet, meets, and exceeds expectations) • May be based on research (e.g., a correlation between scoring at or above a certain level on a CBM or DIBELS task and future academic success) or expectation (e.g., grades at C or above, no more than 3 office referrals).

  34. Data-based decisions on performance level • Enables the team to make decisions about levels of support and resources from the start Generally speaking… • A student who is 1.5x discrepant from his/her peers may benefit from intensive group interventions. • For a student who is 2-2.5x discrepant from his/her peers, individualized problem-solving is appropriate and intensive intervention resources may be warranted. Example: Jessica is 2.1x discrepant from peers on the Math CBM and may benefit from intensive interventions in math.
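
Those rough thresholds can also be written as a small decision function. A sketch assuming Python; the cutoffs (1.5x and 2.0x) are a simplification of the slide's guidance, and the function name is illustrative.

```python
def intervention_suggestion(discrepancy: float) -> str:
    """Suggest a level of support from a discrepancy ratio (thresholds per slide 34)."""
    if discrepancy >= 2.0:
        return "Individualized problem-solving; intensive intervention resources"
    if discrepancy >= 1.5:
        return "Intensive group intervention"
    return "Continue core instruction and keep monitoring"

print(intervention_suggestion(2.1))  # Jessica's example: individualized problem-solving
```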

  35. Rebecca 2nd grader • List all areas of concern: • Off-task behavior • Reading difficulties • Poor handwriting • Identify primary area of concern and define it in observable and measurable terms: • Reading • Definition: number of words read correctly when reading a grade-level passage orally • Collect baseline data on primary area of concern and state discrepancy statement: • Baseline data collected from CBM reading probes • Discrepancy Statement: Rebecca reads 41 WRC per minute in Fall of 2nd grade while her peers read ____ WRC per minute ___________

  36. Rob 7th grader • List all areas of concern: • Calling out • Lack of homework completion • Poor handwriting • Identify primary area of concern and define it in observable and measurable terms: • Work Completion • Definition: Turning in teacher-assigned work at the beginning of the class period on the day it is due. • Collect baseline data on primary area of concern and state discrepancy statement: • Baseline data collected by reviewing teacher grade books • Discrepancy Statement: Rob currently turns in homework 54% of the time while his peers turn in homework 86% of the time (_____ discrepant)

  37. Data-Based Decisions • Performance Level • Gaps in Performance • PLP Not at Grade Level • Special Education Significant Discrepancy • Rate of Learning • Trend in performance (slope) • Response to Instruction • General Direction, Rate of Change

  38. 2. Rate of Learning • Why? • Determine when what we are doing isn’t working and intervene early • Better able to predict student success at meeting goals • Better able to identify who needs more intensive instruction

  39. Rate of Learning Tracking Student Outcomes Using Initial Performance Discrepancies

  40. Rate of Learning • Analyzing Rate using PM Data • Rules: • Setting Goals • Data Point Rules • Trend Line Rules • Slope

  41. Setting Goals 1. End-of-Year Benchmarks • GLEs for Reading Fluency (2nd grade 80-100 WPM, 5th grade 125-150) • AIMSweb Math Computation Norms (1st grade 17 DPM, 5th grade 52 DPM) 2. National Norms for Improvement • Math Calculations (>.3 DPM for 2nd and 3rd grade, >.5 DPM for 4th-6th grade) (Fuchs, 2006) • Reading Fluency (Deno, 2005)
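
For the norm-based methods above, the end-of-year goal is simply the current score plus the expected gain per week times the weeks remaining. A minimal sketch assuming Python and treating the Fuchs (2006) figures as weekly rates of improvement; the current score and weeks remaining are illustrative.

```python
current_score = 20       # current median digits correct per minute (illustrative)
weekly_roi_norm = 0.5    # expected weekly gain in DPM for grades 4-6 (Fuchs, 2006)
weeks_remaining = 24     # weeks left in the school year (illustrative)

end_of_year_goal = current_score + weekly_roi_norm * weeks_remaining
print(f"End-of-year goal: {end_of_year_goal:.0f} digits correct per minute")  # 32
```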

  42. Setting Goals 3. Individual ROI • Weekly rate of improvement, the “baseline slope,” calculated from 8 data points (Slope = (highest - lowest) / number of weeks) • Slope multiplied by 1.5 • Product multiplied by number of weeks until end of year • Add to student’s final baseline score to produce end-of-year goal. Baseline reading scores: 52, 54, 52, 53, 55, 58, 55, 56 Difference: 58 - 52 = 6 Divide by number of weeks: 6/8 = 0.75 (SLOPE) Slope multiplied by 1.5: 0.75 × 1.5 = 1.125 Number of weeks left (6 weeks): 1.125 × 6 = 6.75 Add to final baseline score: 56 + 6.75 = 62.75 End-of-year goal: 63
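
The same individual-ROI calculation, scripted. A minimal sketch assuming Python, using the corrected numbers from the worked example above.

```python
import math

baseline = [52, 54, 52, 53, 55, 58, 55, 56]   # eight baseline reading scores (WRC)
weeks_of_baseline = 8
weeks_remaining = 6

slope = (max(baseline) - min(baseline)) / weeks_of_baseline   # (58 - 52) / 8 = 0.75
ambitious_slope = slope * 1.5                                  # 1.125
expected_gain = ambitious_slope * weeks_remaining              # 6.75
goal = baseline[-1] + expected_gain                            # 56 + 6.75 = 62.75

print(f"End-of-year goal: {math.ceil(goal)}")                  # 63
```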

  43. Decisions based on data-points Decisions are based on at least 4 data points • If all 4 scores fall above goal-line, responding to instruction (increase goal if continues for 4 more data points) • If scores are hovering about the goal line, continue what you are doing. • If all 4 scores are below goal-line, but parallel, decide to “wait” for 4 more points to see if student performance accelerates in level to reach original goal. • If all 4 scores fall below goal-line, not responding to instruction, revise plan and implement different teaching strategy. • Mark change on graph with vertical line. Derived from: Fuchs and Fuchs (2006) and Shapiro (2006)
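
A minimal sketch of the four-data-point rule assuming Python; the helper name, sample scores, and aimline values are illustrative.

```python
def four_point_decision(scores, aimline):
    """Compare the four most recent scores to the aimline values for the same
    sessions (decision rules per slide 43)."""
    last4, aim4 = scores[-4:], aimline[-4:]
    if all(s > a for s, a in zip(last4, aim4)):
        return "All four above the aimline: responding; consider raising the goal"
    if all(s < a for s, a in zip(last4, aim4)):
        return "All four below the aimline: revise the instructional plan"
    return "Scores hovering around the aimline: continue current instruction"

scores  = [44, 46, 47, 48, 50, 52]   # weekly scores (illustrative)
aimline = [43, 44, 45, 46, 47, 48]   # aimline value for each week (illustrative)
print(four_point_decision(scores, aimline))  # all four above -> consider raising the goal
```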

  44. Decisions based on trend lines Trend lines based on 6-8 data-points • If trend line is steeper than goal line, increase the goal. • If trend line is flatter than goal line, revise instruction • If trend line equals goal line, make no change at this time.
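
Trend-line decisions need a slope for the most recent 6-8 data points; an ordinary least-squares slope is one common way to get it (the slide does not prescribe a method). A minimal sketch assuming Python; the sample scores and goal-line slope are illustrative.

```python
import statistics

def ols_slope(scores):
    """Ordinary least-squares slope of weekly scores (weeks numbered 1..n)."""
    weeks = range(1, len(scores) + 1)
    mean_x, mean_y = statistics.mean(weeks), statistics.mean(scores)
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, scores))
    den = sum((x - mean_x) ** 2 for x in weeks)
    return num / den

def trend_decision(scores, goal_slope):
    trend = ols_slope(scores)
    if trend > goal_slope:
        return f"Trend ({trend:.2f}/week) steeper than goal line: raise the goal"
    if trend < goal_slope:
        return f"Trend ({trend:.2f}/week) flatter than goal line: revise instruction"
    return "Trend matches the goal line: make no change at this time"

print(trend_decision([40, 42, 41, 45, 47, 48, 50, 52], goal_slope=1.0))
```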

  45. Data-based decisions on slope • Rate of Improvement = slope (b in the regression equation) • Y = (slope × X) + intercept • Consider discrepancy from ROI norms
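
One way to quantify "discrepancy from ROI norms" is the ratio of the expected slope to the student's observed slope. A minimal sketch assuming Python; both values are illustrative.

```python
student_slope = 0.45   # student's observed weekly gain (e.g., from the trend line)
norm_slope = 0.75      # expected weekly gain from a normative or benchmark source

roi_discrepancy = norm_slope / student_slope
print(f"ROI discrepancy: {roi_discrepancy:.1f}x")   # 1.7x slower growth than expected
```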
