Progress Monitoring: No Data... No Answers
Of course we also already have: • Classroom assessments • End of unit tests • Homework completion info • Attendance records • Discipline referrals
We have tons of data – but that doesn’t mean we have information that informs. • Much of our data is not scientifically based and cannot be compared
Testing vs. Measurement: Both assess, but the difference lies in purpose • Testing judges (summative) • Measurement informs (formative)
Data That Judges vs. Data That Informs (school / basketball team analogy): • School: Data Set A (grades) judges and goes to the administration and parents/community; Data Set B (DIBELS) informs and goes to the teacher and the student. • Basketball team: Data Set "A" (won/lost record) judges and goes to the owner and the fans; Data Set "B" (team/individual statistics) informs and goes to the coach and the players.
Features of Effective Instruction • Use data that INFORMS for: • Grouping • Planning instruction • Delivering targeted instruction and intervention to address students’ instructional needs • Monitoring student progress toward grade-level standards/benchmarks
What makes it a Core/Basic Skill? • Predictive of later achievement • Something we can do something about…we can teach it • Something that improves outcomes for students if we teach it
Steps for Successful Readers (Roland Good): the progression runs Phonemic Awareness (Spring, Kdg) → Alphabetic Principle (Winter, 1st) → Fluency with Connected Text (Spring, 1st) → Fluency with Connected Text (Spring, 2nd) → Fluency with Connected Text (Spring, 3rd). At each step, students on track are likely to stay on track (probabilities of .81, n=196; .83, n=246; .86, n=138; .64, n=348), while students who fall behind rarely catch up (probabilities of .17, n=183; .06, n=213; .03, n=114; .22, n=180). We need to have the odds with us! The probability of remaining an average reader in fourth grade when an average reader in first grade is .87; the probability of remaining a poor reader at the end of fourth grade when a poor reader at the end of first grade is .88 (Juel, 1988).
For Data To Be Useful • Assessment must be • Reliable • Valid • Efficient
DIBELS Oral Reading • Student reads aloud for 1 minute from each of 3 separate reading passages • While student reads, examiner marks errors • Calculate the number of correctly-read words (CRW) per minute and number of errors • Median score is used as the student’s reading rate. • (there are also pre-reading measures)
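The scoring described above lends itself to a very small calculation. The sketch below is only an illustration of that arithmetic, not part of the DIBELS materials; the function name and the passage numbers are made up.

```python
from statistics import median

def orf_score(passage_results):
    """Score an oral reading fluency probe.

    passage_results: list of (words_attempted, errors) tuples,
    one per 1-minute passage (typically three passages).
    Returns the median words-read-correctly per minute.
    """
    correct_per_passage = [attempted - errors for attempted, errors in passage_results]
    return median(correct_per_passage)

# Example: three 1-minute passages read by one student (hypothetical numbers)
print(orf_score([(58, 4), (61, 2), (49, 5)]))  # median of 54, 59, 44 -> 54
```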
DIBELS is used: • To identify at-risk students who may need additional services • To help teachers plan more effective instruction within their classrooms • To help teachers design more effective instructional programs for students who don’t respond to the general education program • To document student progress for accountability purposes • To communicate with parents or other professionals about students’ progress
DIBELS • Current levels of performance are measured • Goals are identified • Progress is measured on a regular basis (weekly or monthly), and expected versus actual rates of learning are compared • Based on these measurements, teaching is adjusted as needed (see the sketch below).
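One way to picture the "expected versus actual rates of learning" comparison is as two slopes: the aimline slope implied by the baseline and goal, and the slope actually observed in weekly probes. The sketch below is a minimal illustration under that assumption; the student numbers and goal are hypothetical.

```python
def expected_rate(baseline, goal, weeks_to_goal):
    """Expected gain in words correct per week (the aimline slope)."""
    return (goal - baseline) / weeks_to_goal

def actual_rate(weekly_scores):
    """Actual gain per week, estimated with a simple least-squares slope."""
    n = len(weekly_scores)
    weeks = range(n)
    mean_week = sum(weeks) / n
    mean_score = sum(weekly_scores) / n
    num = sum((w - mean_week) * (s - mean_score) for w, s in zip(weeks, weekly_scores))
    den = sum((w - mean_week) ** 2 for w in weeks)
    return num / den

# Hypothetical 2nd grader: baseline 30 WCPM, goal of 68 WCPM in 19 weeks
expected = expected_rate(30, 68, 19)             # 2.0 words per week expected
actual = actual_rate([30, 31, 33, 32, 36, 37])   # ~1.4 words per week observed
if actual < expected:
    print("Actual growth is below the aimline; consider adjusting instruction.")
```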
Taking it a step further • Using assessment to develop interventions • Survey Assessments • Teaching students to use it for Peer Assisted Learning • and more…
Top-Down Processing • 1st Phase: School-wide and grade-level team level • 2nd Phase: Classroom or special group level • 3rd Phase: Individual student level
Data-Driven Instructional Decision-Making • Involves using assessment data to determine your school’s current status: • What’s working • What’s not working • How did different sub-groups (economically disadvantaged, racial and ethnic groups, students with disabilities or with limited English proficiency) score? • What actions are needed to improve classroom instruction and student outcomes?
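A rough sketch of the kind of sub-group breakdown described above: count how many students in each group reach the benchmark. The records, group labels, and cut score here are purely illustrative, not the school's actual categories or DIBELS benchmarks.

```python
from collections import defaultdict

# Hypothetical records: (student, subgroup, words correct per minute)
records = [
    ("A", "economically disadvantaged", 42),
    ("B", "limited English proficiency", 55),
    ("C", "students with disabilities", 31),
    ("D", "economically disadvantaged", 70),
]

BENCHMARK = 68  # illustrative cut score only

met_by_group = defaultdict(lambda: [0, 0])  # subgroup -> [met, total]
for _, subgroup, wcpm in records:
    met_by_group[subgroup][1] += 1
    if wcpm >= BENCHMARK:
        met_by_group[subgroup][0] += 1

for subgroup, (met, total) in met_by_group.items():
    print(f"{subgroup}: {met}/{total} at benchmark ({100 * met / total:.0f}%)")
```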
Do you know where you’ve been? Do you know where you’re going?
Grade Level Analysis • This should be accomplished through grade level meetings • Teachers and staff need time to look at the data and make decisions • It helps to have a facilitator and an agenda • Focus on the data
Questions to Ask: What percentage of students will be at benchmark at the next school-wide assessment? What will you do to be sure all students are instructed at their level?
2nd Grade Mid Year 2006-2007 71% = Low Risk (31 students) 13% = Some Risk (7 students) 15% = At Risk (8 students)
Grouping Form
• Classroom #1: Intensive: Randy 8, Josh 10, Paul 11; Strategic: Marsha 30, Carrie 30, Joey 31, Ross 49, Betsy 50; Benchmark: 21 students, scores ranging from 74 to 152 words read correctly
• Classroom #2: Intensive: NONE!; Strategic: Lizzy 54, Travis 55, Mandy 59, Greg 64, Henry 64, Jarod 65, Nakia 67; Benchmark: 16 students, scores ranging from 74 to 176 words read correctly
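A grouping form like this amounts to sorting students into Intensive, Strategic, and Benchmark bands by cut score. The sketch below shows that idea only; the cut points and the pooled score list are placeholders, not the published DIBELS benchmarks or the actual class rosters.

```python
def group_students(scores, intensive_below=26, benchmark_at=68):
    """Split {name: words correct} into instructional groups by cut score.

    The cut points are placeholders; real grouping would use the published
    benchmarks for the grade and time of year.
    """
    groups = {"Intensive": [], "Strategic": [], "Benchmark": []}
    for name, score in sorted(scores.items(), key=lambda item: item[1]):
        if score < intensive_below:
            groups["Intensive"].append((name, score))
        elif score < benchmark_at:
            groups["Strategic"].append((name, score))
        else:
            groups["Benchmark"].append((name, score))
    return groups

# Scores pooled across both classrooms, as a grade-level team would do
print(group_students({"Randy": 8, "Marsha": 30, "Ross": 49, "Nakia": 67, "Kim": 90}))
```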
Teachers determine needs • Interventions are chosen • Additional problem-solving happens as needed ** Remember – We’re looking across the grade level. How can we combine kids and combine our effectiveness?
2nd Grade Problem Solving • Benchmark – Core Program • Strategic and Higher-Level Intensive Students • Teacher-Directed PALS • Read Naturally • Intensive Students • SIPPS with 1st grade students • Reading Mastery • Read Well
2nd Grade Problem Solving • Classroom #2 • Additional Paraprofessional Time • Additional Behavior Intervention time from Social Worker
Need to Watch the Progress • Teachers discuss at monthly grade-level meetings what is working and what is not • Return to the data after each benchmarking and make decisions
Beginning of the Year Data: Benchmark 38% (n=22), Some Risk 22% (n=13), At Risk 40% (n=23). Middle of the Year Data: Benchmark 56% (n=30), Some Risk 26% (n=15), At Risk 22% (n=13).
Top-Down Processing • 1st Phase: School-wide and grade-level team level • 2nd Phase: Classroom or special group level • 3rd Phase: Individual student level
Classroom or Special Group Analysis • Are there certain student groups that are not making progress? • Is there a certain tier or a certain population that is not making gains?
Individual Student Level Intervention • Intensive Individual Interventions: individual students; assessment-based; high intensity; of longer duration • Targeted Group Interventions: some students (at risk); high efficiency; rapid response • Universal Programming: all students; preventative; proactive. Progress Monitoring: Intensive, bi-weekly; Strategic, monthly; Benchmarking, 3 times a year.
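The monitoring schedule above is essentially a lookup from tier to probe frequency. The tiny sketch below expresses it that way purely as an illustration; the date arithmetic and interval choices are assumptions, not a formal calendar.

```python
from datetime import date, timedelta

# Monitoring interval by tier, following the schedule above
MONITORING_INTERVAL = {
    "Intensive": timedelta(weeks=2),   # bi-weekly
    "Strategic": timedelta(weeks=4),   # monthly (approximate)
    "Benchmark": None,                 # benchmarked 3 times a year instead
}

def next_probe(tier, last_probe):
    """Return the date of the next probe, or defer to benchmarking."""
    interval = MONITORING_INTERVAL[tier]
    return last_probe + interval if interval else "next school-wide benchmarking"

print(next_probe("Intensive", date(2007, 1, 15)))  # 2007-01-29
```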
Core reading skills and the measures that assess them: • Phonological Awareness: Initial Sound Fluency, Phoneme Segmentation Fluency • Alphabetic Principle: Nonsense Word Fluency • Accuracy and Fluency with Connected Text: Oral Reading Fluency • Vocabulary and Language Development and Reading Comprehension: ORF, teacher-made assessments/observations
What kind of progress can we really expect? Can we do even better than this?
How? • Use research-based programs • Set ambitious goals • Track progress • Make changes when needed
Benefits of Progress Monitoring • Clear visual representation of progress. • Common understanding between teachers, parents, psychologists, administrators. • Students can track/follow their own progress ** Increases Communication ** • Evaluates success of programs
Best Practice • Monitor students at grade level as often as possible • Monitor out of grade when you need better information for decision making
General Guidelines • Students should be at about the 20th percentile to be monitored at that level • Once students begin to reach goal for that grade level, move up • When moving up to the next level, get 2-3 data points at both levels so you can continue to watch the student’s trend.
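One reading of these guidelines is a simple decision rule: monitor at the highest level where the student is at roughly the 20th percentile or better, and collect 2-3 overlapping data points before moving up. The sketch below illustrates that interpretation only; the percentile threshold and data shape are assumptions, not a formal protocol.

```python
def pick_monitoring_level(percentile_by_grade, enrolled_grade):
    """Pick the highest grade level where the student scores at about the
    20th percentile or better on that level's material."""
    for grade in range(enrolled_grade, 0, -1):  # start at grade level, step down
        if percentile_by_grade.get(grade, 0) >= 20:
            return grade
    return 1  # fall back to the lowest level

# Hypothetical 3rd grader: 5th percentile on grade 3 material, 24th on grade 2
print(pick_monitoring_level({3: 5, 2: 24, 1: 60}, enrolled_grade=3))  # -> 2
# When this student approaches the grade 2 goal, collect 2-3 probes at both
# grade 2 and grade 3 before moving monitoring up, so the trend stays visible.
```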
http://brt.uoregon.edu/techreports/ORF_90Yrs_Intro_TechRpt33.pdf