Using Data for Decisions: Points to Ponder
Different Types of Assessments • Measures of Academic Progress (MAP) • Guided Reading (Leveled Reading) • Statewide Accountability Tests • Published Curriculum Tests • Teacher-made tests • General Outcome Measures
General Outcome Measurement • General Outcome Measures (a.k.a. Curriculum-Based Measures): a simple set of procedures for repeated measurement of student growth toward long-range instructional goals (Deno, 1985)
Examples of Dynamic Indicators in Other Areas • Children’s Health • Thermometer • Height and weight charts • Economy • Consumer Price Index • Dow Jones Industrial Average • Import-Export Balance
GOM Characteristics • Standardized • Brief, Efficient, Simple • Valid, Reliable INDICATORS of reading, writing, and math • Powerful – proven utility and sensitivity for screening and progress monitoring
General Outcome Measures are DIBS: Dynamic Indicators of Basic Skills • Dynamic - Sensitive to change over time • Indicators - Representatives of skills, but do not measure all aspects or applications of the skill domain • Basic Skills - Measures correspond to the domains of reading, math, spelling, and written expression
General Outcome Measurement Testing Schedules The St. Croix River Education District (SCRED) uses GOMs on two different schedules for different students: • Benchmark testing for all students three times per year (Fall, Winter, Spring) • Progress monitoring for students of concern (monthly, bi-weekly, or weekly)
Benchmark Testing (Screening) Benchmark Testing is used: • for all students in all schools • to monitor progress of all students • to establish school norms • to call attention to students having difficulty • to evaluate instructional programs
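As a rough illustration of how benchmark data can call attention to students having difficulty, the sketch below compares each student's score to a seasonal benchmark target and flags those who fall below it. The targets, scores, and student names are hypothetical examples for one measure and grade level, not SCRED's actual norms or cut scores.

```python
# Minimal sketch: flag students scoring below a seasonal benchmark target.
# The targets below (words correct per minute) are hypothetical examples.
BENCHMARK_TARGETS = {"Fall": 70, "Winter": 90, "Spring": 110}

def flag_students_of_concern(scores, season):
    """Return names of students scoring below the target for the given season.

    scores: dict mapping student name -> words correct per minute.
    season: "Fall", "Winter", or "Spring".
    """
    target = BENCHMARK_TARGETS[season]
    return [name for name, wcpm in scores.items() if wcpm < target]

# Example: a hypothetical fall screening for a small class.
fall_scores = {"Student A": 82, "Student B": 64, "Student C": 75, "Student D": 58}
print(flag_students_of_concern(fall_scores, "Fall"))  # ['Student B', 'Student D']
```

The same screening data, aggregated across classrooms, is what supports the other benchmark purposes listed above: building school norms and evaluating instructional programs.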
Frequent Monitoring We do NOT KNOW ahead of time whether an intervention will be successful for an individual student. Do they assume in the hospital that your heart is working just fine after your bypass surgery? After all… the surgery works well for MOST patients.
Frequent Monitoring Frequent Monitoring is used: • for students of concern, i.e., students who are below target • to provide a basis for evaluating instructional programming for individual students as the instruction is occurring • to provide information to help teachers make decisions about goals, materials, levels, and groups • to aid in communication with parents • to document progress for IEP students, as required for periodic and annual reviews
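To make the "evaluation as instruction is occurring" idea concrete, the sketch below shows one common way to read frequent-monitoring data for an individual student: fit a trend line to the repeated scores and compare its slope to the aim line drawn from the baseline score to the goal. The scores, goal, time frame, and decision rule here are hypothetical illustrations, not a prescribed SCRED procedure.

```python
# Minimal sketch: compare a student's progress-monitoring trend to the aim line.
# All data values below are hypothetical.

def trend_slope(scores):
    """Ordinary least-squares slope of scores over equally spaced weeks."""
    n = len(scores)
    weeks = range(n)
    mean_x = sum(weeks) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, scores))
    den = sum((x - mean_x) ** 2 for x in weeks)
    return num / den

# Weekly words-correct-per-minute scores for one student (hypothetical).
weekly_scores = [42, 44, 43, 47, 49, 48, 52, 53]

baseline = weekly_scores[0]
goal = 70             # hypothetical long-range goal
weeks_to_goal = 30    # hypothetical instructional weeks remaining

aim_line_slope = (goal - baseline) / weeks_to_goal   # growth needed per week
actual_slope = trend_slope(weekly_scores)            # growth observed per week

if actual_slope >= aim_line_slope:
    print("Trend meets or exceeds the aim line: continue the intervention.")
else:
    print("Trend is below the aim line: consider changing the intervention.")
```

Graphed the same way teachers would see it at a grade-level team meeting, the trend line and aim line make the "is this intervention working?" question answerable with data rather than impressions.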
Where to Find Evidence Based Interventions • The What Works Clearinghouse (http://www.w-w-c.org/) • The Promising Practices Network (http://www.promisingpractices.net/) • Blueprints for Violence Prevention (http://www.colorado.edu/cspv/blueprints/index.html) • The International Campbell Collaboration (http://www.campbellcollaboration.org/Fralibrary.html) • Social Programs That Work (http://www.exce.gov.org/displayContent.asp?Keyword+prppcSocial)
Using Data to Inform Practice • Students make more academic progress when teachers regularly collect formative data and use it to guide instruction.
Organization: Supporting Structures Five building-level supporting structures promote Problem Solving and optimal student achievement: • Continuous Measurement • Grade-level Team Meetings • Flexible Grouping • Grade-level Scheduling • Concentrated Resources
Grade Level Team Meetings • Teams hold meetings at least monthly • Together, teachers view graphs of all students of concern • Teams make decisions about resources • Teams make decisions about interventions • Teachers use the Problem Solving process
Problem Solving Teams are Different from Common SIT/SAT/TAT/SST Teams • Membership is reflective of all school staff • General and special education are represented • Problems are specifically defined in observable, measurable terms using technically adequate measurement systems • Efforts are made to assess why the problem is occurring • Only reasons that we have the power to do something about are given time for consideration • Observable, measurable goals are written for each problem to be addressed, and progress monitoring data is collected and graphed for every goal • Intervention plans are explicitly documented, and intervention integrity is assured through direct observation • Student progress is evaluated based on data
Big Ideas: Problem Solving Teams • Teams guided by a problem solving model • Assessment is based on what question is being asked at each step of the model • Assessment is linked to intervention • Teams are well-balanced and not perceived as “special education” hoop-jumping teams.