Data-Based Decision Making projectDATA Assessment Module
Agenda • Project map • Data-based Decision Making • Stiggins (2006) article • Decision Rules • Review AIMSweb data • Intervention overview • Intervention tracker • Closing activities • Questions?
Decisions from Assessments • Demand for varied assessments to support all students as lifelong learners: • Criterion-referenced assessments in addition to norm-referenced assessments • Balance of summative with formative assessments • Balance of large-scale and classroom-based assessments • Assessment should be linked to a purpose • The purpose varies by user and by the questions to be answered • No single assessment is capable of meeting the information needs of all of these various users. A productive, multi-level assessment system is needed to be sure that all instructional decisions are informed and made well (p. 12). Stiggins (2006)
Decision Rules for Progress Monitoring • To determine if students are making adequate progress, consider the following… • Is the student improving at the expected rate? • Are interventions needed to support the student in reaching the goal? • Has the student had enough exposure to the intervention to demonstrate success? • Should instruction or the intervention be modified (i.e., using the intervention tracker)?
Graphed Data Rules • Allow 4-5 data points to orient the student to a new instructional program • Allow an additional 2-3 data points to examine the efficacy of instruction • Intervene after 3-4 data points in a downward or flat trend • The intervention/instructional change should support the individual or group need
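As a minimal sketch of how the flat/downward trend rule above could be applied to weekly scores: the function name, window size, and minimum-gain threshold below are illustrative assumptions, not part of AIMSweb or the projectDATA materials.

```python
def needs_intervention(scores, window=4, min_gain=0.5):
    """Return True if the most recent `window` scores show a flat or downward trend.

    min_gain is an assumed minimum acceptable week-to-week gain;
    adjust it to the measure and the student's goal.
    """
    if len(scores) < window:
        return False  # too few data points; student is still orienting to the program
    recent = scores[-window:]
    # Average week-to-week change across the window; a flat or near-zero
    # average change means the trend rule has been met.
    avg_change = (recent[-1] - recent[0]) / (window - 1)
    return avg_change < min_gain


# Example: four essentially flat scores flag the intervention for review
print(needs_intervention([22, 23, 22, 22]))   # True
print(needs_intervention([20, 23, 26, 29]))   # False
```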
Using the 3-Point Rule Examine the slope of the trendline and the number of data points above/below aimline.
Data Decision Rules: Adequate Progress The student is exceeding the goal if... Three (3) consecutive data points are above the aimline: Consider increasing the goal. Callender & Smith (2006)
Data Decision Rules: Stay the Course! The student is making adequate progress towards the goal if... The data points align with the aimline. Continue with current practice. Callender & Smith (2006)
Data Decision Rules: Inadequate Progress The student may not be making adequate progress towards the goal if... Three (3) consecutive data points are below the aimline: Intervene to address student needs. Callender & Smith (2006)
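A minimal sketch of the 3-point rule described in the three slides above, assuming the weekly scores and the corresponding aimline values are available as plain lists; the function name and return labels are illustrative, not an AIMSweb feature.

```python
def three_point_rule(scores, aimline):
    """Compare the last three scores to the aimline values for the same weeks."""
    if len(scores) < 3 or len(aimline) < 3:
        return "stay the course"  # not enough data points yet
    recent = list(zip(scores[-3:], aimline[-3:]))
    if all(score > aim for score, aim in recent):
        return "raise goal"       # 3 consecutive points above the aimline
    if all(score < aim for score, aim in recent):
        return "intervene"        # 3 consecutive points below the aimline
    return "stay the course"      # points track or straddle the aimline


# Example: three scores falling below an aimline that climbs one point per week
print(three_point_rule([18, 18, 19], [20, 21, 22]))   # 'intervene'
```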
Things to Consider • Focus on the question: • “Will the individual reach his/her goal by the end of the goal period?” • Decide to change the intervention whenever the rate of progress falls below the expectation • Use the 3-point rule • Changes to the goal/instruction are fine-tuning rather than major adjustments • Think about alterable variables Callender & Smith (2006)
AIMSweb Login https://aimsweb.edformation.com Enter Customer ID: 10216 Username: Password:
AIMSweb – Progress Monitor (screenshot: navigation tabs, your user name, and the student list) Click the link under ‘Progress Report’ to view a student’s progress graph
AIMSweb – Return to Roster Click ‘Back’ to return to the student roster. Click ‘PDF’ to generate a printable document.
AIMSweb – Progress Monitor (screenshot: navigation tabs and the student list) • To generate a class set of student progress graphs: • Click the box to the right of ‘Progress Report’ to select all students. • Scroll to the bottom of the page, and click ‘View Selected’
Review Class Data • What do you notice about student graphs? • Are students making progress? • Do you see any trends in performance? • Which students might you want to monitor more closely?
AIMSweb – Return to Roster Click ‘Back’ to return to the student roster. Click ‘PDF’ to generate printable documents. Each student graph and goal/score summary is 2 pages long.
Research on Effective Practices for Teaching Math Explicit, teacher-directed instruction Student think-alouds Visual and graphic depictions of problems Peer-assisted learning Formative assessment Gersten, Baker, & Chard (2006)
Explicit, Systematic Instruction Clear models and demonstrations Range of instructional examples (positive and negative) Extensive and supported practice in newly learned skills and strategies Extensive feedback provided to students (specific positive and corrective)
Example of Explicit, Systematic Instruction: Fractions Clear models and demonstrations: Definition of a fraction: equal parts of a whole.
Example of Explicit, Systematic Instruction: Fractions Range of positive and negative instructional examples (proper fractions, improper fractions, and fractions equal to 1)
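For illustration only, a small helper that sorts example fractions into the three categories named above; the function name and examples are hypothetical, not part of the instructional materials.

```python
def classify_fraction(numerator, denominator):
    """Label a fraction as proper, improper, or equal to 1."""
    if numerator < denominator:
        return "proper"
    if numerator == denominator:
        return "equal to 1"
    return "improper"


# One example from each category
for n, d in [(1, 4), (4, 4), (7, 4)]:
    print(f"{n}/{d}: {classify_fraction(n, d)}")
```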
Example of Explicit, Systematic Instruction: Fractions Extensive and supported practice in newly learned skills and strategies, and extensive feedback provided to students (specific positive and corrective)
Student think-alouds Encouraging students to verbalize their thinking: talking through the steps they used in solving a problem or the strategic decisions they made. Verbalizing was most effective when multiple approaches to solving problems were demonstrated and students were encouraged to think aloud as they solved multiple practice problems.
Example: Student think-alouds Why is it true that 1/2 = 3/6? How would you find the GCF of 6 and 8? Why can’t you add 1/3 and 6/5? What do you have to do so you can add them?
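The think-aloud prompts above can be checked directly with Python’s standard library (math.gcd and fractions.Fraction); the snippet below is just an illustrative verification of the arithmetic, not part of the intervention materials.

```python
from math import gcd
from fractions import Fraction

print(gcd(6, 8))                          # 2 -> the GCF of 6 and 8
print(Fraction(1, 2) == Fraction(3, 6))   # True -> 1/2 and 3/6 are equivalent
print(Fraction(1, 3) + Fraction(6, 5))    # 23/15 -> sum after rewriting over denominator 15
```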
Visual and Graphic Depictions of Problems Visuals are helpful IF students are provided opportunities to learn to use them and practice using them. Number line and area models for fractions are highly recommended over the “pie” model.
Visual and Graphic Depictions of Problems Concrete-Representational-Abstract (CRA) approach seems promising. Concrete: Making equivalent fractions by folding strips of paper Representational: Making equivalent fractions by segmenting a number line Abstract: Making equivalent fractions by rewriting fractions with a common denominator
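A minimal sketch of the abstract step, rewriting two fractions over their least common denominator; the function and its name are illustrative assumptions.

```python
from math import gcd

def common_denominator(n1, d1, n2, d2):
    """Rewrite n1/d1 and n2/d2 as equivalent fractions over their least common denominator."""
    lcd = d1 * d2 // gcd(d1, d2)
    return (n1 * (lcd // d1), lcd), (n2 * (lcd // d2), lcd)


# Example: 1/3 and 6/5 become 5/15 and 18/15
print(common_denominator(1, 3, 6, 5))   # ((5, 15), (18, 15))
```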
Peer-Assisted Learning Increased opportunities to practice problem solving and interact with peers about mathematics Results have been consistently positive if… Tutoring is provided by a proficient, trained peer. Students work in pairs, and activities have a clear structure. Pairs include students at differing ability levels. Both students play the role of tutor. Students are trained to assume the role of tutor.
Formative Assessment to Teachers Providing teachers with regular formative assessment data is superior to typical weekly or biweekly unit tests
Formative Assessment to Students Feedback was more effective when it was coupled with specific suggestions for intervention strategies (e.g., practice problems, alternate ways to explain a concept)
Intervention Tracker • Use intervention tracker to: • Identify intervention logistics • Record interventions for individual students or groups of students • Document instructional decisions made as a result of student progress • Interventions can be documented on AIMSweb graphs
Intervention Tracker Intervention Tracker Procedures: 1. Choose up to 3 students 2. Enter intervention information 3. Monitor intervention using data 4. Review tracker at inservice
Intervention Tracker (sample AIMSweb progress graph: the baseline median, the progress monitoring start, and the intervention phase are marked)
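As a hypothetical illustration of the kind of information a tracker entry might capture for up to three students: the field names and sample values below are invented, not the actual projectDATA tracker form.

```python
from dataclasses import dataclass, field

@dataclass
class InterventionRecord:
    student: str
    intervention: str                 # e.g., "peer-assisted fraction practice"
    start_date: str                   # when progress monitoring under the intervention began
    baseline_median: float            # median of the baseline probe scores
    weekly_scores: list = field(default_factory=list)
    decisions: list = field(default_factory=list)   # instructional changes and the data behind them


# Track up to three students, as in the procedures above (sample data only)
tracker = [InterventionRecord("Student A", "peer-assisted fraction practice",
                              "2009-01-12", baseline_median=21.0)]
tracker[0].weekly_scores.append(23)
tracker[0].decisions.append("Week 3: continue; scores are tracking the aimline")
```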
Closing Activities • New progress monitoring schedule: • You will receive sets of probes at each inservice • Administer one set of probes each week • Mail probes to Lori Wollenweber (at Lane ESD) on Wednesday/Thursday of each week • Use school-provided envelopes and UO-provided labels • UO will score probes and update AIMSweb • Probes will be returned once each week in your envelopes
Closing Activities • Questions? • Mathematicians Workshop Series • Turn in your registration form if you have it • You can also return it with the weekly probes • Next meeting • February 12 • Benchmark students on EasyCBM • Email Elisa with questions • Bring your log-in information if we didn’t set up your account • Bring the triangle activity from October for comparison • Evaluation